Microsoft has released the technical preview of IoT Central, the company's SaaS offering designed to let organizations roll out IoT-based solutions with minimal configuration or knowledge of the complexities of integrating operational systems with IT.
The company, which announced IoT Central back in April, said customers can use its preconfigured application templates to deploy IoT capabilities within hours and without the need for developers skilled in IoT. The new SaaS offering will complement Microsoft's Azure IoT Suite, a PaaS-based offering that requires more customization and systems integration.
"For Microsoft IoT Central, the skill level [required] is really low," said Sam George, Microsoft's director of Azure IoT, during a press briefing earlier this week. Both IoT Central and the IoT Suite are built upon Azure IoT Hub as a gateway that the company said provides secure connectivity, provisioning and data gathering from IoT-enabled endpoints and devices.
They also both utilize other Azure services such as Machine Learning, Streaming Analytics, Time Series Insights and Logic Apps. The Azure IoT Suite is consumption-based, while IoT Central uses a subscription model priced by usage, starting at 50 cents per user per month (or a fixed rate of $500 per month). The company is also offering free trials.
"Up until now IoT has been out of reach for the average business and enterprise," George said. "We think it's time for IoT to be broadly available. There is nothing with rapid development like this from a major cloud vendor on the market."
IoT Central gives each device a unique security key, and the service provides a set of device libraries, including Azure IoT device SDKs that support platforms including Node.js, C, C# and Java. The service offers native support for IoT device connectivity protocols such as MQTT 3.1.1, HTTP 1.1 and AMQP 1.0. The company claims IoT Central can scale to millions of connected devices and millions of events per second gathered through the Azure IoT cloud gateway and stored in its time-series storage.
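For a rough feel of what the device side looks like, here is a minimal sketch using the Python flavor of the Azure IoT device SDK. The connection string, placeholder values and module names are assumptions patterned on current SDK releases, not anything specific to IoT Central:

```python
from azure.iot.device import IoTHubDeviceClient, Message

# Hypothetical connection string; IoT Central issues each device its own
# security key, which ends up embedded in a string like this one.
CONN_STR = (
    "HostName=<hub>.azure-devices.net;"
    "DeviceId=<device>;"
    "SharedAccessKey=<key>"
)

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()

# Send one telemetry message; the SDK uses MQTT as its default transport.
client.send_message(Message('{"temperature": 21.5}'))
client.shutdown()
```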
The number of IoT-based connected "things" is forecast at 8.4 billion for this year and on pace to reach 20 billion by 2020, according to Gartner. But ITIC Principal Analyst Laura DiDio warns that most customers are still in the early stages of true IoT deployments, though she said they are on pace to ramp up rapidly over the next two to three years. "Scalability will be imperative," DiDio said. "Corporations will require their devices, applications and networks to grow commensurately with their business needs."
Posted by Jeffrey Schwartz on 12/06/2017 at 11:54 AM
It's no secret that Microsoft wants enterprises to migrate all their PC users to Windows 10 as a service and to move to its new modern approach to configuring, securing and managing those systems and the applications associated with them. This year's launch of Microsoft 365 -- a subscription service that bundles Windows 10 licenses, Office 365 and the Enterprise Mobility + Security (EMS) service -- is the strongest sign yet that the company is pushing IT pros away from the traditional approach of imaging and managing PCs with System Center Configuration Manager (SCCM) in favor of Microsoft Intune in EMS.
While many IT pros have embraced the new modern systems management model, others are bemoaning it and quite a few remain unsure, according to a spot survey of Redmond readers over the weekend.
Nearly half, or 48 percent, of Redmond subscribers who are planning Windows 10 migrations said they intend to continue using SCCM to configure, deploy and manage those systems, according to 201 respondents to an online poll. Yet 31 percent are undecided and only 4 percent have decided on Microsoft's EMS offering. And while 9 percent will use a third-party MDM/EMM/MAM offering, a near-equal amount will implement a mixture of the aforementioned options.
While 19 percent responded that they plan to use Microsoft 365 where it makes sense, only 10 percent plan to use it enterprisewide. A formidable number, 42 percent, said their organization has no plans to use Microsoft 365, while 28 percent are undecided.
As Intune takes on more automated deployment capabilities, organizations upgrading to Windows 10 -- which they must do by 2020 -- may find SCCM becoming less essential in a growing number of scenarios. Brad Anderson, Microsoft's corporate vice president for enterprise client mobility, drove that point home during a keynote session at the recent Ignite conference in Orlando. "One of the big things about modern management is we are encouraging you to move away from imaging," Anderson said. "Stop maintaining those images and all of the libraries and drivers and let's move to a model where we can automatically provision you from the cloud."
Anderson estimated that SCCM now manages 75 percent of all PCs, hundreds of millions of machines, and continues to grow by 1 million users per week -- hence this change isn't going to happen overnight. Microsoft continues to upgrade SCCM, having moved from its traditional release cycle of every two to three years to three times per year under its current branch model, with new test builds for insiders issued every month. Microsoft also now offers an SCCM co-management bridge and PowerShell scripts for Intune.
Organizations have various decisions to make, and there are still many moving parts. That helps explain why, despite the fact that 600 million systems now run Windows 10, many organizations still haven't migrated or are only in the early stages of doing so.
Join MVP and longtime Redmond contributor Greg Shields, Enterprise Strategy Group Senior Analyst Mark Bowker and me tomorrow at 11 a.m. PT for a Redmond webinar: Microsoft 365 for Modern Workplace Management: Considerations for Moving to a Post-SCCM World. You can sign up here.
Posted by Jeffrey Schwartz on 12/04/2017 at 1:10 PM
The Box collaboration and enterprise content management (ECM) service is now available in Microsoft's Azure public cloud, marking the latest integration point between the two companies in recent years. Box and Microsoft, which are also competitors with overlapping collaboration capabilities, found it in their respective best interests two years ago to work together, starting with basic Office 365 integration.
Both companies extended their partnership in June and elaborated on their roadmap at last month's BoxWorks 2017 conference. Box indicated then that the first new capability would arrive when it began offering its service in all of Microsoft's global Azure regions. The service that Box made available last week allows Box customers to use Azure as the primary storage for their Box content, said Box's Sanjay Manchanda in an interview last week.
"It's Box content management capabilities with the content being stored in Azure," he said. Box historically has run its own datacenter operations throughout the world, but decided its best route to scaling would be to partner with the large global cloud providers. In addition to its arrangement with Microsoft, Box has partnerships with Amazon Web Services (AWS) and IBM. Manchanda, who worked at Microsoft for more than 10 years before joining Box, said the partnership with his former employer is similar to its relationship with IBM, where both include technical and comarketing pacts.
Manchanda noted that giving Box customers access to its service in Azure will simplify cross-organization collaboration among employees and their external partners, suppliers and customers. It will also provide secure content management by tapping Box's integrations with more than 1,400 SaaS-based applications, including those offered via Office 365, and will allow organizations to do the same when building custom applications.
The roadmap calls for Box to use Microsoft's Azure Key Vault service, which lets organizations bring and manage their own encryption keys. While Box offers its own encryption service, called Box KeySafe, which uses a hardware security module (HSM) for encryption, that service won't carry over to the new offering. "We plan to use the service that Azure Key Vault provides," he said.
Box also plans, as part of its roadmap, to integrate Microsoft Cognitive Services with its service, allowing customers to automatically identify content, categorize it, run workflows and make it easier for users to find information.
The service is now available in the United States, and Box intends to roll it out across Microsoft's 40 Azure regions, allowing customers with data sovereignty requirements to ensure their data doesn't leave the confines of a specific country or locale, Manchanda explained. Asked whether Box plans to support Azure Stack, either deployed on a customer's premises or via a third-party managed services provider, Manchanda said that isn't part of the current roadmap -- but he didn't rule it out if there's customer demand.
Posted by Jeffrey Schwartz on 11/29/2017 at 1:47 PM
McAfee is filling a void in its security portfolio with its plan to acquire leading cloud access security broker (CASB) provider Skyhigh Networks. The deal to acquire Skyhigh for an undisclosed amount, announced today, will give McAfee one of the most highly regarded CASB offerings as the company looks to join the fray of vendors blending cloud security, network protection and endpoint management.
Prior to spinning off from Intel earlier this year, McAfee determined it needed to focus on two key threat protection and defense control points: the endpoint and the cloud, a conclusion Symantec similarly reached last year when it acquired Blue Coat for $4.6 billion. Cisco also came to that conclusion last year with its acquisition of CloudLock, which it is integrating with its OpenDNS, Talos threat analytics and existing firewall and data loss protection (DLP) offerings.
Microsoft jumped into the CASB arena two years ago with the acquisition of Adallom, which was relaunched last year as Cloud App Security and refreshed in September at the Ignite conference with support for conditional access. It's now offered as an option with Microsoft's Enterprise Mobility + Security (EMS) service. Other popular providers of CASB tools include BitGlass, CipherCloud, Forcepoint, Imperva and Netskope.
Raja Patel, McAfee's VP and general manager for corporate products, said in an interview that customers and channel partners have all asked what the company had planned in terms of offering a CASB solution. "We think there is a large market for CASBs and the capabilities that Skyhigh brings," Patel said. "They were the original CASB player in the marketplace, and they have led the category and really moved the needle in terms of their leadership and evolving the category."
Upon closing of the deal, which is scheduled to take place early next year, Skyhigh Networks CEO Rajiv Gupta will report to McAfee CEO Chris Young and will oversee the combined vendor's cloud security business. Gupta's team, which will join McAfee, will also help integrate Skyhigh Networks' CASB with McAfee's endpoint and DLP offerings. "Combined with McAfee's endpoint security capabilities and operations center solutions with actionable threat intelligence, analytics and orchestration, we will be able to deliver a set of end-to-end security capabilities unique in the industry," Gupta said in a blog post.
Patel said the company plans to let customers bring the policies implemented in their McAfee endpoint protection software and network DLP systems to their cloud infrastructure and SaaS platforms. Another priority is bringing more context to its endpoint tools and integrating the CASB with Web gateways and cloud service providers.
While only 15 percent of organizations used CASBs last year, Patel cited a Gartner forecast that adoption will grow to 85 percent of organizations by 2020. "If you look at the exponential growth of people adopting public cloud environments over the past two years, it extends to moving the security posture around it," he said.
Posted by Jeffrey Schwartz on 11/27/2017 at 12:17 PM
Microsoft is adding MariaDB to the list of open source relational database platforms it will bring to Azure. MariaDB will join MySQL and PostgreSQL, announced earlier this year and now available in preview mode, in Azure. In addition to adding it to the menu of open source databases available in Azure, Microsoft has joined the MariaDB Foundation as a Platinum sponsor.
MariaDB is a fork -- or, as the foundation describes it, "an enhanced, drop-in replacement" -- of MySQL, developed in the wake of Oracle's acquisition of Sun Microsystems, which had earlier bought the popular open source database. MySQL founder Michael "Monty" Widenius developed MariaDB to ensure an option remained that maintained the database's open source principles.
Microsoft announced its MariaDB support this week at the annual Connect conference, held in New York. While MariaDB is primarily SQL-based, it also has GIS and JSON interfaces and is used by notable organizations including Google, WordPress.com and Wikipedia.
Scott Guthrie, Microsoft's executive VP of cloud and enterprise, told analysts and media that MariaDB has become popular with a growing segment of developers. "Like our PostgreSQL and MySQL options, it's 100 percent compatible with all of the existing drivers, libraries and tools and can take full advantage of the rich MariaDB ecosystem," Guthrie said during his Connect keynote address.
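That drop-in compatibility means a stock MySQL driver should connect to the managed MariaDB service unchanged. Here is a minimal Python sketch; since the service was still behind a waitlist at press time, the server name, credentials and endpoint pattern are assumptions modeled on Microsoft's Azure Database for MySQL offering:

```python
import pymysql

# Hypothetical endpoint and credentials, patterned on Azure Database for MySQL.
conn = pymysql.connect(
    host="mydemo.mariadb.database.azure.com",
    user="demoadmin@mydemo",
    password="<password>",
    database="inventory",
)
with conn.cursor() as cur:
    # MariaDB reports its own version string, confirming the engine
    # behind the MySQL-compatible wire protocol.
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())
conn.close()
```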
Microsoft announced MySQL and PostgreSQL as managed services offerings in Azure at its annual Build conference, which took place back in May in Seattle. Since releasing the previews, Microsoft has added PostgreSQL extensions and compute tiers, and the services are now available in 16 regions and on pace for release in all 40 regions, said Tobias Ternstrom, principal group program manager for Microsoft's Database Systems Group, in a blog post.
Ternstrom visited Widenius in Sweden recently, where they discussed adding MariaDB to Azure and ultimately decided to join the foundation and participate in the open source project. "We are committed to working with the community to submit pull requests (hopefully improvements...) with the changes we make to the database engines that we offer in Azure," Ternstrom noted. "It keeps open source open and delivers a consistent experience, whether you run the database in the cloud, on your laptop when you develop your applications, or on-premises."
Microsoft has posted a waitlist for those wanting to test the forthcoming preview.
Posted by Jeffrey Schwartz on 11/20/2017 at 9:26 AM
Microsoft is looking to help remotely distributed development teams collaborate on code in real time with the introduction of Visual Studio Live Share. The new capability, demonstrated at the Microsoft Connect conference in New York and streamed online, works with both the Visual Studio IDE and the lightweight, multi-platform VS Code development tool, editor and debugger.
Visual Studio Live Share is among several new capabilities the company is adding to its tooling portfolio and suite of Azure services, all focused on making it easier for organizations to shift to DevOps management and methodologies, support continuous integration-continuous deployment release cycles and target multiple platforms and frameworks.
Based in Azure, the new Visual Studio Live Share lets remotely distributed developers create live sessions in which they can interactively share code projects to troubleshoot problems and iterate in real time, while working in their preferred environment and setup. Visual Studio Live Share aims to do away with the current practice of remote development teams exchanging screen images with one another via e-mail or chat sessions using Slack or Teams, replacing it with live code-sharing sessions.
The core Visual Studio Live Share service runs in Azure and allows code to be shared among developers using either the full Visual Studio IDE or the new lightweight VS Code editor. Developers can share code between the two tools without having to be on the same client platform, or even using the same programming environments or languages.
"We really think this is a game changer in terms of enabling real-time collaboration and development," said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, speaking during the Connect keynote. "Rather than just screen sharing, Visual Studio Live Share lets developers share their full project context with a bidirectional, instant and familiar way to jump into opportunistic, collaborative programming" he added in a blog post, outlining all of the Connect announcements.
As Chris Dias, Microsoft's Visual Studio group product manager, demonstrated VS Live Share during the keynote, numerous developers shared their approval on Twitter, which Guthrie remarked on as the demo concluded. Julia Liuson, Microsoft's corporate VP for Visual Studio, said in an interview at Connect that the reaction on Twitter mapped to the frustration developers typically encounter with the limitations of sharing screen grabs or relying on chat sessions or phone conversations. "It's painful. We hear this all the time, even in our own day-to-day interactions," she said.
The company released a limited private preview of the new VS Live Share capability yesterday, but did not disclose how it will be offered, or whether it might be extended to other development environments over time. "We're going to make sure we have the right offering first and will talk later about the business model," Liuson said, adding that some sort of free iteration is planned.
Microsoft introduced multiple features at Connect that will be coming to Visual Studio, as described below.
Azure DevOps Projects: A complete DevOps pipeline running on Visual Studio Team Services, Azure DevOps Projects is available in the Azure Portal in preview. It's aimed at making DevOps the "foundation" for all new projects, supporting multiple frameworks, languages and Azure-hosted deployment targets.
Visual Studio Connected Environment for Azure Container Service (AKS): Building on Microsoft's new AKS offering, Microsoft Principal Program Manager Scott Hanselman demonstrated how developers can edit and debug modern, cloud-native apps running in Kubernetes clusters using Visual Studio, VS Code or a command line interface, switching between .NET Core/C# and Node as they go. The service is also in preview.
Visual Studio App Center: The app lifecycle platform that lets developers build, test, deploy, monitor and iterate based on live usage and crash analytics telemetry is now generally available. Microsoft describes Visual Studio App Center as a shared environment for multiplatform mobile, Windows and Mac apps, supporting Objective-C, Swift, Java, Xamarin and React Native, connected to any code repository.
Visual Studio Tools for AI: A new modeling capability for Visual Studio, now in preview, gives developers and data scientists editing and debugging support for key deep learning frameworks, including Microsoft's Cognitive Toolkit, TensorFlow and Caffe, to build, train, manage and deploy their models locally and scale to Azure.
Azure IoT Edge: A service that deploys cloud intelligence to IoT devices via containers, working with Azure Machine Learning, Azure Functions and Azure Stream Analytics, is now in preview.
Posted by Jeffrey Schwartz on 11/16/2017 at 12:54 PM
Microsoft and Apache Spark creator Databricks are building a globally distributed streaming analytics service natively integrated with Azure for machine learning, graph processing and AI-based applications.
The new Databricks Spark-as-a-service offering was introduced at Microsoft's annual Connect developer conference, which kicked off today in New York. The new service, available in preview, is among an extensive list of announcements spanning the company's various SQL and NoSQL database products and services; productivity, cross-platform and language improvements to the Visual Studio and VS Code developer tools; new DevOps capabilities; and new machine learning, AI and IoT tooling.
During the opening keynote, Scott Guthrie, Microsoft's executive VP for Cloud and Enterprise, emphasized that Databricks is the creator of, and steward of, Apache Spark, and the new service will enable organizations to build modern data warehouses that support self-service analytics and machine learning using all data types in a secure and compliant architecture.
Databricks has engineered a first-party Spark-as-a-service platform for Azure. "It allows you to quickly launch and scale up the Spark service inside the cloud on Azure," Guthrie said. "It includes an incredibly rich, interactive workspace that makes it easy to build Spark-based workflows, and it integrates deeply across our other Azure services."
Those services include Azure SQL Data Warehouse, Azure Storage, Azure Cosmos DB, Azure Active Directory, Power BI and Azure Machine Learning, Guthrie said. It also provides integration with Azure Data Lake stores, Azure Blob storage and Azure Event Hub. "It's an incredibly easy way to integrate Spark deeply across your apps and drive richer intelligence from it," he said.
Databricks customers have been pushing the company to build its Spark platform as a native Azure service, said Ali Ghodsi, the company's cofounder and CEO, who joined Guthrie on stage. "We've been hearing overwhelming demand from our customer base that they want the security, they want the compliance and they want the scalability of Azure," Ghodsi said. "We think it can make AI and big data much simpler."
In addition to integrating with the various Azure services, it's designed to let users create new data models. According to Databricks, a user can target data regardless of size and create projects with various analytics services including Power BI, SQL, Streaming, MLlib and Graph. "Once you manage data at scale in the cloud, you open up massive possibilities for predictive analytics, AI, and real-time applications," according to a technical overview of the Azure Databricks service. "Over the past five years, the platform of choice for building these applications has been Apache Spark. With a massive community at thousands of enterprises worldwide, Spark makes it possible to run powerful analytics algorithms at scale and in real time to drive business insights."
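To ground that claim, here is a minimal PySpark sketch of the kind of job the service runs. The storage mount path and column names are hypothetical, and in an Azure Databricks notebook a `spark` session would already be provided:

```python
from pyspark.sql import SparkSession

# Build a session explicitly so the sketch also runs outside Databricks;
# inside a Databricks notebook, `spark` is created for you.
spark = SparkSession.builder.appName("telemetry-rollup").getOrCreate()

# Hypothetical mount point for Azure Blob storage or Azure Data Lake.
events = spark.read.json("/mnt/telemetry/events/")

# A simple rollup: event counts per device, largest first.
per_device = events.groupBy("deviceId").count().orderBy("count", ascending=False)
per_device.show(10)
```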
However, deploying, managing and securing Spark at scale has remained a challenge, which Databricks believes will make the Azure service compelling.
Internally, Databricks is using Azure Container Service to run the Azure Databricks control plane and data planes in containers, according to the company's technical primer. It's also using accelerated networking services to improve performance on the latest Azure hardware.
Posted by Jeffrey Schwartz on 11/15/2017 at 1:33 PM
More than half of large enterprises that have implemented Cloud Foundry as their application runtime, orchestration and DevOps environment for business modernization projects are using it across multiple clouds, according to the results of a survey published last month.
Adoption of Cloud Foundry Application Runtime is on the rise among large and midsize enterprises, which are running new or transformed business apps across different clouds, including Amazon Web Services, Microsoft Azure, Google Cloud Platform and OpenStack, as well as in VMware vSphere virtual machines.
The survey of 735 users consisting of developers, architects, IT executives and operators, sponsored by the Cloud Foundry Foundation, revealed that 53 percent are using it across multiple clouds. According to the survey that was conducted by the 70-member consortium, 54 percent of Cloud Foundry Application Runtime users are running it on AWS, followed by 40 percent on VMware's vSphere, 30 percent on Microsoft's Azure, 22 percent on OpenStack and 19 percent on Google Cloud Platform. An additional 17 percent said they are running it on various provider-managed PaaS, including IBM Bluemix.
"Whether companies come to Cloud Foundry early or late in their cloud journey, the ability to run Cloud Foundry Application Runtime across multiple clouds is critical to most users," according to an executive summary of the report based on the survey conducted by research and consulting firm ClearPath Strategies in late August.
Accelerated Development and Deployment Reported
Organizations that use the open-source Cloud Foundry Application Runtime are seeing accelerated application development cycles when building their cloud-native applications, the survey also found. A majority, 51 percent, said it previously took more than three months to deploy a cloud application, with only 16 percent able to do so in under a week. Upon moving their applications to Cloud Foundry Application Runtime, 46 percent claim cloud app development cycles of under a week, 25 percent of them under a day, while only 18 percent still report cycles of more than three months.
Before using Cloud Foundry Application Runtime, 58 percent said cloud applications were developed and deployed manually, 52 percent used custom installed scripts, 38 percent relied on configuration management tools, 27 percent VM images, 20 percent Docker containers and 19 percent Linux packages.
The release of Cloud Foundry Container Runtime, the Kubernetes-based container deployment and management project, has generated significant interest among Cloud Foundry Application Runtime users. Half of those users are currently using containers, such as Docker, with another 35 percent evaluating or deploying them, and a vast majority, 71 percent, said they are using or evaluating container engines, primarily Docker or rkt, and are interested in adding container orchestration and management to their Cloud Foundry environment now that Cloud Foundry Container Runtime is available.
A majority, 54 percent, use Cloud Foundry to develop, deploy and manage microservices, with 38 percent using it for their Web sites, 31 percent for internal business applications, 27 percent for software-as-a-service (SaaS) and 8 percent for legacy software transformation.
Early Stages
While Cloud Foundry is a relatively new technology -- only 15 percent have used it for more than three years and 45 percent for less than a year, with 61 percent in the early stages, according to the survey -- a noteworthy 39 percent report they have broadly integrated it already. Also, while 49 percent of those adopting it are large enterprises such as Ford, Home Depot and GE, 39 percent are smaller enterprises and 39 percent are small and medium businesses (SMBs), according to the breakdown of respondents.
The number of applications in deployment is still relatively small. More than half, 54 percent, have deployed fewer than 10 apps, with 22 percent claiming between 11 and 50 apps and only 8 percent having deployed more than 500.
Pivotal, the Dell Technologies subsidiary that offers Pivotal Cloud Foundry, the leading commercial distribution of Cloud Foundry, is among the most widely consumed services in Azure, according to Microsoft. Redmond contributor Michael Otey explains how to deploy it in Azure, which you can find here.
Posted by Jeffrey Schwartz on 11/13/2017 at 11:35 AM
Microsoft is readying a new lightweight database development and management tool that aims to give DBAs and developers common DevOps tooling to manage Microsoft's various on-premises and cloud database offerings. The new Microsoft SQL Operations Studio, demonstrated for the first time at last week's PASS Summit in Seattle, brings together the capabilities of SQL Server Management Studio (SSMS) and SQL Server Data Tools (SSDT) with a modern, cross-platform interface.
The company demonstrated the forthcoming tool during the opening PASS keynote, showing the ability to rapidly deploy SQL Server in Linux and Windows containers and to work against Azure SQL Database and Azure SQL Data Warehouse. The first technical preview is set for release in the coming weeks. SQL Operations Studio lets developers and administrators build and manage T-SQL code in a more agile DevOps environment.
"We believe this is the way of the future," said Rohan Kumar, Microsoft's general manager of database systems engineering, who gave the opening keynote at last week's PASS event. "It's in its infancy. We see a lot of devops experiences, [where] development and testing is being used right on containers, but this is only going to get better and SQL is already prepared for it."
Kumar said Microsoft will release a preview for Windows, Mac and Linux clients within the next few weeks. It will enable "smart" T-SQL code snippets and customizable dashboards to monitor and discover performance bottlenecks in SQL databases, both on-premises or in Azure, he explained. "You'll be able to leverage your favorite command line tools like Bash, PowerShell, sqlcmd, bcp and ssh in the Integrated Terminal window," he noted in a blog post. Kumar added that users can contribute directly to SQL Operations Studio via pull requests from the GitHub repository.
Joseph D'Antonio, a principal consultant with Denny Cherry and Associates and a SQL Server MVP, has been testing SQL Operations Studio for more than six months. "It's missing some functionality but it's a very solid tool," said D'Antonio, who is also a Redmond contributor. "For the most part, this is VS Code, with a nice database layer."
Posted by Jeffrey Schwartz on 11/06/2017 at 12:01 PM
Microsoft's Power BI can now query 10 billion rows of data, but a forthcoming release will push that threshold to 1 trillion, a capability demonstrated at this week's annual PASS Summit, where the company also released the first on-premises version of the service, Power BI Report Server.
Microsoft gave Power BI major play at PASS, held in Seattle, where the company also underscored the recently released SQL Server 2017 and its support for Linux and Docker, hybrid implementations of the on-premises database with SQL Azure and its next-generation NoSQL database CosmosDB.
Christian Wade, a Microsoft senior program manager, demonstrated the ability to search telemetry data from the cell phones of 20 million people traveling across the U.S., picking up their location data and battery usage as often as 500 times per day from each user. Though it wasn't actual usage data, Wade showed how long it was taking people to reach various destinations based on their routes by merely dragging and dropping data from the Power BI dashboard interface. Wade queried Microsoft's Spark-based Azure HDInsight service.
"This is what you will be able to do with the Power BI interface and SQL Azure Analysis Services," Wade said during a brief interview following his demonstration. Wade performed the demo during a session focused on Power BI Wednesday, which followed the opening keynote by Rohan Kumar, Microsoft's general manager of database engineering. In the main keynote Wade demonstrated a query against a 9 TB instance with 10 billion rows supported in the current release.
"This is a vision of what's to come, of how we are going to unlock these massive data sets," Wade said, regarding the prototype demo. According to Wade, this new capability demonstrated was the first time anyone was able to use Power BI to perform both direct and in-memory queries against the same tabular data engine,
Kamal Hathi, general manager of Microsoft's BI engineering team, described the new threshold as a key milestone. "We have a history with analysis services and using the technology to build smart aggregations and apply them to large amounts of data," Hathi said in an interview. "It's something we have been working on for many years. Now we are bringing it to a point where we can bring it to standard big data systems like Spark and with the power of Power BI."
While he wouldn't say when it will be available, or if there will be a technical preview, Hathi said it's likely Microsoft will release it sometime next year.
What is now available is the new Power BI Report Server, which brings the SaaS-based capability on-premises for the first time. It still requires the SaaS service, but addresses organizations that can't let certain datasets leave their own environments.
Microsoft is offering the Power BI Report Server only to its Power BI Premium customers -- typically large organizations that pay based on capacity rather than on a per-user basis. The Power BI Report Server lets users embed reports and analytics directly into their apps by using the APIs of the platform, Kumar said during the keynote. "It essentially allows you to create [a] report on-premises using your Power BI desktop tool. And once your report gets created you can either save it on premises to the reporting server [or the] cloud service," Kumar said. "The flexibility is quite amazing."
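For a rough sense of what "using the APIs of the platform" looks like, here is a minimal Python sketch against the Power BI REST API. Acquiring the Azure AD access token is out of scope, and the endpoint shown is the cloud service's, so treat the specifics as assumptions rather than Report Server documentation:

```python
import requests

# Hypothetical Azure AD access token, acquired out of band.
ACCESS_TOKEN = "<aad-access-token>"

# List the reports in the caller's workspace; each entry carries an
# embed URL that an application can host in its own UI.
resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/reports",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
for report in resp.json()["value"]:
    print(report["name"], report["embedUrl"])
```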
Posted by Jeffrey Schwartz on 11/03/2017 at 1:30 PM
A version of the Microsoft 365 service for small- and mid-size organizations with up to 300 users is now generally available. Microsoft 365 Business, released today, is the last of four versions of the new service announced in July that brings together Office 365, the Enterprise Mobility + Security (EMS) device configuration and management service and Windows 10 upgrades and licenses.
As part of the new release, Microsoft is introducing three new tools for business users called Connections, Listings and Invoicing, on top of previously announced apps, which include a mileage tracker, customer manager and appointment scheduling tool. The tools are available for customers in the U.S., Canada and the U.K. and are included with the $20 per user, per month subscription.
The company today is also releasing Microsoft StaffHub, its new tool designed to help firstline workers manage their work days, which is included in Microsoft 365 Business and Office 365 Business Premium subscriptions.
Microsoft released the technical preview for the business version back in August and had said it would become generally available this fall. The company has already released Microsoft 365 Enterprise, Microsoft 365 F1 for firstline workers and a version with two licensing options for educational institutions.
The company is betting that the new Microsoft 365 Business option will appeal to the millions of customers who currently pay $12.50 per user per month for Office 365 Business Premium subscription plans. It'll cost them an additional $7.50 per user per month for the new plan, but they'll gain configuration, management and security services, plus Windows 10 and the new apps.
Those who manage Office 365 Business Premium users will transition to the Microsoft 365 portal. The latter is effectively the same as the Office 365 portal, but brings in the features included with Microsoft 365 Business, said Caroline Goles, Microsoft's director of Office for SMBs.
"We designed it so it looks exactly the same, except it just lights up those extra device cards," she said during a prelaunch demo in New York. "If you manage Office 365, it will be familiar but will bring in those new Microsoft 365 capabilities."
Unlike the enterprise version, Microsoft 365 Business includes a scaled-down iteration of EMS suited for smaller organizations. Garner Foods, a specialty provider of sauces based in Winston-Salem, N.C., is among the first to test and deploy the new service. Already an Office 365 E3 customer, Garner Foods was looking to migrate its Active Directory servers to Azure Active Directory, said COO Heyward Garner, who was present at the New York demo.
"They were able to downgrade most of their Office 365 users and gain the security and management capabilities offered with Microsoft 365," said Chris Oakman, president and CEO of Solace IT Solutions, the partner who recommended and deployed the service for Garner Foods. "It's a tremendous opportunity for small business."
Also now available are three previously announced tools: Microsoft Bookings for scheduling and managing appointments, MileIQ for tracking mileage and Outlook Customer Manager for managing contacts. In addition, Microsoft is adding three new tools: Connections, designed for e-mail marketing campaigns; Listings, for managing brand engagement on Facebook, Google, Bing and Yelp; and Invoicing, for generating customer bills with integration into QuickBooks. These apps can all be managed in the Microsoft Business Center, and the company is considering additional tools for future release.
Posted by Jeffrey Schwartz on 11/01/2017 at 12:06 PM
In a move to make Google's public cloud services more appealing to enterprise customers, the company and Cisco are partnering to bring hybrid cloud infrastructure that's compatible with the Google Cloud Platform (GCP). The pact, announced today, will enable workloads to run on Cisco UCS hyper-converged infrastructure hardware and the GCP.
The partnership is a major boost for Google as it looks to take on Amazon Web Services (AWS) and Microsoft, which both offer hybrid cloud solutions. Both have a wide lead on Google with the world's largest cloud footprints and infrastructure. Microsoft is hoping to maintain its lead over Google and gain ground on AWS with its new hybrid cloud solution, Azure Stack, which is now just starting to ship from Dell EMC, Hewlett Packard Enterprise and Lenovo. Cisco is also taking orders for its Azure Stack solution, which is set for imminent release.
Now that Cisco will also offer infrastructure compatible with GCP, Cisco is widening its cloud reach, while Google is gaining a significant extension into enterprises. "Applications in the cloud can take advantage of on-premises capabilities (including existing IT systems)," said Kip Compton, VP of Cisco's Cloud Platform and Solutions Group, in a blog post announcing the pact. "And applications on-premises can take advantage of new cloud capabilities."
Cisco HyperFlex HX-Series systems will enable hybrid workloads to run on-premises and in GCP. The hybrid GCP offering is based on Kubernetes, the open source container orchestration and management platform that will provide lifecycle management, support for hybrid workloads and policy management. Kubernetes now integrates with Cisco's software-defined networking architecture, just upgraded earlier this month with the third release of Cisco's Application Centric Infrastructure (ACI).
The new ACI 3.0 includes improved network automation, security and multi-cloud support. Now that ACI offers Kubernetes integration, customers can deploy workloads as microservices in containers. Cisco said the Kubernetes integration also provides unified networking constructs for containers, virtual machines and bare-metal hardware and lets customers set ACI network policy.
Cisco's hybrid cloud offering also will include the open source Istio service management tooling. Istio connects, manages and secures microservices. According to a description on its Web site, Istio manages traffic flows between microservices, enforces access policies and aggregates telemetry data without requiring changes to the code within the microservices. Running on Kubernetes, Istio also provides automated HTTP, gRPC, WebSocket and TCP load balancing and various authentication and security controls.
The Cisco offering will also include the Apigee API management tool. Apigee, a leading provider of API management software, was acquired by Google last year. It enables legacy apps to run on-premises and connect to the cloud via the APIs.
"We're working together to deliver a consistent Kubernetes environment for both on-premises Cisco Private Cloud Infrastructure and Google's managed Kubernetes service, Google Container Engine," said Nan Boden, Google's head of global technology partners for GCP, in a blog post published by Cisco. "This way, you can write once, deploy anywhere and avoid cloud lock-in, with your choice of management, software, hypervisor and operating system." Boden added that Google will provide a cloud service broker to connect on-premises workloads to GCP services for machine learning, scalable databases and data warehousing.
The partnership with Cisco promises to make GCP a stronger candidate for enterprises considering moving workloads to the Google public cloud, though it's not the first such pact. Among some notable partnerships, Google earlier this year announced Nutanix will run a GCP-compatible implementation of Kubernetes on its hyper-converged systems. And at VMworld, Google and Pivotal launched the Pivotal Container Service (PKS) to provide compatibility between Kubernetes running on vSphere and the Google Container Engine. However, that VMworld announcement was overshadowed by VMware's biggest news of its annual conference: the plan to offer its VMware Cloud on AWS service.
While Cisco is offering customers an alternative to Azure Stack with its new Google partnership, Microsoft has made significant investments in support for Kubernetes orchestration. In addition to its Azure Container Service (ACS) with support for Kubernetes, Microsoft yesterday launched the preview of its managed Kubernetes service, called AKS.
The GCP-compatible Cisco offering is planned for release in the latter part of 2018, with testing to begin early in the year.
Posted by Jeffrey Schwartz on 10/25/2017 at 12:47 PM
If existing high-performance computing (HPC) isn't enough for you, Microsoft is bringing the supercomputing capabilities provided by Cray to its Azure public cloud.
This is a noteworthy deal because Cray has been regarded for decades as the leading provider of supercomputing systems. Cray's supercomputers are capable of processing some of the most complex high-performance and scientific workloads. The two companies today announced what Cray described as an "exclusive strategic alliance" aimed at bringing supercomputing capability to enterprises.
While the term "exclusive" is nebulous these days, this pact calls for the two companies to work together with customers to offer dedicated Cray supercomputers running in Azure datacenters for such workloads as AI, analytics and complex modeling and simulation "at unprecedented scale, seamlessly connected to the Azure cloud," according to Cray's announcement.
The deal marks the first time Cray is bringing its supercomputers to a cloud service provider, according to a statement by Peter Ungaro, the company's president and CEO. The two companies will offer the Cray XC and Cray CS supercomputers with its ClusterStor storage systems for dedicated customer provisioning in Azure datacenters and offered as Azure services.
"Dedicated Cray supercomputers in Azure not only give customers all of the breadth of features and services from the leader in enterprise cloud, but also the advantages of running a wide array of workloads on a true supercomputer, the ability to scale applications to unprecedented levels, and the performance and capabilities previously only found in the largest on-premise supercomputing centers," according to Ungaro.
Cray's systems will integrate with Azure Virtual Machines, Azure Data Lake storage, Microsoft's AI platform and Azure Machine Learning (ML) services. The Cray Urika-XC analytics software suite and the CycleCloud orchestration service, which Microsoft offers following its August acquisition of Cycle Computing, can be used for hybrid HPC management.
The fact that Microsoft would want to bring Cray into the Azure equation is not surprising given CEO Satya Nadella's focus on bringing supercomputer performance to the company's cloud, a priority he demonstrated last year at the Ignite conference in Atlanta. In that keynote, Nadella revealed some of the supercomputing functions Microsoft had quietly built into Azure, including an extensive investment in field programmable gate arrays (FPGAs) throughout the Azure network backbone, bringing 25Gbps backbone connectivity, up from 10Gbps, combined with GPU nodes.
Nadella stepped it up a notch in his opening keynote at the recent Ignite gathering held last month in Orlando, where he revealed an extensive research and development effort focused on trying to one day offer quantum computing. Offering this form of high-performance computing requires breakthroughs in physics, mathematics and software programming that are still many years away. While IBM and others have long showcased some of their R&D efforts, Microsoft revealed that it, too, has been working on quantum computing for many years.
Nadella and his team of researchers said Microsoft will release some free tools by year's end that will let individuals experiment with quantum computing concepts and programming models. The pact with Cray will bring supercomputing processing capabilities to Azure that will solve the most complex challenges in climate modeling, precision medicine, energy, manufacturing and other scientific research, according to Jason Zander, Microsoft's corporate VP for Azure.
"Microsoft and Cray are working together to bring customers the right combination of extreme performance, scalability, and elasticity," Zander stated in a blog post. While it's not immediately clear to what extent, if any, Cray will be working with Microsoft on quantum computing, it's a safe bet that they will do so at some level, if not now, then in the future.
Posted by Jeffrey Schwartz on 10/23/2017 at 12:12 PM
Microsoft has given Azure Stack the green light to run on systems powered by Intel's next-generation Xeon Scalable Processors, code-named "Purley." By validating Azure Stack for the new Purley CPUs, enterprises and service providers can run Microsoft's cloud operating system at much greater scale and with more expansion capability than on the current Intel Xeon E5 v4 family, code-named "Broadwell."
The first crop of Azure Stack appliances, which include those that customers have used over the past year with the technical previews, are based on the older Broadwell platform. Some customers may prefer them for various reasons, notably since gear with the latest processors costs more. Organizations may be fine with older processors, especially for conducting pilots, but for those looking to deploy Azure Stack, the newer generation might be the way to go.
Azure Stack appliances equipped with the new Purley processors offer improved I/O, support up to 48 cores per CPU (compared with 28) and provide 50 percent better memory bandwidth (up to 1.5TB), according to a blog post by Vijay Tewari, Microsoft's principal group program manager for Azure Stack.
Intel and Microsoft have worked to tune Azure Stack with the new CPUs for over six months, according to Lisa Davis, Intel's VP of datacenter, enterprise and government and general manager of IT. In addition to the improved memory bandwidth and increased number of cores, the new processors will offer more than 16 percent greater performance and 14 percent higher virtual machine capacity, compared with the current processors, Davis said in a blog post.
The validation of the new CPUs comes three weeks after Microsoft announced the official availability of Azure Stack from Dell EMC, Hewlett Packard Enterprise and Lenovo, enabling customers and service providers to replicate the Azure cloud in their respective datacenters or hosting facilities. Cisco, Huawei and Wortmann/Terra are also readying Azure Stack appliances for imminent release.
The first units available are based on the older Broadwell processor, and Tewari noted some of the vendors will offer them for the next year. "For customers who want [as] early as possible adoption, Broadwell is a good fit because that's what's going to be off the truck first," Aaron Spurlock, senior product manager at HPE, said during a meeting at the company's booth at Ignite. "For customers who want the longest possible lifecycle on a single platform, [Purley] might be a better fit. But in terms of the overall user experience it's going to be greater [on the new CPUs] 90 percent of the time."
Paul Galjan, Dell EMC's senior director of product management for hybrid cloud solutions, said any organization that wants the flexibility to scale in the future will find systems based on the newer processors and the company's new PowerEdge 14 hyper-converged server architecture a better long-term bet. Systems based on the PowerEdge 14 will offer a 153 percent improvement in capacity, Galjan said in an interview this week.
"It is purely remarkable the amount of density we have been able to achieve with the 14g offering," Galjan said. One of the key limitations of systems based on the Broadwell platform is that they'll lack the ability to expand nodes on a cluster, a capability Microsoft will address in 2018. But it'll require the new Intel CPUs. Galjan said most customers have held off on ordering systems based on Azure Stack, awaiting the new processors from Intel for that reason.
"That's one of the reasons we are being so aggressive about it," he said. "Azure Stack is a future looking cloud platform and customers are looking for a future looking hyper-converged platform." Galjan compared in a blog post the differences between Azure Stack running on its PowerEdge 13 and the new PowerEdge 14 systems.
Lenovo this week also officially announced support for the new Xeon Scalable Processors with its ThinkAgile SX for Azure Stack appliance, and early next year it will be the first to support the Intel Select Solutions program, which includes additional testing designed to ensure verified, workload-optimized and tuned systems.
Select Solutions, a program announced by Intel earlier this year, is a system evaluation and testing process designed to simplify system configuration selection for customers, according to Intel's Davis. It targets high-performance applications that use Azure Stack with an all-flash storage architecture.
Posted by Jeffrey Schwartz on 10/20/2017 at 1:55 PM
Looking to distance itself further from the highest-performing MacBook Pros, Microsoft is unleashing its most powerful Surface PCs to date with the launch of the new Surface Book 2. Microsoft showed off the Surface Book 2 today as an extra surprise tied to the release of the Windows 10 Fall Creators Update, which comes with support for mixed reality and improvements for IT pros, including enhancements to the console.
The new Surface Book 2 will be available Nov. 16, initially from the Microsoft Store, with either 13.5- or 15-inch displays and will be up to five times more powerful than the original Surface Book, which Microsoft launched two years ago. The 13.5-inch model will start at $1,499 with the larger unit starting at $2,499, available with either dual- or quad-core Intel 8th Generation Core processors and the current NVIDIA GeForce GTX GPUs.
Engineered to process 4.3 trillion math operations per second, the newest Surface Books are for those who perform compute-intensive processes such as video rendering, CAD, scientific calculations and compiling code at high speeds. Also targeting gamers, the 15-inch model will also include a wireless Xbox Controller. It has a 95-watt power supply and can render 1080p at 60 frames per second.
"This thing is 'Beauty and the Beast,'" said Panos Panay, corporate VP for devices overseeing Microsoft's entire hardware lineup, during a briefing with Windows Insiders and media, where we got to spend some time with the new units. "There's no computer, no laptop, no product that's ever pushed this much computational power in this mobile form factor. It's quite extraordinary, the performance."
Panay emphasized that the Surface Book 2 aims to bring out the best in the new Windows 10 Fall Creators Update, Office 365, digital inking and mixed reality. Like its predecessor, the Surface Book 2 screen detaches so it can be used as a tablet, or users can fold it back into studio mode. It sports what Panay said is the thinnest LCD available, with a 10-point multitouch display. It also supports a newly refined Surface Pen and the company's Surface Dial.
In addition to the fivefold power boost over the original Surface Book of two years ago (the 13.5-inch is four times more powerful), Panay said the newest version is twice as powerful as the latest MacBook Pro and boasts 70 percent greater battery life in video playback mode. Microsoft is claiming 17 hours of battery life (5 hours when used as a tablet), though laptops from Microsoft or anyone else rarely reach those maximums. Nevertheless, it is designed to ensure all-day use in most conditions.
The 13.5-inch model weighs 3.38 pounds, with the 15-inch unit weighing 4.2 pounds with the keyboard attached. "There has never been this much computational power in a mobile form factor this light," Panay said in a blog post announcing the Surface Book 2. "You can show off your meticulously designed PowerPoint deck or complex Pivot tables in Excel with Surface Book 2's stunningly vibrant and crisp PixelSense Display with multi-touch, Surface Pen, and Surface Dial on-screen support. You won't believe how much the colors and 3D images will pop in PowerPoint on these machines."
The 15-inch model offers just shy of 7 million pixels, or 260 DPI, which Microsoft claims is 45 percent more than the MacBook Pro. The 13.5-inch model produces 6 million pixels at 267 DPI (3000x2000).
Among the various configurations, the new models are available with a choice of Intel 7th or 8th Generation Core i5 or i7 processors, in dual- and quad-core options. Depending on the model, customers can choose between 8GB and 16GB of RAM with 256GB, 512GB or 1TB of SSD storage.
Posted by Jeffrey Schwartz on 10/17/2017 at 11:44 AM
Nearly a year after rolling out its Azure Functions serverless compute option for running event-driven, modern PaaS apps and services, Microsoft has given it a cross-platform boost. The company announced it had ported the Azure Functions service to the new .NET Core 2.0 framework during the Ignite conference in Orlando, Fla., late last month. On the heels of that release, Microsoft made available a public preview of its Java runtime for Azure Functions during last week's JavaOne conference in San Francisco.
Azure Functions provides elastic compute triggered by events within any service in Azure or a third-party service, in addition to on-premises infrastructure, according to Microsoft. By porting it to .NET Core 2.0, both the Azure Functions Core Tools and runtime are now cross-platform, Microsoft announced at Ignite, though it acknowledged in the Sept. 25 post that there are some known issues and functional gaps.
Java support in Azure Functions has been a top request, according to the announcement posted last week by Nir Mashkowski, partner director for the Azure App Service. "The new Java runtime will share all the differentiated features provided by Azure Functions, such as the wide range of triggering options and data bindings, serverless execution model with auto-scale, as well as pay-per-execution pricing," Mashkowski noted.
The preview includes a new Maven plug-in, a tool for building and deploying Azure Functions from that environment, he noted. "The new Azure Functions Core Tools will support you to run and debug your Java Functions code locally on any platform," he said.
Until now, Azure Functions supported C#, F#, JavaScript (Node.js), PowerShell, PHP, Python, CMD, BAT and Bash. In addition, the new Azure Functions runtime is open source and available on GitHub. Azure Functions integrates with SaaS applications through a list of interfaces and supports authentication via standard OAuth providers including Azure Active Directory, Microsoft accounts, Facebook, Google and Twitter.
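For a feel of the programming model, here is a minimal HTTP-triggered function in Python. Python support was experimental at the time, so this sketch follows the current Python model, where an accompanying function.json declares the HTTP trigger and names the entry point, and may not match the preview-era bindings exactly:

```python
import azure.functions as func

# Entry point named in function.json; the runtime injects the HTTP request
# and returns whatever HttpResponse the function hands back.
def main(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```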
Microsoft also posted some five-minute tutorials that demonstrate how to build and deploy Java app services and serverless functions, and it held two sessions at JavaOne on building and deploying serverless Java apps in Azure that are now available for replay.
Posted by Jeffrey Schwartz on 10/13/2017 at 12:31 PM
If there was any hope that Microsoft had plans to come out with new phones based on Windows 10 Mobile, or to add new features to that version of the OS this year, HP and Microsoft both appear to have dashed it.
The chatter about the all-but-forgotten Windows Phone emerged a week ago when The Register reported that HP will no longer sell or support its Elite x3 Windows-based phone after the end of 2019 and would only offer whatever inventory is still available. The report quoted Nick Lazaridis, who heads HP's EMEA business, speaking at the Canalys Channels Forum. Noting that HP had insisted as recently as August that it was committed to the platform, and specifically the Windows Continuum feature, Lazaridis attributed the change to Microsoft's shift in strategy with Windows Phone.
HP's Elite x3 3-in-1 device received positive reviews, including one published by Redmond magazine earlier this year. However, Samsung's release of the Galaxy S8 phone and DeX Station dock, which lets users dock their phones and run Windows-based remote desktops and virtual apps, has proven a strong option for those who want Windows on a smartphone.
Asked to confirm HP's plan, a company spokeswoman responded: "We will continue to fully support HP Elite x3 customers and partners through 2019 and evaluate our plans with Windows Mobile as Microsoft shares additional roadmap details," according to the e-mailed statement. "Sales of the HP Elite x3 continue and will be limited to inventories in country. HP remains committed to investing in mobility solutions and have some exciting offerings coming in 2018."
In the wake of the HP report, Joe Belfiore, corporate VP in Microsoft's Operating Systems Group, posted a series of Twitter responses on Sunday night about Windows Phone support. One question was about whether it was time to give up on using Windows Mobile.
"Depends who you are," he tweeted. "Many companies still deploy to their employees and we will support them!" But, he continued: "As an individual end-user, I switched platforms for the app/hw diversity. We will support those users too! Choose what's best 4 u."
Responding to a tweet asking directly about the future for Windows Phone, Belfiore stated what was until now largely presumed: "Of course we'll continue to support the platform. bug fixes, security updates, etc. But building new features/hw aren't the focus," he said with regret (based on his emoji).
If that left ambiguity, a response from a Microsoft spokesperson shouldn't. "We get that a lot of people who have a Windows 10 device may also have an iPhone or Android phone, and we want to give them the most seamless experience possible, no matter what device they're carrying," according to the spokeswoman. "In the Fall Creators Update, we're focused on the mobility of experiences and bringing the benefits of Windows to life across devices to enable our customers to create, play and get more done. We will continue to support Lumia phones such as the Lumia 650, Lumia 950 and Lumia 950 XL, as well as devices from our OEM partners."
Posted by Jeffrey Schwartz on 10/11/2017 at 1:42 PM
Amazon Web Services yesterday announced that it is the "preferred cloud provider" for General Electric, one of the world's largest industrial conglomerates. However, what was not stated in the announcement is that it only pertains to GE's internal IT apps. GE is moving forward with its plans announced last year to run its Predix industrial exchange on Microsoft Azure.
Responding to my query about the nature of yesterday's announcement, issued only by AWS, a GE spokeswoman clarified the company's plans. "The AWS announcement refers to our internal IT applications only," she stated.
The plan to run GE Predix, the giant industrial cloud platform aimed at building and operating Internet of Things (IoT) capabilities, on Azure was announced at Microsoft's Worldwide Partner Conference (WPC) in July of 2016. Predix is a global exchange and hub designed to gather massive quantities of data from large numbers of sensors and endpoints and apply machine learning and predictive analytics to the data to optimize operations and apply automation. The company uses it for its own customers and partners.
Jeffrey Immelt, the chairman and CEO of GE who recently retired, joined Microsoft CEO Satya Nadella on stage during the keynote session kicking off last year's WPC, where he described GE's plans to integrate Predix with the Azure IoT Suite and Cortana Intelligence Suite.
"For our Predix platform, our partnership with Microsoft continues as planned," according to the GE spokeswoman. "By bringing Predix to Azure, we are helping our mutual customers connect their IT and OT [operational technology] systems. We have also integrated with Microsoft tools such as Power BI (which we demonstrated at our Minds + Machines event last year) and have a roadmap for further technical collaboration and integration with additional Microsoft services."
GE will have more to say about that at this year's annual Minds + Machines conference, scheduled to take place at the end of this month in San Francisco.
Posted by Jeffrey Schwartz on 10/06/2017 at 10:27 AM
Microsoft and NetApp are working to deliver a native version of the storage provider's Network File System (NFS) service for Azure. The new Enterprise NFS Service is based on NetApp's flagship Data ONTAP storage operating system and management platform and will be available for public preview in early 2018.
The deal is the latest of many partnerships the two companies have inked over the years to let enterprises move their enterprise storage workloads to Microsoft Azure. The new Azure Enterprise NFS Service was announced at NetApp Insight, the company's annual customer and partner conference.
The announcement was overshadowed by the fact that the conference is taking place at the Mandalay Bay hotel in Las Vegas, the site of Sunday night's deadly shooting massacre. While the preconference sessions were cancelled on Monday because the hotel was still on lockdown, the conference resumed yesterday and was marked by a moment of silence.
Administrators will be able to access the new service from the Azure console, which NetApp said will appeal to cloud architects and storage managers seeking to bring NFS services natively to Azure for workloads such as database-oriented analytics, e-mail and disaster recovery. In a statement, Anthony Lye, senior VP of NetApp's cloud business unit, said the solution will use NFS to provide "visibility and control across Azure, on-premises and hosted NFS workloads."
The offering will enable provisioning and automation of NFS services at scale using RESTful APIs, with added data protection through the ability to create on-demand, automated snapshots. The service will support both NFS v3 and v4 workloads running in Azure, as well as hybrid deployments, according to NetApp. It will also include integration with various Azure services, including SQL Server and SAP HANA on Azure.
NetApp also said the pact calls for integrating Data ONTAP software with Azure Stack, which CEO George Kurian said in a recorded video presentation will speed the migration of enterprise applications to Azure Stack and Azure.
Kurian also said the two companies are integrating NetApp's recently launched FabricPool technology, which tiers cold data from all-flash arrays to the cloud in a cost-effective manner, with Azure Blob Storage. The integration "gives customers a really capable hybrid cloud data service and allows them to optimize their own datacenters," he said.
NetApp's Cloud Control for Office 365 now supports Azure Storage and will soon be available in EMEA and APAC regions, allowing local instances of Exchange, OneDrive and SharePoint. The two companies are also working to provide more extensive integration between Cloud Control for Office 365 and the NetApp AltaVault archiving platform, enabling customers to choose among hot, cool and cold storage options in Azure for backup and disaster recovery requirements.
Posted by Jeffrey Schwartz on 10/04/2017 at 11:44 AM
As the world today awoke to the news of last night's horrific mass shooting that originated from a 32nd floor room of the Mandalay Bay hotel in Las Vegas, killing at least 58 and injuring more than 550, many IT pros had just arrived or were on their way to NetApp's Insight conference.
The NetApp conference was set to begin today with preconference technical sessions but, as the shooting unfolded, the hotel was evacuated and all of today's activities were cancelled. NetApp later in the day said it had decided to resume the conference tomorrow for those already in attendance, with the keynote session to be kicked off by President and CEO George Kurian.
"I am shocked and saddened by the tragic event that occurred in Las Vegas at the Mandalay Bay last night," Kurian said in a statement. "I am sure you all share these sentiments. My heard and the hearts of thousands of NetApp employees break for the loved ones of those affected by the terrible events. NetApp will stand strong in the face of senseless violence and continue with the conference for those who want to attend."
Also kicking off this week is the Continuum Conference, an event for managed services providers, taking place further down the Las Vegas Strip at the Cosmopolitan. It too is set to go on, as reported by SMBNation's Harry Brelsford.
Some expressed mixed opinions on whether NetApp should resume the conference while the hotel remains an investigation scene. Some attendees were looking to help and donate blood, while others, as of last evening, were still unable to return to their rooms.
This tragedy, which has shocked the world, is also unnerving to many IT professionals who find themselves in Las Vegas several times a year for large conferences. Las Vegas hosts hundreds of conferences of all sizes every year, and it is the site of many major tech gatherings. Many of you, myself included, were at the Mandalay Bay just over a month ago for VMworld 2017. Many other tech vendors and conference organizers, including Veritas, Okta, Cisco, Hewlett Packard Enterprise, Dell EMC and VMware, have held major conferences in Las Vegas.
TechMentor, produced by Redmond magazine parent 1105 Media, was last held in Las Vegas in early 2016. The upcoming TechMentor conference is scheduled to take place in Orlando, Fla., Nov. 12-17 as part of the Live! 360 cluster of conferences, and will return to the Microsoft campus in Redmond next summer, as it did this year. After years of steering its conferences away from Las Vegas, Microsoft recently announced that next year's Inspire partner conference will take place in Las Vegas.
"My heart and thoughts are with all the people, families and responders in Las Vegas impacted by this horrific and senseless violence," Microsoft CEO Satya Nadella tweeted. VMware CEO Pat Gelsinger, pointed to a GoFundMe page posted by Steve Sisolak, Clark County Commission chair from Las Vegas for victims. "Our thoughts and prayers this morning are with the victims and families of Las Vegas shooting," Gelsinger tweeted.
NetApp is one of the largest providers of enterprise storage hardware and software with $5.52 billion in revenues for its 2017 fiscal year, which ended April 28. Many storage experts attending Microsoft's Ignite conference, which wrapped up on Friday, had signaled they were headed to Las Vegas for the NetApp event. Software and cloud providers listed as major sponsors of the NetApp technical event include Amazon Web Services, Cisco, Fujitsu, Google Cloud, IBM Cloud, Intel, Microsoft, Rackspace, Red Hat, VMware and many others, according to the company's sponsorship roster.
The shooter is alleged to have targeted an outdoor festival, firing round after round at the crowd for more than 10 minutes. It is believed to be the deadliest shooting by an individual in U.S. history, in terms of the number of deaths and casualties. People have been asked to donate blood if they can and, according to reports, attendees at the NetApp conference and other events taking place are doing that and helping out however they can.
Unfortunately, what has transpired in Las Vegas could happen in many places in the U.S. and abroad. Hotels may now have to consider scanning the bags of guests just as airlines now do, and we need to remain aware of our surroundings, which we've become conditioned to do since the attacks of Sept. 11, 2001.
Posted by Jeffrey Schwartz on 10/02/2017 at 1:54 PM
Microsoft created some interesting buzz at this year's Ignite conference with the news that it is putting Bing at the center of its artificial intelligence and enterprise search efforts.
The new Bing for Business will be a key deliverable from the new AI Research Group Microsoft formed a year ago this week, led by Harry Shum, which brought together Microsoft Research with the company's Bing, Cortana, Ambient Computing and Robotics and Information Platform groups.
Bing for Business brings the Microsoft Graph to the browser, allowing employees to conduct personalized and contextual search incorporating interfaces from Azure Active Directory, Delve, Office 365 and SharePoint. Li-Chen Miller, partner group program manager for AI and Research at Microsoft, demonstrated Bing for Business in the opening keynote session at Ignite, showing how to discover her organization's conference budget.
Using machine reading and deep learning models, Bing for Business went through 5,200 IT and HR documents. "It didn't just do a keyword match. It actually understood the meaning of what I was asking, and it actually found the right answer, the pertinent information for a specific paragraph in a specific document and answered my question right there," Miller said. "The good news is Bing for Business is built on Bing and with logic matching, it could actually tell the intent of what I was trying to do."
Miller added that organizations deploying Bing can view aggregated but anonymized usage data. "You can see what employees are searching for, clicking on and what they're asking, so you can truly customize the experience," she said. It can also be used with Cortana.
The idea behind Bing for Business is it takes multiple approaches to acquiring information within an enterprise -- such as from SharePoint, a file server and global address book -- and applies all the Active Directory organizational contexts, as well as Web search queries, to render intelligent results, said Dave Forstrom, Microsoft's director of conversational AI, during an interview at Ignite.
"If you're in an enterprise that has set this up, now you can actually work that into your tenant in Office 365 and then it's set up through your Active Directory for authentication in terms of what you have access to," Forstrom said.
Quite a few customers are using it now in private beta, according to Forstrom. The plan is to, at some point, deliver it as a service within Office 365.
Seth Patton, general manager of Microsoft's Office Productivity Group, said in a separate interview that the Microsoft Graph brings together the search capabilities into a common interface. It also includes Microsoft's Bot Framework.
"Being able to have consistent results but contextualized in the experience that you're in when you conduct the search is just super powerful," Patton said. "We've never before been able to do that based on the relevance and the contextual pieces that the Graph gives."
Posted by Jeffrey Schwartz on 09/28/2017 at 10:40 AM
Rackspace will secure Hyper-V workloads via its managed security service offering and has announced that its Microsoft Azure offering is now PCI-certified. The company, one of the world's largest managed services providers, is talking up the new Hyper-V protection capabilities at this week's Microsoft Ignite conference, taking place in Orlando, Fla.
The Hyper-V protection extends the company's Rackspace Managed Security (RMS) service, which provides overall threat protection using analytics and remediation across the company's managed services offerings. Also, a new Rackspace Cloud Replication for Hyper-V will be offered under the umbrella of the Rackspace portfolio of Microsoft services. The managed Rackspace Cloud Replication for Hyper-V offering is based on Microsoft's Azure Site Recovery and offers replication, storage and failover of Hyper-V virtual machines.
While Rackspace said the new offering lets organizations use the managed service to target Microsoft Azure rather than on-premises infrastructure, it also provides an alternative to moving those Hyper-V workloads into an infrastructure-as-a-service scenario, said Jeff DeVerter, CTO of Microsoft technologies at Rackspace, during an interview at Ignite.
"As excited as the world is about the cloud, not every workload is ready to be made into a modern application, and so customers can actually have a single tenant using Hyper-V and solve the problem of getting out of the datacenter, which most companies want to do today," DeVerter said. "But they don't have to take on the added expense of running an IaaS inside of Azure, and IaaS workload tends to cost more in azure than it does in a tenant."
DeVerter said running them in Hyper-V solves the problem of moving those workloads out of on-premises datacenters. Rackspace can work with customers over time to start extending those applications out into Azure, if transforming the application is the ultimate goal. Rackspace already offered protection of workloads for VMware-based VMs.
Rackspace also used Ignite to announce that its managed Microsoft Azure service is now PCI-certified for those running workloads that carry payment data. Microsoft Azure is already PCI-compliant, but because Rackspace manages the workloads, it too had to secure the certification.
Posted by Jeffrey Schwartz on 09/27/2017 at 2:25 PM
The Azure management portal isn't just for Microsoft's public cloud. Microsoft kicked off its annual Ignite conference in Orlando, Fla., this week by announcing that Azure Stack appliances from Dell EMC, Hewlett Packard Enterprise and Lenovo are now available, along with news that it is bringing PowerShell, change tracking, update management, log analytics and simplified disaster recovery to the Azure portal.
Azure Stack appliances are the cornerstone of Microsoft's long-stated hybrid cloud strategy, bringing on-premises infrastructure in line with the same portal management experience as its Azure cloud and providing the ability to build and provision instances and applications with common APIs. "Azure Stack also enables you to begin modernizing your on-premises applications even before you move into the public cloud," said Scott Guthrie, Microsoft's executive VP for cloud and infrastructure, speaking in a keynote session at Ignite today.
"The command and control [are] identical," said Sid Nag, Gartner's research director for cloud services, during an interview at Ignite following the Guthrie's session. "If I have a craft, I don't have to learn new skills. I can transition very smoothly without a learning curve."
However, like any major new piece of infrastructure, despite significant interest, the pace and number of deployments remain to be seen, according to Nag. "Clients have been looking for an onramp to the public cloud, but they are not ready to commit," Nag said.
Microsoft maintains that enterprises should embrace the hybrid path it has championed for some time, but the company is also apparently giving them a nudge by bringing the Azure portal to their world, whether or not they use the public cloud. Adding these new capabilities brings the portal even to those not using Azure Stack. During the session, Corey Sanders, Microsoft's director of Azure compute, demonstrated the new features coming to the Azure portal.
PowerShell Built into Azure Portal: PowerShell is now built into the Azure portal, aimed at simplifying the creation of virtual machines. "It's browser-based and can run on any OS or even from an iPhone," Sanders said. "If you are familiar with PowerShell, it used to take many, many commands to get this going. Now it takes just one parameter," he said. "With that, I put in my user name and password and it creates a virtual machine, so you don't have to worry about the other configurations unless you want to." Sanders said IT pros can use classic PowerShell WhatIf queries to validate what a given command will do.
Change Tracking: This capability tracks every change on a running virtual machine, including every file, event and registry change. It can scan a single machine or an entire environment, letting the IT pro discover all changes and investigate anything that requires attention.
Log Analytics: Administrators can now call on a set of prebuilt operations and statistics to discover how many threats exist. It looks beyond the built-in antimalware, letting the administrator go into the analytics designer to create queries that, Sanders said, are simple to write. "They are very SQL-like and allow me to do very custom things," he said. For example, a query can look at the processor time of all the virtual machines in a specific subscription over the last seven days, group them by computer and display a time chart showing CPU spikes across those seven days.
Update Management: Administrators looking to see which updates or patches have been installed, or are awaiting installation, can use this new feature in the Azure portal. It displays details of what the updates include and allows the administrator to choose which ones to act on. Sanders emphasized this works across an entire Azure or on-premises infrastructure of Windows and Linux machines.
Disaster Recovery: Noting that planning how to back up infrastructure and ensure a workable recovery plan is complex, Sanders said the site recovery capability in the Azure portal lets administrators pick a target region and provides a picture of how a failover scenario will actually appear. "The key point here is this isn't just a single machine. You can do this across a set of machines, build a recovery plan across many machines, do the middleware and actually run scripts according to that plan," he said.
Microsoft said these features will appear in preview mode today. The company hasn't disclosed a final release date.
Posted by Jeffrey Schwartz on 09/25/2017 at 12:01 PM
Veritas is upgrading its entire data management software portfolio with performance improvements and more extensive support for hybrid and public clouds with extended integration with Microsoft Azure.
The deeper Azure integration is key among the announcements at this week's annual Veritas Vision customer and partner conference in Las Vegas, along with news that the company is extending data deduplication and optimization in its flagship NetBackup and Backup Exec backup, recovery and DR offerings. The company today is also introducing a software-defined storage offering designed for massive amounts of data, called Veritas Cloud Storage, which applies intelligent analytics and classification to improve data discovery.
Late in addressing support for the major public clouds, including Amazon Web Services, Microsoft Azure and Google Cloud, Veritas stepped up its effort last year after spinning off from Symantec by delivering native integration with them in NetBackup 8 and Backup Exec 16. At the time, Veritas pledged to offer better support for all three of the largest global public clouds, making data protection and management across hybrid environments a priority.
Among the key promises at last year's Vision partner and customer conference, Veritas announced a partnership with Microsoft to add deeper support for its Azure public cloud. Earlier this year the two companies extended their agreement to include Veritas data archiving solutions, including Enterprise Vault.
The deliverables announced this week include integrated support of Veritas' data archiving, management and orchestration tools with Azure. Specifically, the Veritas Resiliency Platform (VRP), which provides backup and disaster recovery management, orchestration and automation across multiple clouds and on-premises storage, now establishes Azure as a recovery target. It provides monitoring of failover and failback to and from Azure.
Also, a new release of the Veritas Access software-defined storage NAS scale-out file system supports Azure cloud storage. Veritas Access enables simultaneous access for multiprotocol file and object types such as NFS, CIFS, FTP, SMB and S3. And the company's relatively new Veritas Information Map, a visualization tool designed to give administrators a real-time, aggregated and detailed view of where data is stored for cost management and risk mitigation, will support Azure Blob Storage and Azure File Storage, as well as other Microsoft cloud storage services.
Veritas launched Information Map with a grand plan of giving administrators extensive visibility across the spectrum of cloud and on-premises storage. Building on that product, the company announced the new Connection Center, designed to provide visibility into 23 different cloud stores, with more planned. In addition to adding support for Azure Blob Storage and Azure File Storage, Veritas said it will be rolling out connectors to OneDrive, Google Drive, Box, G Suite, Google Cloud Platform Cloud Storage, AWS S3, Office 365, Exchange Online and SharePoint Online.
The company said Veritas Information Map can help determine which data must be preserved to help meet compliance regulations, such as the forthcoming European Union's General Data Protection Regulation (GDPR), which takes effect next year. Veritas Information Map connectors will support a variety of other Microsoft and third-party data sources. This integration with Azure will be available in the coming quarters.
The new NetBackup 8.1, aimed at large enterprises, brings enhanced data deduplication to improve backup and recovery times and lower bandwidth utilization. Also new in the release is Parallel Streaming, aimed at supporting hyperconverged clusters and scale-out backups for big data workloads, including support for HDFS.
A new release of Veritas Backup Exec 16 FP2 for small and midsize organizations also offers improved deduplication and the new CloudConnect Optimizer to improve network bandwidth utilization. Also building on its support for Azure released last year, the Backup Exec upgrade will add compatibility with Azure Government and Azure Germany, as well as Amazon Web Services S3 storage and Google Cloud regional class storage.
The new Veritas Cloud Storage offering, now in beta, is aimed at huge amounts of data. The company said the new data store, in addition to supporting AWS S3 and standard Swift object storage protocols, will offer support for the MQTT and CoAP protocols to address new IoT workloads. Veritas said organizations can use it in geodistributed scenarios, and it can support complex storage policies suited for GDPR and other environments where sensitive data are handled.
Posted by Jeffrey Schwartz on 09/20/2017 at 12:32 PM
With the scope of major data breaches growing, each one is harder to top, although security experts know there's no limit to how bad the next one could be. The compromise of 143 million individual accounts reported by Equifax on Sept. 7, which included names, birthdates and credit card numbers, may be one of the most damaging breaches disclosed to date. Apparently tied to the Equifax breach, news surfaced Friday that information on more than 200,000 credit card accounts was also stolen.
The way Equifax executives and its IT security team appear to have failed to adequately apply patches, the amount of time it took to discover the depth of the breach and the delay in ultimately reporting it certainly paint a picture of a colossal failure at all levels, including the curiously timed stock sales by top executives (who deny knowledge of the breach at the time of the sale) just days before the disclosure, as reported by Bloomberg.
Fallout from the breach has, not surprisingly, led to the reported departures of CIO Dave Webb and CSO Susan Mauldin late last week. Signs of trouble trace back to March 8, when Cisco warned that a security flaw in Apache Struts, the open source, Java-based framework widely used on interactive Web sites, already was "being actively exploited." That warning came months before the July 29 discovery of trouble at Equifax and the Sept. 7 revelation of how many customer records were potentially stolen, according to a detailed report published by The Wall Street Journal today.
While the report noted many details remain unknown, it is understood that hackers pillaged information between May and the July 29 discovery. A few days later, Equifax brought in security consulting firm Mandiant, now a unit of FireEye and associated with many high-profile forensics investigations including the Yahoo breach last year, when data on more than 1 billion accounts were exposed.
Initially, Mandiant believed that 50 million accounts were compromised. But as its investigation continued, it determined it was nearly three times that amount, according to the report, which also noted the company registered the EquifaxSecurity2017.com domain for customers to seek information.
The report also noted last week's revelation by Alex Holden, founder of Hold Security, that an Equifax portal in Argentina was "wide open, protected by perhaps the most easy-to-guess password combination ever: 'admin/admin,'" as he told the KrebsOnSecurity Web site.
The true impact of the Equifax breach is yet to unfold, but it already has brought new awareness to risks that many have long overlooked or ignored. How organizations address their own risks in the wake of this remains to be seen.
Posted by Jeffrey Schwartz on 09/18/2017 at 7:39 PM
Once rumored as an acquisition target by a larger cloud provider, now it's Rackspace that's getting bigger. The company today said it is making its largest acquisition to date with its agreement to buy rival Datapipe, a combination that will bring together two leading providers of managed hosting and cloud services.
While terms weren't disclosed, the deal is expected to close by the end of the year, pending financing and regulatory approval. Bringing together the two companies will create a managed hosting and cloud service provider with a total of 40 datacenters throughout the world, making it the largest such provider, according to Rackspace CEO Joe Eazor.
"It will have a big impact on our ability to deliver the multi-cloud services that today's customers are demanding," Eazor said, in a blog post. "They want us to give them expert, unbiased advice, and manage their applications on whichever clouds will best meet their unique needs. They want us to serve them at scale, from data centers and offices across the globe. And they want the world's best customer experience, across digital tools and results-obsessed customer service. Our mission is to meet those needs -- today and as they evolve."
Rackspace, based in San Antonio, has 11 datacenters worldwide and Jersey City, N.J.-based Datapipe runs 29. Both are privately held and offer managed public and multi-cloud services that collectively include the Alibaba Cloud, Amazon Web Services, Google Cloud Platform and Microsoft Azure.
The companies also have distinct services. Datapipe offers Alibaba Cloud integration, giving Rackspace a foothold in China, while Datapipe customers will be able to benefit from Rackspace's Google-based services. Rackspace, which was a key creator of OpenStack before contributing it to the open source community, offers managed OpenStack services. Rackspace also offers VMware-hosted services and, through its Microsoft partnership, managed Exchange, SharePoint and Windows Server hosting. It also provides Office 365 and Google at Work services, and now offers managed Oracle and SAP services as well.
Datapipe will boost the Rackspace portfolio in several ways. It has a strong public sector business with such customers as the U.S. Department of Defense, the Department of Energy and the Treasury Department, and holds FedRAMP and FISMA certifications. It also serves agencies outside the U.S., including the U.K. Cabinet Office, the Ministry of Justice and the Department of Transport. Datapipe gives Rackspace an extended presence in many places where it lacked one or was limited, including the U.S. West Coast, Brazil, China and Russia. Datapipe also brings added managed public cloud migration services, colocation services covering four continents and experience bringing cloud services to midsize organizations as well as large enterprises.
The agreement is indeed a reversal of fortunes for Rackspace, which had struggled to grow a few years ago and had put itself up for sale in 2014. After failing to forge an acceptable deal, Rackspace decided to go it alone, and added public cloud services from AWS, Google and Microsoft to its managed services portfolio. The company also went private last year after Apollo Global Management and its investors acquired it for $4.3 billion. Eazor came in as Rackspace's CEO earlier this year after leading EarthLink, which was acquired late last year by Windstream.
Posted by Jeffrey Schwartz on 09/11/2017 at 3:03 PM
Building on its goal to extend the single sign-on capability of its cloud-based directory service, Okta has added native LDAP support to its Okta Universal Directory and has extended its multifactor authentication (MFA) offering to bypass on-premises ADFS servers, among other services. The moves are among several upgrades to its SSO portfolio announced at the company's annual Oktane gathering, held last week in Las Vegas.
Until now, Okta has connected to LDAP directories, which are often found on legacy on-premises applications, security and network systems, by using replication. Now Okta's directory supports the LDAP protocol natively, allowing LDAP-based applications to authenticate against it directly, which the company said eliminates the need for multiple on-premises directories tied to specific systems and applications, including VPNs.
"You just point [applications] at the Okta Universal Directory, and it speaks the protocol, and you're integrated and on your way," said Okta CEO Todd McKinnon. "You can now retire those legacy directory workloads, make it much easier for you and more cost effective." By adding LDAP support, organizations can eliminate multiple on-premises directories, IDC Analysts Tom Austin and Frank Dickson, noted in a research note.
Okta said it is also responding to the growing push to bring multifactor authentication into broader use. The company said basic two-factor authentication will be a standard feature for all customers. "Every company using our SSO product gets basic multifactor authentication for free," McKinnon said. "We think it pushes the industry forward. It makes it incredibly easy to deploy multifactor authentication in a usable, scalable way across your entire ecosystem and we think this will push the security industry forward."
The company has added new functionality to its Adaptive MFA (AMFA) offering, which provides context-based security capabilities. Among them is a new capability that will prevent its users from using common passwords or those already exposed in known breaches. Okta has also added IP blacklisting to protect against DDoS attacks. "AMFA can also detect anomalies based on user location and client, and determine whether [an] authentication event is using a trusted/untrusted device," Austin and Dickson noted.
AMFA can also now be used with LDAP as well as a broader set of on-premises custom applications, including ADFS, Web apps, RADIUS and other SSO products such as CA SiteMinder, Oracle Access Manager and IBM's Tivoli Access Manager, among others. "We've now extended our Adaptive MFA offering, enabling you to connect to anything behind an ADFS server, also to connect directly to anything speaking the remote desktop protocol," McKinnon said. "What you are seeing here is a broadening and a deepening of this integration and this product. It's not about applications, it's about being securely connected to everything. This is critical as you manage and secure your extended enterprise."
Okta, which said it now has 3,000 customers using the Okta Identity Network and 5,000 native connections, also announced a new developer toolkit and integrations with a number of providers including ServiceNow, Workato, Palo Alto Networks, Cisco, F5 Networks, Citrix (NetScaler), Akamai, Box, Google (G Suite and Google Cloud), Sumo Logic, Splunk, Skyhigh, Netskope, MuleSoft, IBM (DataPower and QRadar) and Amazon Web Services.
Okta is regarded as a leading provider of SSO services to large enterprises and one whose business is now easier to gauge than others, thanks to the fact that it went public earlier this year. While the Andreessen Horowitz-backed company is still quite in the red, Okta surprised Wall Street yesterday by beating revenue estimates and upping its forecast for the rest of the year. Revenues of $61 million during its second FY18 quarter rose 63 percent over the same period last year. The company showed incremental progress on its road to profitability, reporting a net loss of $27.2 million, or 44.5 percent of total revenue, compared with $20.6 million, or 54.9 percent of revenues, a year earlier.
Posted by Jeffrey Schwartz on 09/08/2017 at 10:18 AM
VMware has upped the stakes in delivering unified client and application enrollment and management with broad extensions to its Workspace One platform. In addition to configuring mobile phones and tablets, Workspace One can now enroll and manage Windows 10 devices and Chromebooks, with Mac support coming this fall, VMware announced at the company's VMworld 2017 conference in Las Vegas this week. VMware also revealed that Workspace One will enforce Office 365 data loss protection policies and add peer-to-peer distribution of policies and patches via the Adaptiva software it licensed earlier this year, along with automation of Windows desktops in its Horizon offering.
Employees can now enroll their own or company-provided Windows 10 PCs the same way they configure their mobile phones and tablets, using Workspace One's AirWatch mobile device management (MDM) rather than joining them to an Active Directory domain. AirWatch last week also became the first third party to support the new Google Enterprise option for managing Chromebooks, scheduled for release by the end of September. The Mac client support will come this fall when Apple releases the next version of its operating system, known as High Sierra.
"What we are trying to do is give our customers and users a great experience to access all of their applications on any device in a consumer-simple and enterprise-secure way," said Sumit Dhawan, senior VP and general manager of the company's End User Computing business in his keynote address Monday afternoon at VMworld. "We do this with two things: disruptive innovations in our product and those partnerships in the ecosystem. The reason is, today's environment that most customers have are siloes. Siloes of our desktop, mobile and line of business applications. And with the product innovations and partnerships we believe we can stitch together [them] into a platform."
That common user and management experience gives Workspace One a much broader capability than existing MDM offerings including Microsoft's Enterprise Mobility + Security (EMS) suite, said Mitch Berry, VP of Unified Endpoint Management at Mobi, which provides managed mobility lifecycle management services and software. "I think their technology is a lot more advanced than a Microsoft, or a MobileIron or Citrix in that the experience they are able to provide across multiple device types really gives them the lead," said Berry, whose company has partnerships with all of the major MDM providers including Microsoft.
"Few vendors provide the breadth of Workspace ONE's offering, and VMware did a good job of telling a comprehensive EUC transformation story at VMworld," said Gartner Analyst Andrew Garver. Enterprises looking to shift to this more holistic approach to system and applications management, which Gartner calls "unified workspaces," will find the Workspace One appealing, according to Garver, because it provides "modern management across traditional and mobile endpoints, tight coupling with Horizon VDI and Apps and robust set of gateways for both cloud and on-premises."
The unified MDM capability is now possible because Microsoft, Apple and Google have released their management APIs for Windows, Mac OS X and the Chrome operating system. Microsoft did so earlier this year when it released the Intune portion of its Graph APIs. Google said it would do the same when it made its partnership announcement last week with VMware to enable Chromebook management with Workspace One and Apple came on board during Dhawan's VMworld keynote when Matt Brennan, head of global enterprise strategy at Apple, joined him on stage.
"Within VMware, we have leveraged those public APIs extensively," Dhawan said. By "extensively," Dhawan explained that their use goes beyond just enrollment and providing policy management; it's about integrating identity management and applying context, while striking a balance between providing user control and privacy and ensuring that corporate data remains secure.
Dhawan said Workspace One has evolved to meet its mission of bringing mobile, desktop and application management together. The company has added the VMware Identity Manager into its AirWatch console, which it said will provide a common interface for managing devices, context and identity. It also has a simplified mobile single sign-on interface and, using the Microsoft Graph API, it can apply Office 365 enrollment and management, as well as support for other SaaS apps. The new Workspace One release will manage and enforce security policies and provide Office 365 data loss prevention (DLP) upon Microsoft's release of the Office APIs.
"It gives you one way of unifying the experience across all applications and one place to unify your management across all devices," Dhawan said. "This we believe is a massive change and we think is a great opportunity for you."
Workspace One will enable administrators to control how policies, patches and upgrades are pushed out to branch offices using the Adaptiva OneSite tool that VMware licensed earlier this year. By distributing the updates on a peer-to-peer basis using a content delivery network (CDN), organizations don't need to have servers at those branch locations, said Jason Roszak, VMware's Windows 10 director of product management.
In addition to enabling PCs, Macs and Chromebooks to be configured and managed like mobile devices, VMware also said that the Workspace One Horizon 7 VDI and virtual application platform will be available on Microsoft's Azure cloud service in October. VMware, which announced its plans to offer Horizon 7 on Azure back in May, released the technical preview last week. The company, which first extended Horizon beyond vSphere to the IBM Cloud earlier this year, said the Horizon Cloud service running on Microsoft Azure will start at $8 per user, per month.
VMware also plans to enable automation of Windows desktops and applications using its Just in Time Management Platform (JMP) tools, which include Instant Clone, VMware App Volumes and User Environment Manager, by bringing them into a single console.
That will let administrators more easily design desktop workspaces based on a user's needs, said Courtney Burry, senior director of product marketing for Horizon at VMware, who gave a demo of the new capability during the keynote. "The underlying JMP automation engine [will] build those desktops for you," she said. The integrated JMP console is now in preview.
Posted by Jeffrey Schwartz on 09/01/2017 at 1:32 PM
Samsung unveiled its widely anticipated new Galaxy Note8, a smartphone that continues to tip the scales in providing PC power in a phone. It may also test the practical size limit of a phone with its 6.3-inch Quad HD+ AMOLED Infinity Display. The new Note8 will be available Sept. 15 and is among several major smartphone upgrades anticipated over the next month, including a new iPhone and Google's new Pixel 2.
The Galaxy Note is Samsung's premium (and most expensive) smartphone, aimed at appealing to "the ultimate multitaskers and power users," said D.J. Koh, president of Samsung's Mobile Communications Business, at the launch event held in New York City. Besides its large, high-resolution display (2960x1440 at 521 ppi), the Note8's most prominent features include the new S Pen, aimed at broadening the use of drawing, annotating and notetaking; the ability to multitask on a split screen; and two rear 12MP cameras with image stabilization on both lenses. The Note8 is configured with an octa-core 64-bit processor, 6GB of RAM, 64GB of flash storage with a microSD slot that supports up to 256GB of additional external storage, and support for Bluetooth 5 and USB-C.
Given its name, the Note's hallmark feature is the S Pen, designed to let users draw and take handwritten notes on the device. The Note8's new S Pen appears more practical for mainstream use with this release and may even appeal to those who have had little interest in that capability in the past. "I think for the first time the pen is actually usable for more people. There's no excuse not to use the pen -- you can use it with the screen off," said Patrick Moorhead, president and principal analyst with Moor Insights and Strategy.
Improvements to the S Pen include a much finer tip and the ability to respond more naturally due to refined pressure sensors on the display. A new Live Messages feature lets users add animated GIFs to popular messaging apps. The S Pen supports Samsung's "Always On" capability, letting users draw or take handwritten notes within apps as well as on a notepad app that is now easier to use when the phone is locked. The S Pen offers better precision and, based on my brief test of it, works quite intuitively. The offscreen notepad supports up to 100 pages of notes.
The new App Pair feature lets users simultaneously launch and use two apps on the same screen. App Pair will let users link most apps to the device's "Edge" panel, allowing any two to launch simultaneously in the new Multi Window mode. One could look at a document while exchanging e-mails or participating in a videoconference.
Samsung officials gave significant emphasis to the Note8's two rear 12MP cameras that have Optical Image Stabilization (OIS) on both lenses, including the 2x telephoto lens. The camera's dual-capture mode lets users take two pictures simultaneously and keep both images, enabling the bokeh effect or allowing a background to have the same level of detail and focus as the foreground. A dual-pixel sensor on the wide-angle lens aims to compensate for low-light situations, the company said. It has an 8MP camera in front for selfies and conferencing.
With the launch of the Note8, the company is also looking to put to rest the embarrassing and costly hit its reputation took following the release of the device's predecessor, the Galaxy Note7, last year. Samsung had to pull the Note7 from the market shortly after its release when a flaw in the battery caused them to catch fire. Besides the billions of dollars in losses the company incurred, it briefly tarnished the reputation of one of the top brands among consumers and business users. "None of us will ever forget what happened last year," Koh said. "I know I won't. I will never forget how millions of dedicated Note users stayed with us."
Samsung appears to have already recovered from last year's debacle with this spring's release of the Galaxy S8, S8+ and the DeX Station, the dock that allows home and business users to connect the phones to a full-size display, mouse and keyboard, run them on a network and run virtual Windows desktops with Amazon WorkSpaces, Citrix Receiver and VMware desktops, as we reported this month. The Galaxy Note8 will also work with the DeX Station.
Posted by Jeffrey Schwartz on 08/25/2017 at 11:12 AM
Leading up to today's solar eclipse, Intel's client computing team celebrated the launch of its new Intel 8th Gen Core i5 and i7 processors, which will fuel the next crop of PCs coming this fall for both the consumer holiday buying season and commercial and enterprise upgrades. Designed to give a major boost to its entire line of mobile, desktop, commercial and high-performance PC processors, including the release of a quad-core processor in a thin and light notebook form factor, the company used today's historic total eclipse to showcase how it can enable virtual reality experiences.
Gregory Bryant, senior vice president of Intel's Client Computing Group, this morning gave a 15-minute live broadcast with key members of his engineering team, talking up key characteristics of the new processors, which Intel said offer a 40 percent performance boost over the previous 7th Gen Core processors. Intel also claims at least a twofold boost over PCs that are five years old or more. The company also said a system with the new processors can play 4K UHD video locally on a single charge.
Systems with the new processors will also allow users to create a slide show up to 48 percent faster with the 8th Gen processor versus a five-year-old device, and render an edited video that would take 45 minutes on that same machine in just three minutes, Bryant said in a blog post announcing the new processors. Bryant also noted the processors are optimized for Microsoft's new Windows Mixed Reality technology coming in the Windows 10 Fall Creators Update and can use Thunderbolt 3 external graphics, up to 4K, for advanced gaming and virtual reality.
Intel is releasing the new processors over the coming months in stages, starting with its U-Series mobile CPUs in the coming weeks with new PCs from Dell, HP, Lenovo, Asus and Acer. "We've put more performance and capabilities than ever into our mobile 15-watt Core i5 and i7 processors, we've added two powerful and power efficient cores and we've created a quad-core processor that fits into the same tiny package as its dual-core predecessor," said Karen Regis, Intel's mobile marketing manager, speaking during the brief broadcast event. The initial release will be based on Intel's existing 14-nanometer process, but the company is moving to 10-nanometer processors in the future.
Later this fall, Intel will ship its S-Series processors for desktop and all-in-one PCs, followed by the Y-Series processors intended for fanless detachable tablets and finally the H-Series for high-performance laptops and mobile workstations.
Timing the launch with the solar eclipse was indeed a publicity effort. Nevertheless, the company used the backdrop of the eclipse to showcase its capabilities for advanced photography. Bryant said the company plans to release a virtual reality experience of the eclipse created by artist and photogrammetrist Greg Downing, who is capturing it in Jackson Hole, Wyo. One of the challenges Downing said he faced with the eclipse is the very rapid change in light, with only two minutes to take all of the photographs needed. "The advancement of the amount of compute that we can throw at a problem has completely transformed what we've been able to do with photogrammetry," Downing said. "The eclipse is a very rare and wonderful natural phenomenon that we hope to allow the rest of the world to see that missed it. We'll use everything you can give us for processing power."
Posted by Jeffrey Schwartz on 08/21/2017 at 12:12 PM
Microsoft this week launched Azure Event Grid, a new managed service aimed at developers who are increasingly building event-based, responsive applications that require event routing and handling. This aims to fill what Microsoft said is a key missing piece in its serverless computing platform.
Azure Event Grid, available as a technical preview, extends Microsoft's existing serverless offerings including its Azure Functions compute engine and Azure Logic Apps, which provides serverless workflow orchestration. The addition of Azure Event Grid addresses the growth of the responsive applications appearing on Web sites and mobile apps, as well as from data streams generated from sensors, embedded systems and other IoT devices, according to Corey Sanders, director of Azure compute, who announced the new service in a blog post.
The new service is the latest addition to address what Microsoft sees as a growing shift toward serverless computing that is allowing developers to build their applications without having to focus on infrastructure, provisioning or scaling. Sanders said Azure Event Grid is a single service that manages the routing of programmed events from any source to any endpoint and with any application. "Azure Event Grid completes the missing half of serverless applications. It simplifies event routing and event handling with unparalleled flexibility," according to Sanders.
"With Azure Event Grid, you can subscribe to any event that is happening across your Azure resources and react using serverless platforms like Functions or Logic Apps," he added. "In addition to having built-in publishing support for events with services like Blob Storage and Resource Groups, Event Grid provides flexibility and allows you to create your own custom events to publish directly to the service."
It also supports various Azure services with built-in handlers for events such as Functions, Logic Apps and Azure Automation, Sanders noted, adding it also provides flexibility in handling events, supporting custom Web hooks to publish events to any service, including third-party services outside of Azure.
Azure Event Grid allows for direct event filtering and is designed to scale "massively." It also eases the move toward ops automation, providing a common event management interface for operational and security automation, including policy enforcement, enabled by Azure Automation's ability to respond to the creation of virtual machines or changes within the infrastructure.
In the current preview, Azure Event Grid can integrate with five event publishers -- Blob Storage, Resource Groups, Azure Subscriptions, Event Hubs and Custom Topics -- and four event handlers: Azure Functions, Logic Apps, Azure Automation and WebHooks.
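As an illustration of the Custom Topics publisher, here is a hedged sketch of posting a custom event to a topic endpoint over HTTPS using only the JDK. The endpoint URL is a hypothetical placeholder; the event fields follow the schema Microsoft documents for Event Grid events (id, eventType, subject, eventTime, data, dataVersion), and the aeg-sas-key header carries the topic's access key:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.time.OffsetDateTime;
import java.util.UUID;

public class EventGridPublishSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical custom-topic endpoint; Event Grid assigns one per topic.
        URL endpoint = new URL("https://mytopic.westus2-1.eventgrid.azure.net/api/events");
        String topicKey = System.getenv("EVENTGRID_TOPIC_KEY");

        // Event Grid expects a JSON array of events with these fields.
        String body = String.format(
            "[{\"id\":\"%s\",\"eventType\":\"contoso/orders/created\"," +
            "\"subject\":\"orders/12345\",\"eventTime\":\"%s\"," +
            "\"data\":{\"orderTotal\":42.50},\"dataVersion\":\"1.0\"}]",
            UUID.randomUUID(), OffsetDateTime.now());

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setRequestProperty("aeg-sas-key", topicKey); // topic access key
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Event Grid responded with HTTP " + conn.getResponseCode());
    }
}
```

Any subscription on the topic, whether a Function, a Logic App or a custom WebHook, would then receive the event.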
Additional event sources that Microsoft plans to add include Azure Active Directory, API Management, IoT Hub, Service Bus, Azure Data Lake Store, Azure Cosmos DB, Azure Data Factory and Storage Queues. Pricing for Azure Event Grid is usage-based and is detailed on Microsoft's pricing page. During the preview Microsoft is offering the first 100,000 operations at no charge and the cost thereafter is $0.30 per million operations.
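As a worked example of that pricing (my arithmetic, not Microsoft's), an application publishing 10 million operations in a month during the preview would pay roughly $(10{,}000{,}000 - 100{,}000)\,/\,1{,}000{,}000 \times \$0.30 \approx \$2.97$.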
Posted by Jeffrey Schwartz on 08/16/2017 at 11:20 AM
Microsoft has acquired Cycle Computing, a leading provider of orchestration software designed to deploy and manage large workloads in the three major public clouds, as well as in on-premises and hybrid datacenter infrastructures.
Cycle Computing was ahead of its time 12 years ago with its orchestration software designed to deploy and manage virtual clusters and storage to enable HPC workloads such as machine learning, genomic research and complex simulations. The company's orchestration software can now run those workloads using any combination of the three largest public clouds: Amazon Web Services, Microsoft Azure and Google Cloud. Cycle Computing says its software is used by organizations of all sizes, including some with the largest workloads, such as JPMorgan Chase, Lockheed Martin, Pfizer and Purdue University.
The orchestration platform, called CycleCloud, includes a workflow engine and load balancer that utilize cloud resources to enable computation at any scale, while using encryption to ensure data remains secure. Cycle Computing is on pace to manage 1 billion core-hours this year and is growing at a 2.7x annual pace, according to the company's CEO, Jason Stowe, in an announcement about the Microsoft acquisition posted today.
Cycle Computing's customers spend $50 million to $100 million on cloud services, according to Stowe, though he emphasized its software also supports on-premises and hybrid workloads. "Our products have helped customers fight cancer and other diseases, design faster rockets, build better hard drives, create better solar panels and manage risk for peoples' retirements," Stowe noted.
Microsoft Corporate VP Jason Zander, who announced the acquisition today, said Cycle Computing brings its tools, capabilities and experience supporting some of the largest supercomputing scenarios to Azure. Zander also believes Cycle Computing will play a role in picking up the pace in mainstream cloud migration.
"Cycle Computing will help customers accelerate their movement to the cloud and make it easy to take advantage of the most performant and compliant infrastructure available in the public cloud today," according to Zander.
Asked whether CycleCloud will continue to support AWS and Google Cloud, a spokeswoman for the company said it will continue to support existing clients whose implementations include the two competitors, but "future Microsoft versions released will be Azure focused. We are committed to providing customers a seamless migration experience to Azure if and when they choose to migrate."
Interestingly, Amazon Web Services showcased Cycle Computing at its first AWS Summit event in New York four years ago. In his blog post announcing the acquisition, Stowe noted that the company was self-funded, launched with just $8,000 put on a credit card, and took in only some funding from angel investors.
In an interview during the 2013 AWS Summit, Stowe described an HPC job that used 10,600 server instances in Amazon's EC2 to perform a job for a major pharmaceutical firm. To run that simulation in-house would require the equivalent of a 14,400 square-foot datacenter which, based on calculations from market researcher IDC, would cost $44 million, he said at the time. "Essentially, we created an HPC cluster in two hours and ran 40 years of computing in approximately 11 hours," Stowe explained. "The total cost for the usage from Amazon was just $4,472."
At the time of our discussion, when I asked if he was looking at other cloud services such as Microsoft Azure, Google Cloud or others, he responded that only AWS was suited to such jobs. "Who knows what the future holds but from our perspective AWS has a tremendous advantage in its technology," he said back then. Clearly, Stowe didn't envision a future in which Cycle Computing would become part of Microsoft, but that future has arrived.
Posted by Jeffrey Schwartz on 08/15/2017 at 7:46 AM
In a rebuke of Microsoft's Surface Pros, Surface Books and new Surface Laptops, Consumer Reports magazine last week removed its "recommended" designation from the PCs and tablets. The nonprofit, subscriber-funded publication rescinded its stamp-of-approval after a survey of its readership led Consumer Reports to forecast that 25 percent of Microsoft's Surface-based systems will "present owners with problems" within two years of owning them.
The Consumer Reports National Research Center surveyed subscribers who bought new tablets and laptops between 2014 and the beginning of 2017. Complaints ranged from systems freezing or shutting down unexpectedly to unresponsive touchscreens.
Consumer Reports, which purchases and tests products in its labs on various metrics including display quality, battery life and ergonomics, had previously rated several of Microsoft's Surface systems either "Very Good" or "Excellent," including the newest Surface Pro, released in June. However, the publication made the change because many customers care equally about reliability, which its survey now brings into question.
"While we respect Consumer Reports, we disagree with their findings," wrote Microsoft's Corporate VP for devices Panos Panay, in a blog post challenging the 25 percent failure rate. "In the Surface team we track quality constantly, using metrics that include failure and return rates -- both our predicted 1-2-year failure and actual return rates for Surface Pro 4 and Surface Book are significantly lower than 25 percent. Additionally, we track other indicators of quality such as incidents per unit (IPU), which have improved from generation to generation and are now at record lows of well below 1 percent"
Panay also noted that a survey conducted by researcher Ipsos, commissioned by Microsoft, found a 98 percent satisfaction rate among Surface Pro 4 customers. Interestingly, the Consumer Reports advisory made no note of the widely publicized battery issues faced by Surface Pro 3 users. My Surface Pro 3 barely gets three hours these days, despite all of the firmware patches Microsoft has released.
Among those commenting on the Consumer Reports advisory, a good number had various complaints, while others offered more positive assessments of their experience with the hardware. Personally, I turn to Consumer Reports when making a major purchase such as a car or home appliance (though not for computer hardware or software). Having bought a new Surface Laptop of my own a few weeks ago, I'm not planning on returning it based on the Consumer Reports advisory, for a number of reasons.
First, based on their tests of the Surface hardware, Consumer Reports testers were impressed with the various systems. It's hard to conclude that 25 percent of all machines will develop problems within two years, though any system can experience issues of varying degrees. But perhaps most important, the survey period ended earlier this year -- well before Microsoft released its latest systems -- meaning it has no bearing on the current hardware.
Panay said in his post that Microsoft has learned a lot since rolling out its first Surface devices five years ago. Analyst Patrick Moorhead of Moor Insights & Strategy, who has owned every Surface model Microsoft has released, agreed. In an e-mail, Moorhead noted the various software issues Microsoft experienced with its Skylake-based SKUs almost two years ago. "These issues have been resolved and led to Microsoft's conservativeness with their latest crop of products," Moorhead noted. "Notice on Surface Laptop and Surface 5 that Microsoft did not embrace Kaby Lake, USB-C or Thunderbolt 3, as this conservatism should lead to a very high-quality experience."
So far, I'm happy with the Surface Laptop and hope that I'll feel the same way in a few years. I'll follow up with my impressions of the device after a few more weeks of using it.
Posted by Jeffrey Schwartz on 08/14/2017 at 10:56 AM
Aaron Margosis (@AaronMargosis), a self-described "Windows nerd" and an 18-year Microsoft veteran who is now a principal consultant for the company's Global Cybersecurity Practice, is one of the leading experts when it comes to Sysinternals, a set of free Windows management utility tools. In addition to working with key customers on various security issues, Margosis focuses on his core expertise of Windows security, least privilege, app compatibility and configuring locked-down environments, according to his bio. He has also collaborated with Microsoft's Mark Russinovich on the recently updated book Troubleshooting with the Windows Sysinternals Tools, 2nd Edition (Microsoft Press).
During Margosis' presentation at this week's annual TechMentor conference in Redmond, Wash., titled "The Sysinternals Tools Can Make You Better at Your Job," he gave a deep dive into several of the more popular tools, including Autoruns, Process Monitor and Desktops. "These are valuable, lightweight tools that aren't available anywhere else," Margosis said.
Autoruns for Troubleshooting
During his presentation, Margosis shared how to use Autoruns to check out startup entries. By default, it hides anything that is part of Windows or comes from Microsoft, to expedite troubleshooting. "The pink items are potentially suspicious," he explained. It can also help verify code signatures, to ensure entries are digitally signed by their vendor, and submit questionable entries to VirusTotal, a Web application service that hosts more than 60 antivirus engines. "You can submit files and ask it what it thinks of this program," he said. "You can also click on links and get details."
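The same kind of lookup Autoruns performs can be scripted. Below is a minimal sketch that checks a file's hash against VirusTotal's public v2 REST API; the API key is a placeholder, and free-tier rate limits apply.

```python
# A minimal sketch of querying VirusTotal's public v2 API for a file report.
# API_KEY is a hypothetical placeholder.
import hashlib

import requests

API_KEY = "<your-virustotal-api-key>"

def sha256_of(path):
    """Hash the suspect file the same way tools identify entries."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

report = requests.get(
    "https://www.virustotal.com/vtapi/v2/file/report",
    params={"apikey": API_KEY, "resource": sha256_of(r"C:\suspect\startup.exe")},
).json()

# positives/total mirrors the "3/64"-style ratio Autoruns displays.
if report.get("response_code") == 1:
    print(f"{report['positives']}/{report['total']} engines flagged this file")
```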
To find the entries that are problematic, use the Hide VirusTotal Clean Entries option, which leaves you with the unknowns or the entries that were flagged by one or more antivirus engines. "So, what do you do when you find something in there you don't want on your computer?" he asked, segueing into another demo. "Right-click on the entry to go to where that item is configured, or right-click and go to the image," he said. "This goes to the actual file in Windows Explorer. Then you can right-click to delete the entry completely. When you delete it, there's no way to get it back. That might cause damage you can't undo."
The safer approach is to simply uncheck the questionable entry. "That will disable the thing from running, but don't delete it." Margosis then used Process Monitor to help with a problem he had recently experienced in Office, where his theme setting kept reverting. "The next thing I want to show you is something I was just working on last week. The 'colorful' theme is the new default for Office. It kept switching back to 'white.' The tool I decided to use for this is Process Monitor."
Process Monitor
Process Monitor is the tool for tracking all system activity; it captures registry and file system events in real time. "This is the tool I want to use to determine what is setting the colorful theme back in place," he says. "By default, all that data is stored in the virtual memory of Process Monitor. So, there are a few different ways to run a trace for a long time."
One is backing files: instead of writing to virtual memory, write to a file and keep appending to it. Another is history depth, which stores only a certain number of events, then drops off the older ones. Drop filtered events is the third. "And that's what I used," says Margosis. "I set the filter to look only for specific actions. I'm going to look for things happening in Excel and writing to the registry."
Margosis said he ran his trace for six hours, captured 21 events and did not bog down the system at all. "I was able to nail down exactly what happened," he says. "It was actually [a bug within] Office itself."
Desktops
Margosis wrapped up his presentation with a demo of the Sysinternals Desktops tool. Within a window station, you can run one or more desktops; windows run on each desktop and can send messages back and forth to communicate. You can create up to four desktops and use hotkeys to switch between them -- the theory being you could have work on three of them and soccer on the fourth. "The Desktops tool takes care of all that," he says. "And it will not lose track of which should be hidden or shown."
The Sysinternals tools are critical for maintaining the security of Windows. For those looking to go further, MVP Sami Laiho will give an all-day workshop on how to secure Windows workstations, servers and domains at the next TechMentor conference, which takes place as a track in the annual Live! 360 confab Nov. 13-17 in Orlando.
Posted by Lafe Low on 08/11/2017 at 8:00 AM
Like it or not, IT pros must now cater to the whims and tastes of the millennial workforce to keep them productive and satisfied with their jobs. Accommodating employees of this digital generation, who expect more in terms of capability and less in terms of restrictions, runs counter to the way IT organizations have traditionally operated and requires a whole new mindset.
These sweeping changes mean IT must give employees the leeway they need to remain productive while maintaining security and control of enterprise information. IT needs to apply enough control to ensure productivity and security, but not so much as to scare away this new generation, according to Michael Niehaus, director of product marketing for Microsoft's Windows Commercial group, speaking earlier this week in the opening keynote address at the TechMentor conference, taking place at Microsoft's headquarters in Redmond, Wash. (TechMentor is produced by Redmond magazine parent company 1105 Media.)
The number of millennials in the workforce is quickly outpacing the combination of Gen Xers and the Baby Boomer generation, and is expected to account for more than half by 2020, according to various forecasts. According to surveys, they're more likely than previous generations to leave an organization if they're not satisfied. This has created a contrast between classic and modern IT, as Niehaus calls it, but it's not as much a question of one model against the other. "I really see this as embracing both at the same time; using both of them when appropriate," Niehaus told attendees. "There is a place for multiple devices and a place for proactive versus reactive support. We need to take both into account."
Niehaus said migrating to Windows 10, which is on the rise this year, is the best approach to supporting both models. During his one-hour keynote titled "Modernizing Windows 10 Deployment and Servicing," Niehaus outlined the new cadence and steps IT professionals should take to keep their Windows shops up to date. Moving to Windows 10 not only means deploying a new operating system; it also introduces a shift to more rapid updates and servicing. Niehaus recognizes that this in itself can be quite disruptive to the status quo.
"While technology clearly changes on a rapid regular basis, when it comes to profound changes, there haven't been as many lately," he said. "It feels like we've been stagnant for last 15 years or so. The pace of change hasn't really affected IT as much. The PC environment probably looks same today as it did 10 or 15 years ago." Such stagnation is no longer going to be the norm, he said.
Employees of this digital generation expect more in terms of capability and less in terms of IT exerting control over them. "They're used to having all sorts of devices. They don't have to be taught," he said. "They don't understand why IT is so heavy handed in control of PCs. They expect something different."
Pointing to the need to strike that proper balance of giving employees more flexibility while maintaining an appropriate level of control, Niehaus showcased the recently announced Microsoft 365 service, which includes Windows 10 Enterprise, Office 365 ProPlus and the Enterprise Mobility + Security (EMS) service. EMS includes Azure Active Directory for authentication, Intune device management and data loss protection features. The new Microsoft 365 bundle, still in preview, aims to help organizations move to the cloud, Niehaus said. "There are lots of cloud services feeding into this system, with Windows being updated more frequently -- updates twice a year instead of twice per decade," he said.
Windows updates essentially become a cloud-based service. The new shift to Windows as a Service means users will get the latest features on their devices as they become available. "The goal is to make this as simple as possible," says Niehaus. "Windows as a Service is an iterative process. We want to be able to move to a servicing approach. We also want an easy path to Windows 10."
Niehaus acknowledges there are scenarios where traditional deployment with system images will still be required, but the company hopes to minimize that, especially as Windows servicing progresses. "Our hope is you never need to do that again," he said. "We'll keep you up to date using Windows as a Service." A day later, on Aug. 9, Niehaus published a blog post following up on his presentation, explaining all of the Windows-as-a-Service changes. The post referred to documentation posted in April and a five-minute Microsoft Mechanics video posted last month.
Microsoft intends to have the Microsoft 365 service arrive ready to use. "Ideally we take this further and ship to employee. Then, straight out of box, when they get it, with just a few simple steps, they can set it up without IT touching it."
Windows Autopilot, which Microsoft announced in late June, is also a key piece of that strategy. "We can do this from the cloud completely," Niehaus said. "The idea with the cloud-driven approach is you can take a device and register it in the cloud so when that device is turned on, it can, more or less, configure itself. That is the golden path. We want to eliminate all touching and imaging of the devices. We want to make it easy for users to do it themselves."
The basic steps for using Windows Autopilot, Niehaus said, are to register the device to indicate it belongs to an organization, assign a profile that includes system settings, and then ship the device to the user. "They turn it on and off they go," he says. "That's when the magic happens."
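To give a sense of what the registration step involves, here is a minimal sketch that assembles the device-registration CSV Autopilot imports. The three-column layout reflects the import format Microsoft described at the time, but treat it as an assumption; the device values are placeholders.

```python
# A minimal sketch of building an Autopilot device-registration CSV.
# The column layout is an assumption based on the documented import format;
# the serial number, product ID and hardware hash below are placeholders.
import csv

devices = [
    ("5F1YQ72", "00330-80000-00000-AA123", "T0FuIGV4YW1wbGUgaGFyZHdhcmUgaGFzaA=="),
]

with open("autopilot-devices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Device Serial Number", "Windows Product ID", "Hardware Hash"])
    writer.writerows(devices)  # one row per device to register
```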
After a detailed demonstration of the configuration process for the Microsoft 365 service, Niehaus shifted back to discussing the new Windows delivery and deployment cadence. "Once we have Windows 10, we never want you to go through an upgrade project again. We want you to stay current on that." To maintain that cadence, there will be new Windows 10 releases twice a year, in March and September. "Let's just call them semi-annual channel releases."
On the System Center Configuration Manager side, Microsoft will issue three release updates: in March, July and November. "The first and last roughly correlate to Win 10 updates," he said. "They need that intermediate middle of the year release for Configuration Manager."
And that is the primary message from Microsoft: to prepare for the new update cadence. "We used to talk about every three years. Now it's every six months to get latest productivity and security features," Niehaus said. "We need to stay ahead of the bad guys. We will always have new security capabilities. It's a constantly moving target. And [improved security] is the number one reason we've seen organizations move to Windows 10."
A number of sessions at the next TechMentor, scheduled for Nov. 13-17 as part of the Live! 360 conference in Orlando, Fla., will address Windows 10 migration.
Posted by Lafe Low on 08/10/2017 at 8:31 AM
Almost every IT security professional is concerned that the latest advanced persistent threats (APTs) have made them potential targets of sophisticated cyberespionage campaigns. A survey of IT security leaders in the U.S. and several European countries conducted by security software provider Bitdefender found that 96 percent are concerned about APTs, while 61 percent worry about becoming victims of targeted corporate or industrial espionage.
The survey of 1,051 IT security decision makers, conducted in April and May of this year, also found that 58 percent believe they could be targeted by cyberespionage campaigns using APTs, with 36 percent acknowledging that they are at risk of sophisticated cyberespionage attacks aimed at exfiltrating critical information.
Office 365 accounts are at particular risk because they provide access to e-mail and files stored in OneDrive. Cloud access security broker (CASB) Skyhigh Networks last month revealed a campaign specifically targeting its large enterprise customers' Office 365 accounts.
Skyhigh reported it detected 100,000 failed login attempts originating from 67 IP addresses and 12 networks throughout the world. The campaign targeted 48 of its customers' Office 365 accounts, according to Sandeep Chandana, senior VP of engineering at Skyhigh Networks. Chandana revealed the brute force attack in a blog post on July 20, noting the attack didn't cast a wide net, but rather was targeted at high-level executives.
"The attack was really sophisticated," Chandana said in an interview this week. "It worked really slow, under the radar. Typical systems didn't detect it because it was timed in such a way to evade typical solutions." Based on the intelligence Skyhigh gathered, the attackers appeared to have passwords of high-level executives, many of them C-level, Chandana said, but not their login IDs. "They were trying to use different variations of user names with the same passwords," he said.
Chandana said Skyhigh alerted the ISPs and Microsoft of the incident, and the attempted logins have since tapered off. No one was breached that the company is aware of, he said, noting these were all Fortune-250 companies that use two-factor authentication.
IT security pros believe competitors (61 percent) are the number one culprits behind these campaigns, according to the Bitdefender survey, followed by hacktivists (56 percent), foreign state-sponsored actors (48 percent) and national government agencies (41 percent). "Most advanced persistent threats are not limited to state-sponsored attacks, as enterprises can also fall victim to attackers that exploit zero-day vulnerabilities to install highly targeted malware to spy on companies and steal intellectual property," according to the report's executive summary. Only 32 percent believe that insiders are likely attackers when it comes to APTs.
Posted by Jeffrey Schwartz on 08/04/2017 at 1:37 PM
While a growing number of PCs now support Windows Hello authentication, many still can't take advantage of the feature, which aims to let users access their systems without a password. The new Microsoft Modern Keyboard with Fingerprint ID, launched last week, brings Windows Hello authentication to any PC running Windows 10.
Biometric authentication was among the many key new features introduced with Windows 10, released two years ago this week. The number of new systems that support Windows Hello is on the rise, but many still lack the necessary hardware -- especially lower-end PCs and older ones upgraded to Windows 10. Microsoft wants users to create unique passwords for their various accounts and believes Windows Hello will make that practice more manageable.
"Studies show more than 80 percent of people use the same password across multiple Web sites, managing around 20-30 different accounts," noted Jennifer Thompson, product marketing manager for accessories, in a blog post. "We want to make sure that everyone running Windows 10 can experience the beautiful relief that comes from letting go of your written Pa55w0Rd5! So, we worked to deliver a predictable, intent-driven and simple solution for someone to quickly and securely log into their PC, or authenticate an action."
Microsoft earlier this year pointed to a number of peripherals that provide Windows Hello authentication from Acer, Asus, Fitbit, HID, Huawei, Nymi, Plantronics, Vanconsys and Yubico. RSA Secure Access Authenticator also provides Windows Hello authentication. Nevertheless, while Windows Hello supports biometric authentication to PCs, which reduces the likelihood of unauthorized access to an unattended PC, the number of apps that support Windows Hello integration is paltry. Besides Microsoft OneDrive, the company now points to only 11 third-party tools that support Windows Hello, such as Dropbox, Citrix ShareFile, OneLocker Password File and Cloud Drive.
The new keyboard can link to PCs via Bluetooth or a hardwire connection and costs $130, with a matching mouse priced at $50.
Posted by Jeffrey Schwartz on 07/31/2017 at 11:40 AM
Microsoft's new Azure Container Instances (ACI) introduces a way to spin up workloads with the precise amount of CPU cores and memory needed on demand, use them for as little as a few seconds, and instantly remove them, paying just for what was used. And it does so without the need to provision and manage Hyper-V or any other virtual machines.
ACI, revealed yesterday, is what Microsoft believes is the fastest and easiest way yet to deploy and manage container-based workloads in the cloud, pushing the infrastructure up a layer. Corey Sanders, director of Azure Compute at Microsoft, announced the new service during a webcast, where he demonstrated the ability to spin up and take down container-based workloads.
Microsoft's existing Azure Container Service, released over a year ago, introduced the ability to deploy containers in its public cloud service. Using a choice of orchestration tools such as Mesos DC/OS or Docker Swarm, the current Azure Container Service deploys containers as a set of virtual machines. Administrators must manage both the VMs and containers, whereas the new ACI is able to provide that on-demand provisioning and deprovisioning capability without having to use virtual machines to spin up the containers.
"That is what makes Azure Container Instances unique," Sanders said. "It directly deploys the containers instead of deploying and requiring customers to manage the virtual machines underneath. This pushes the infrastructure up a layer, enabling customers to just work with containers, and no longer have to worry about container deletion, creation, patching and scaling of virtual machines. That's all taken care of by just exposing the container as the top-level resources."
Does this new capability portend the demise of the virtual machine? Given that Hyper-V is a key component upon which Azure itself is built, probably not anytime soon. Even with ACI, Sanders noted that Docker Hub and the Azure Container Registry will be supported and that Hyper-V virtualization in Azure enables the deployment and secure isolation of containers. It's still early days for containers.
"Containers are certainly emerging as a platform for newer workloads and for enabling some of the points around agility and scale," he said. "[But] we expect that many workloads will require the isolation characteristics and scale characteristics that come with full virtual machines. We do expect those will continue to be used, and deployed by customers, both in the public cloud and on premises, whether it be on Hyper-V or other hypervisors."
Posted by Jeffrey Schwartz on 07/27/2017 at 2:20 PM
Looking to simplify the deployment of containers, Microsoft is ramping up a new service in Azure that will allow customers to deploy them on the fly. The company today announced its new Azure Container Instances (ACI) and a Kubernetes-based connector that will enable orchestration.
Microsoft is offering the new ACI service in technical preview, initially for Linux containers, with support for Windows planned in the coming weeks. The preview initially will be available in the East U.S., West U.S. and West Europe Azure regions. Alongside the preview, the company has also released its new ACI Connector for Kubernetes, an open source tool that lets Kubernetes clusters orchestrate containers running in ACI.
Corey Sanders, Microsoft's director of Azure Compute, described ACI as variably sized single containers that deploy on the fly without requiring virtual machines, with usage to be billed by the second. "It offers the fastest and easiest way to run a container in the cloud," Sanders said during a webcast announcing ACI and the Kubernetes connector. Sanders noted that for typical container deployments, administrators must create a virtual machine to host that container.
"That amount of time to get started and that amount of work to deploy a container has now gone away with Azure Container Instances," Sanders said. "It offers a much faster way to launch those containers by directly launching the container itself instead of first creating the virtual machine."
Sanders elaborated on the announcement in a blog post. "As the service directly exposes containers, there is no VM management you need to think about or higher-level cluster orchestration concepts to learn," he noted. "It is simply your code, in a container, running in the cloud."
In addition to simplifying the deployment and management of container-based workloads by taking the VM out of the picture, Sanders emphasized that the appeal of containers is their ability to scale and shut down on the fly in an agile and cost-effective manner. In addition to the per-second billing, customers can pay by the gigabyte and CPU, allowing them to choose the capacity needed for the containers.
"This lets customers and developers make sure the platform is perfectly fitting their workload, not paying for a second more than required and not paying for a gigabyte more of memory than necessary," he said during today's call. "Customers can designate the amount of memory separate from the exact count of vCPUs, so your application perfectly fits on the infrastructure," he added in his blog post. "With ACI, containers are a first-class object of the Azure platform, offering Role-Based Access Control (RBAC) on the instance and billing tags to track usage at the individual container level."
In addition to using the ACI Connector for Kubernetes, customers can deploy ACI instances from a Docker Hub repository -- which supports a large set of commercial, custom or internally built containers -- from the Azure Container Registry, or from private registries.
Along with today's new service, Microsoft said it has joined the Cloud Native Computing Foundation (CNCF) as a Platinum member. CNCF, which is a project sponsored by the Linux Foundation (Microsoft joined last year), governs various container projects including Kubernetes, Prometheus, OpenTracing, Fluentd, Linkerd, containerd, Helm and gRPC, among others. Gabe Monroy, lead program manager for Microsoft Azure containers, is joining the CNCF board.
Posted by Jeffrey Schwartz on 07/26/2017 at 1:51 PM
When Microsoft last week posted an update on features that will be removed or deprecated with the release of the Windows 10 Fall Creators Update, the flagship Paint illustration and image editing tool was among those on the list, which also included PowerShell 2.0 support, IIS 6 management tools, Outlook Express and the Reader app. Obituaries and tweets lamented the death of MS Paint, which was among the first free add-on tools offered in Windows.
Microsoft has deprecated Paint in favor of the new Paint 3D tool, which debuted in April with the Windows 10 Creators Update and has taken Paint's place on the Windows Start menu. While Paint 3D is now pinned to the Start menu, the original Paint is still included among the apps bundled with the OS; with the fall update, only Paint 3D will be included.
Reports of Paint's demise went viral yesterday, apparently much to Microsoft's surprise. The company issued a brief statement saying that Paint isn't disappearing, reiterating that it'll remain available in the Windows Store as a free download.
"MS Paint is here to stay. It will just have a new home soon in the Windows Store where it will be available for free," according to the statement by Megan Saunders, Microsoft's general manager for 3D for Everyone, Windows Experiences. "If there's anything we learned, it's that after 32 years, MS Paint has a lot of fans. It's been amazing to see so much love for our trusty old app."
Nevertheless, that doesn't mean Paint will remain in the Windows Store forever. Tools that Microsoft classifies as "deprecated" could be removed in future releases, according to the company.
Frankly, it's not that big a deal. Paint 3D, though it has a much different interface, has the same basic functionality Paint offers, while giving users more illustration and photo editing options, including the ability to render images in 3D.
In fact, when Microsoft first revealed and demonstrated Paint 3D back in October of last year at the announcement of the Windows 10 Creators Update and Surface Studio, the company said at the time that the new tool was a replacement for Paint due to its ability to create richer graphics and images (see Michael Desmond's assessment of Paint 3D and some of the other features in the Windows 10 Fall Creators Update Technical Preview).
By the time MS Paint finally does disappear from the Windows Store, in all likelihood few will notice or care other than those who are sentimental. And those traditionalists will likely once again mourn its final demise, though in reality MS Paint just has a new name and added features.
Posted by Jeffrey Schwartz on 07/25/2017 at 1:08 PM
Growth of Azure once again was the center of attention following Microsoft's latest quarterly earnings report last week. But the company also gave a strong nod to continued expansion of its Office 365 business, buoyed by the news that Office 365 revenues for the first time surpassed sales of licenses of the traditional suite. Another important stat worth noting was that SharePoint Online usage nearly doubled last quarter over the same period last year.
CEO Satya Nadella shared that data point during the company's investor conference call last week to discuss its results for Q4 FY 2017. Although Nadella alluded to SharePoint Online's growth in passing, and without further elaboration, it gives some indication that the company's goal to move more SharePoint on-premises users to Office 365 is accelerating. To what extent the doubling of usage is meaningful depends on how many organizations are now using it, which the company hasn't disclosed. Various reports have suggested SharePoint Online usage still represents a small percentage of the overall SharePoint user base that either continues to run SharePoint Server on-premises or through hosting or cloud services.
In his prepared remarks, Nadella spoke of Office 365's expansion both in terms of subscriber growth and the depth of premium services used by commercial customers. "Customers are moving beyond core workloads to adopt higher value workloads," Nadella said. "For example, we've seen a significant increase in SharePoint usage, which nearly doubled year-over-year." Does this signify a turning point for SharePoint Online?
During the Q&A portion of the call, financial analyst Raimo Lenschow of Barclays asked about its competitive edge in the context of the Office 365 Premium packages such as E5. "There's a lot more in the Office 365 adoption cycle beyond Exchange and e-mail," Nadella responded. Indeed, the impetus for many organizations to move to Office 365 is to get out of the business of deploying and managing Exchange Servers on premises, a common practice for more than two decades.
But moving to SharePoint Online hasn't been a priority for many organizations, especially those that have built custom server-side applications that aren't portable to Office 365. The release of the SharePoint Framework and recent extensions, along with new tools such as Microsoft Flow, Power Apps and Microsoft Forms are helping to ease that transition. Growth of the new Microsoft Teams also could portend increased usage of SharePoint Online.
But as research conducted earlier this year by graduate students at the Marriott School of Management at Brigham Young University, spearheaded by Christian Buckley's CollabTalk, confirmed, more than half of SharePoint administrators, developers and decision makers surveyed have cloud-based implementations of SharePoint but don't have plans to transition entirely to the online version. (Redmond magazine was a media sponsor of the survey, available for download here.)
Microsoft corporate VP Jeff Teper, who leads the company's SharePoint, OneDrive and Office teams, welcomed Nadella's mention in a tweet, saying: "One of first highlights from Satya in today's earning's call – near 2x YoY growth in #SharePoint Online usage."
Indeed, that's a sign of growth. While Office's commercial revenue saw a $277 million, or 5 percent, year-over-year increase for the quarter, Microsoft said the growth was driven by a 43 percent increase in Office 365 commercial revenues. The overall 5 percent growth in commercial Office revenues, which totaled $5.5 billion, was offset by the shift to Office 365 from traditional licenses, according to Microsoft. But to what extent SharePoint Online has contributed to that growth and, in its own right, is accelerating is hard to determine. While Microsoft said that there are now 27 million consumer Office 365 subscribers, the company didn't share the number of commercial Office 365 subscriptions. Nadella had said back in April, during the company's Q3 FY2017 earnings call, that there are more than 100 million commercial Office 365 monthly active users.
As the aforementioned survey revealed, 32 percent of organizations of all sizes reported that they have hybrid SharePoint deployments, while nearly half, or 49 percent, have hybrid server licenses, 35 percent have on-premises licenses and 17 percent use SharePoint Online. It will be telling what future surveys show with regard to SharePoint Online usage.
It's clear, and certainly not a surprise, that SharePoint Online is growing. The fact that usage has doubled year-over-year validates that point. But it's difficult to draw any solid conclusions as to whether the near-doubling of SharePoint Online usage last quarter points to acceleration or just incremental growth.
Is your organization moving more of its SharePoint workloads to Office 365?
Posted by Jeffrey Schwartz on 07/24/2017 at 6:32 AM
Office 365 revenues surpassed revenues from traditional perpetual Office license sales during the last quarter, marking the first time that has happened. Microsoft revealed the new milestone yesterday when it reported its quarterly earnings for the period ended June 30.
While Microsoft said there are now 27 million consumer subscriptions, the company hasn't disclosed the number of commercial and enterprise Office 365 subscribers. The company did say Office 365 commercial revenue for Q4 of its fiscal year 2017 rose 43 percent over the same period last year. Microsoft pointed to a number of large new subscribers, including Nissan, Quicken Loans, Key Bank and Deutsche Telekom, as part of Office 365's success.
Office 365 is becoming more profitable as well. Microsoft said gross margin for its commercial cloud services portfolio, which, in addition to Office 365, includes Azure and Dynamics 365, was 52 percent for the period -- a 10-point increase.
Microsoft CFO Amy Hood indicated during the quarterly earnings call that growth was occurring across different license types, and she cited momentum for its E5 premium subscriptions, which the company believes will expand with the coming of Microsoft 365, the bundle of Office 365, Windows 10 and the Enterprise Mobility + Security (EMS) service.
"Office 365 commercial revenue growth will continue to be driven by install base growth, ARPU [average rate per user] expansion, and adoption of premium services like E5 and should outpace the rate of transactional decline," Hood said. She added that LinkedIn revenue of $1.1 billion for the quarter, which is also part of Microsoft's Productivity and Business Process segment, contributed 15 points of growth.
Posted by Jeffrey Schwartz on 07/21/2017 at 12:32 PM
Microsoft has created a new software-defined datacenter certification program for storage and hyper-converged systems running Windows Server 2016, which shares many of the same properties as Azure, such as Storage Spaces Direct and software-defined networking (SDN). The new program is targeted at customers interested in the certified appliances coming in September that will allow enterprises and service providers to run some of Microsoft's public cloud technologies in their own datacenters.
The Windows Server Software-Defined (WSSD) program lets partners deliver infrastructure capable of building hyperscale datacenters based on Microsoft's software-defined datacenter guidelines and specifications. Systems certified by Microsoft under WSSD undergo the company's Software-Defined Datacenter Additional Qualifications (SDDC AQ) testing process and "must firmly follow Microsoft's deployment and operation policies to be considered part of the WSSD program," as explained by QCT, one of the first six partners certified. The other partners include DataON, Fujitsu, HPE, Lenovo and Supermicro.
"Deployments use prescriptive, automated tooling that cuts deployment time from days or weeks to mere hours," according to the WSSD announcement on Microsoft's Hyper Cloud blog. "You'll be up and running by the time the WSSD partner leaves your site, with a single point of contact for support." The partners are offering three types of systems:
- Hyper-Converged Infrastructure (HCI) Standard: "Highly virtualized" compute and storage combined in a single server-node cluster, which Microsoft says makes them easier to deploy, manage and scale.
- Hyper-Converged Infrastructure (HCI) Premium: Hardware that Microsoft describes as "software-defined datacenter in a box" that adds SDN and Security Assurance to HCI Standard, which the company claims simplifies the ability to scale compute, storage and networking as needed, similar to public cloud usage models.
- Software-Defined Storage (SDS): Enterprise-grade shared storage that's built on server-node clusters, designed to replace traditional external storage devices with support for all-flash NVMe drives, which customers can scale based on demand.
The specific WSSD-certified offerings announced by Microsoft include:
- DataON's S2D-3000 with the company's DataON MUST deployment and management tool, which provides SAN-type storage monitoring.
- Fujitsu's Primeflex for Storage Spaces Direct.
- Lenovo's Cloud Validated Design for Microsoft Storage Spaces Direct, optimized to provide cloud-based storage to Microsoft Hyper-V and Microsoft SQL database workloads.
- QCT's QxStack WS2016, MSW2000 and MSW6000 hyper-converged appliances (its MSW8000 is still under review).
- Supermicro SYS-1028U-S2D HCI appliance, which it describes as a "highly dense" all-NVMe system for cloud-scale software-defined data centers.
While some obvious large providers of software-defined and engineered hyper-converged systems weren't on the initial list, notably Cisco Systems and Dell EMC, Microsoft said it expects to add more partners over time.
Posted by Jeffrey Schwartz on 07/19/2017 at 1:24 PM
More than 500 people attended a panel discussion Wednesday to hear Microsoft officials explain the company's plans to become a leading provider of enterprise blockchain software and services. Microsoft is among a rising number of IT infrastructure providers and customers that believe blockchain, the distributed ledger technology that has helped enable Bitcoin and the mounting number of cryptocurrencies like it, is poised to disrupt almost every industry.
Two key leaders at the center of Microsoft's marathon effort to build out robust private blockchain capabilities via its Azure cloud service described the technology's potential to upend the economics of many industries during a session at the company's Inspire partner conference, which took place this week in Washington, D.C. The two officials -- Yorke Rhodes, Microsoft's global strategist for blockchain, and Craig Hajduk, a principal program manager for blockchain engineering -- joined me, Jeremy Epstein and moderator Eric Rabinowitz, CEO of Nurture Marketing, to explain what blockchain is and how they see it bringing about major changes across various industries by removing intermediaries from processes and transactions.
Microsoft's blockchain efforts, code-named "Project Bletchley," started about two years ago. In the last year, the company has made a push to build Azure blockchain as a service as well. The efforts have gained credibility in the banking, capital markets and insurance industries. They are also of interest in sectors such as manufacturing, logistics and healthcare, where record keeping and security are needed for transactions.
"This technology allows us to digitize assets in a way that we've never been able to do so before," Rhodes said during Wednesday's session. "As soon as I digitize a cow, I can do futures. There's tremendous things going on and this wave of interesting technology is taking us places we've never believed that we could get to before." Panelist Jeremy Epstein, CEO of Never Stop Marketing, has focused most of his time these days working with startups using or offering blockchain-based technology such as Storj.io, OpenBazzar and Everledger because "this thing is like a tsunami 50 miles off the coast, and basically no one knows it's coming," Epstein said.
Blockchain is best known as the open source distributed ledger (think of a shared write-once, read-many spreadsheet) that enables bitcoin transactions. Bitcoin, the peer-to-peer cryptocurrency, was considered a breakthrough when it was invented in 2009 because it "was a very novel, creative combination of existing technologies that were out there applied in an interesting way, solving the problem that researchers and computer scientists have struggled to solve up until that point," Hajduk explained in his remarks.
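For readers new to the concept, that "write-once" property comes from chaining hashes. The following minimal sketch -- illustrative only, not any production blockchain, which adds consensus, signatures and distribution -- shows why tampering with an earlier entry is detectable.

```python
# A minimal sketch of a hash-chained ledger: each block embeds the hash of
# its predecessor, so altering any earlier entry breaks every link after it.
import hashlib
import json

def make_block(prev_hash, payload):
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [make_block("0" * 64, "genesis")]
for tx in ("alice pays bob 5", "bob pays carol 2"):
    chain.append(make_block(chain[-1]["hash"], tx))

# Tamper with an early entry, then recompute its hash: it no longer matches
# the link stored in the next block, so the forgery is detectable.
chain[1]["data"] = "alice pays bob 500"
recomputed = hashlib.sha256(json.dumps(
    {"prev": chain[1]["prev"], "data": chain[1]["data"]},
    sort_keys=True).encode()).hexdigest()
print(recomputed == chain[2]["prev"])  # False -- the chain is broken
```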
"Blockchain refers to the technologies that sit behind that cryptocurrency and, in fact, there are over 75 different distributions of blockchain or distributed ledger technologies that are available on the market today," Hajduk continued. The amount is actually much higher, evidenced by the number of initial coin offerings (ICOs), or new cryptocurrency startups launched (it's important to point out that these fall outside traditional regulated markets and venture capital fundraising requirements). "There's a lot that you can go into, but the thing to know is that there's a lot of choices for customers and partners today. There's [also] a lot of excitement and hype -- that means it can be very hard to separate what's real from what is just pure hype," he said. "The engineering talent here is frickin' mind blowing, but the marketing is just horrific."
However, it's still early days for blockchain. And while hundreds of large organizations are exploring it and running pilots, no one knows how quickly it will take off. But there's significant interest and hype around the technology. If you've struggled to get a grasp of what blockchain might portend for existing infrastructure and applications, you're not alone -- it has confounded many veteran IT experts. Rhodes, who also teaches electronic commerce at New York University, said it took him some time and extensive research to get up to speed.
"I discovered blockchain by reading about directed acrylic graphs in the summer of 2015 after Ethereum was launched," Rhodes said. "I started to read, and I started realizing that there were all these new nouns and verbs and I had no clue what they meant, so I studied literally for six months on and then came together with another colleague, some of you may know Marley Gray, and we started this engine and ourselves down the blockchain road."
Microsoft's extensive blockchain focus came to light for me when I met both of them in May 2016 during the inaugural ID 2020 Summit at the United Nations. That event, and last month's follow-up, put forth the notion that blockchain could someday give everyone a secure and private digital identity, especially the estimated 1.2 billion undocumented individuals -- many of whom are exploited and deprived of basic human rights. At last month's ID 2020 gathering, attended by 300 people, Accenture and Microsoft announced that they have jointly developed a digital biometric identity system based on blockchain.
Microsoft, through its key role in organizing and sponsoring the inaugural ID 2020 event, put the spotlight on how blockchain could become more than just a mechanism for removing intermediaries from financial transactions, a notion that already had the financial services establishment on edge with an urgency to act before upstarts beat them to the punch. Microsoft has quickly won over a large number of well-known banks, financial services firms and exchanges because of its focus on enabling private blockchains based on Ethereum, along with its commitment to supporting other blockchains such as Chain Core, the IBM-Linux Foundation-backed Hyperledger Fabric and J.P. Morgan's open source Quorum in the Azure Marketplace.
"I would tell you that about 50 percent of the investment right now in blockchain-based applications is going into the financial sector," Hajduk said. "Not surprising, [since] this technology came from cryptocurrencies, it does value transfer very well. On the other hand, retail or manufacturing supply chain is an area where there's a lot of participating investment. We also see it in healthcare. It's very important in the public sector as well. So, this is a technology that has broad applicability and has broad interest. It's not really about cryptocurrency -- it's about the shared source of truth and it's about rethinking, reworking and transforming your business. That, folks, is what drives the excitement about distributed ledger and blockchain technology, and that's also why we're here today."
Still, everyone on the panel cautioned that the hype cycle around blockchain has reached fever pitch. "If you look at distributed ledger technology as a whole, the hype is still deafening," Hajduk warned. "I'd say in probably six out of 10 cases [or] seven out of 10 cases, when the engineering team sits down with a customer, they say, 'Oh we really want to do a blockchain project.' When you start to talk about the use cases, most of the use cases will actually not be appropriate for a distributed ledger."
For background on blockchain, see this month's Redmond magazine cover story, "Microsoft's Bet on a Blockchain Future," which explains why blockchain is important to IT pros and developers. Microsoft's blockchain technology is further discussed in an article by Terrence Dorsey, "Inside Microsoft's Blockchain as a Service," while partner efforts are described in the Redmond Channel Partner article, "Microsoft Aims to Eliminate the Middleman with Blockchain."
Posted by Jeffrey Schwartz on 07/14/2017 at 12:28 PM
More than two years after Microsoft revealed plans to offer its Azure Stack software to makers of hybrid cloud-based appliances, Azure Stack is now set for release this September. Azure Stack, which lets enterprises and service providers run their own mirror images of Microsoft's cloud platform in their own datacenters, is a strategic deliverable as the company looks to advance modern IT architectures including hybrid cloud, DevOps and serverless computing.
The first Azure Stack appliances will be available in September from Dell EMC, Hewlett Packard Enterprise and Lenovo, with Microsoft's newest partners, Cisco and Huawei, set to release their offerings by year's end and in the first quarter of 2018, respectively. Microsoft announced the release plans, pricing and the service options at its Inspire conference (formerly known as Worldwide Partner Conference), taking place this week in Washington, D.C.
"We're building out Azure as the first hyper-scale cloud that supports true distributed computing with Azure Stack," said Microsoft CEO Satya Nadella in Monday's opening keynote address.
Some beg to differ as to whether Azure Stack appliances will be "the first" hybrid cloud offerings delivered to organizations, since products from VMware or OpenStack-based appliances may have claimed that turf. But Microsoft argues it brings the software-defined infrastructure offered in Windows Server 2016 (such as Storage Spaces Direct, Hyper-V and support for containers) to a common application development, deployment and systems management model.
"You're writing one set of code, you're updating one set of code, you're deploying one set of code but it is running in two places," said Microsoft Corporate VP Julia White, during a press briefing at Inspire. "In a Visual Studio dropdown, you can select Azure or Azure Stack. It's that simple."
The initial systems will allow customers to provision and manage both IaaS and PaaS workloads via the Azure Portal, effectively choosing Azure Stack as a region. While workloads running in Azure Stack initially are limited, Microsoft officials say they cover the most widely used capabilities in Azure. Among them are Virtual Machines (base VMs and Windows Server), Storage (Blob, Table and Queue) and PaaS offerings via the Azure App Service (including Web Apps, Mobile Apps, API Apps and Functions).
Microsoft said it will continue to push additional capabilities and templates over time; in the short-term pipeline are IoT Hub and Azure Data Services, said Microsoft Senior Product Director Mark Jewett. While Azure Stack doesn't yet support Azure Data Services, customers can run SQL Server in Azure Stack. "We can certainly deliver database as a service," said Jewett.
Jewett and White also pointed to the ability to run the Azure App Service on premises, notably its PaaS services, the common API model and Azure Functions, which lets organizations move to serverless computing. Nadella in his keynote also said he sees serverless computing as the next wave in application development, deployment and management. "Virtualization has been amazing, but now this new era of microservices, containers and serverless [computing] is going to be fundamentally transformative to the core of what we write as applications," he said.
Azure Stack will appeal to those who have data sovereignty requirements under which information can't be stored in the public cloud, edge computing scenarios where connectivity is unavailable or sporadic (such as cruise ships and oil rigs), and those looking to build new cloud-based applications that run on premises or extend existing legacy systems.
While Azure Stack isn't the first hybrid cloud appliance, Microsoft is looking to make the case that it's the first to share a common control plane across on-premises and public clouds. Paul Galjan, senior director of product management for Dell EMC Hybrid Cloud Platforms, agrees. "It is unique," he said. "It fits into a niche in the market that no other software vendor is offering anything quite like it."
Dell EMC, which back in May launched a single-node developer edition of Azure Stack that costs $20,000, will offer appliances that support up to 12 nodes. The initial systems will be based on the company's PowerEdge R730xd servers. Dell EMC will follow shortly with iterations based on its new 14G server platform, announced at Dell EMC World, which will be built using the new Intel Xeon Scalable ("Skylake") processors, officially launched yesterday.
The configurations from Dell EMC, which will range in cost from $100,000 to $300,000, will vary in capacity from an average of 100 to 1,000 Azure D1-class virtual machines, with up to 8TB of persistent storage, according to Galjan.
Microsoft's Azure usage pricing is now set and specific costs can be found here.
Posted by Jeffrey Schwartz on 07/12/2017 at 12:10 PM
Accenture has migrated nearly 75 percent of its 400,000 employees to Windows 10 and is on pace to complete the upgrade next year. It appears to be the largest known Windows 10 migration to date, Microsoft acknowledged this week.
It's not entirely surprising that Accenture has fast-tracked its Windows 10 migration, considering it is the largest independent global IT consulting and systems integration firm and one of Microsoft's closest partners. Accenture also is the largest customer using Microsoft's OneDrive, with 6 petabytes of business data stored in the cloud storage service, tied to Windows 10 and Office 365.
Besides the fact that the two are invested in their Avanade joint venture, both companies are working closely on a number of technology initiatives. Two prominent examples were Accenture's endorsement of Microsoft Teams when Microsoft launched it back in November and more recently their work together on advancing secure identities using blockchain, announced earlier this month.
Still, Accenture's migration is a remarkable example of an enterprise that has taken an aggressive posture toward upgrading so many employees in relatively short order. As Windows 10 reaches its two-year anniversary next month, upgrades are on the rise. But as reported by my colleague Kurt Mackie earlier this week, a survey by Adaptiva found just under half (46 percent) of respondents report having migrated 10 percent or less of their PCs and devices to Windows 10, while 41 percent of participants said they plan to have 51 percent or more of their devices migrated to Windows 10 within the next year.
The Accenture migration is also noteworthy in that the company's workforce is four times the size of Microsoft's. Naturally, the push gives Accenture cover to say it eats its own -- and Microsoft's -- dogfood, to borrow an old phrase. "Not only do we enable our people with the latest technology, we're also setting ourselves up to be a reference for the state of what's possible with Microsoft," Accenture Managing Director Brad Nyers said in a promotional video, embedded in Microsoft's announcement. "It's demonstrating to our clients that we can be a market leader in the adoption of Microsoft technology."
In addition to its massive Windows 10 migration, Accenture is making a big push with Office 365 and Microsoft's Enterprise Mobility + Security (EMS) service, which Microsoft earlier this month ported to the Azure portal while implementing the Microsoft Graph APIs (as detailed in this month's Redmond magazine cover story).
Beyond showcasing its use of the new technology, Accenture CIO Andrew Wilson noted that 75 percent of its workforce are millennials -- a key factor driving the company's move to provide employees with more modern experiences such as Microsoft Teams, which Wilson discussed at the November launch, and to support BYOD, where EMS plays a key role.
"Managing the identity of the user, differentiating between enterprise and between personal use at the application and the data level, we can operate in both a mobile way and a secure way [without] disrupting the user experience at the same time," Wilson said in the above-mentioned video. "We believe we're at the very leading edge of keeping ourselves and our clients totally relevant."
Posted by Jeffrey Schwartz on 06/30/2017 at 9:47 AM
The massive Petya ransomware attack crippled companies and governments across the globe yesterday, putting many workers on the sidelines, thousands of whom were unable to access business-critical files. The attack is similar to last month's WannaCry ransomware attack, which exploited a flaw in Windows Server Message Block 1 (SMB 1). It affects those who didn't apply Microsoft's critical MS17-010 patch issued in March. WannaCry had a kill switch, but there's no known kill switch for the Petya ransomware (also called "NotPetya" by some researchers).
The attack's effect was indeed extensive: it infected more than 12,500 users in 64 countries, including Belgium, Brazil, Germany, Russia and the United States. Microsoft late yesterday posted a detailed account of Petya's technique, which the company described as "sophisticated."
Microsoft said it released updates to its signature definition packages shortly after confirming the nature of the malware. The updates are available in Microsoft's free antimalware products, including Windows Defender Antivirus and Microsoft Security Essentials, or administrators can download the files manually from the Malware Protection Center. The company also noted that the new Windows Defender Advanced Threat Protection (Windows Defender ATP), released with the latest Windows 10 update, "automatically detects behaviors used by this new ransomware variant without any updates."
Experts said this attack is the latest reminder that organizations need more advanced options to protect themselves from becoming victims of ransomware. ISACA's recently released State of Cyber Security report found that 62 percent of those surveyed were attacked by ransomware, yet only 53 percent have any type of formal approach to mitigate it. Moreover, just 31 percent said they routinely test their security controls, and 13 percent never test them.
Organizations need to build better architectures based on zero-trust segmentation, processes (automation, threat intelligence and patch management) and culture and communication, according to a blog post by Forrester Analyst Jeff Pollard. "The more dependent on digital business your revenue is, the higher the likelihood a cybersecurity issue will cripple you," Pollard said.
With this attack and last month's WannaCry incident, security firms are reiterating the following security best practices and guidelines (while also making a case for their own security wares):
- Backup and recovery: In conversations and at conferences, companies such as Acronis, Commvault and Veeam have stressed that merely backing up your data doesn't mean it will be protected from ransomware. The recent release of Acronis Backup 12.5 includes a new feature called Acronis Active Protection, which uses behavioral heuristics to prevent malware from finding its way into a backup in the first place. "We are making sure the ransomware cannot get into our agent and get into our backups," said Frank Jablonski, VP of global product marketing at Acronis.
- Manage privileges: The Petya exploit, like other ransomware variants, requires elevated administrator rights to gain access to systems, said Morey Haber, CTO of BeyondTrust. Organizations with lax privilege management should remove end-user administrator rights and ensure that only digitally signed software is trusted. However, that will only stop the initial infection, Haber warned. "Once the first machine is compromised, administrator rights are not needed to propagate the worm due to the severity of the vulnerability and methods used for exploitation," he said.
- Keep software up to date: In addition to removing administrator rights, Haber said organizations should perform vulnerability assessments and install security patches promptly.
The number of individuals educated on what to do when they receive a suspicious message is surprisingly low, commented Marty Kamden of NordVPN in an advisory released today. "If you encounter a 'Check Disk' message, quickly power down to avoid having the files encrypted by the ransomware," said Kamden. It's also important to know which file to block. "Stop the spread within a network from the Windows Management Instrumentation by blocking the file C:\Windows\perfc.dat from running," he noted. "If such a file doesn't exist yet, create it yourself and make it read-only."
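For illustration, here is a minimal Python sketch of the "vaccination" step Kamden describes, creating the file if it's missing and marking it read-only. It assumes administrative rights on the machine and reflects only the advisory quoted above; it is no substitute for applying the MS17-010 patch:

```python
import os
import stat

# Path cited in the advisory; the ransomware reportedly checks for this
# file and halts if it already exists (the so-called "vaccination").
PERFC = r"C:\Windows\perfc.dat"

def create_vaccine(path: str = PERFC) -> None:
    """Create the file if it doesn't exist, then mark it read-only."""
    if not os.path.exists(path):
        # Creating a file under C:\Windows requires administrator rights.
        with open(path, "w"):
            pass
    # Clear the write bits; on Windows this sets the read-only attribute.
    os.chmod(path, stat.S_IREAD)

if __name__ == "__main__":
    create_vaccine()
    print("Vaccine file in place:", PERFC)
```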
Posted by Jeffrey Schwartz on 06/28/2017 at 2:12 PM
Organizations that now use or are considering the Microsoft Teams chat tool offered with Office 365 business and enterprise subscriptions received both good and bad news last week. The good news is that Microsoft Teams can now integrate with third-party cloud storage and file-sharing services such as Box, Dropbox, Citrix ShareFile and Google Drive. The bad news is that support for guest access, which will allow external users to participate in Microsoft Teams groups, will arrive later than expected. When Microsoft Teams was released in March, the company had targeted adding the guest access capability by the end of June.
The delay in support for adding external users to Teams should be brief, according to Suphatra Rufo, a Microsoft program manager for Microsoft Teams, who provided notification of the delay in a support forum. Although Rufo didn't provide an exact timeframe, she insisted the company still plans to deliver the feature, which will allow groups to add outside contractors, partners, customers and suppliers.
"We just are hitting some technical issues," according to Rufo's explanation. "The next date is not too far from the original June target. This is a top priority, so trust me when I say you will have it soon!"
Several posters on the forum commented that "soon" is too vague for their liking. "I'm about to launch a multisite project about lean manufacturing," according to a comment by someone identified as Gerald Cousins. "Nearly 8 companies, 20+ project leaders to coordinate / inform / communicate. A good opportunity to use Teams. Have you any expected date for availability? So that I can decide to delay ... or to postpone."
Marco, who apparently works for a school district, also was hopeful the delay would be brief. "This is becoming a real problem now. You 'committed to June' and I relied on this and made commitments for it to my customers. There are school migrations planned during July that rely on this feature. School starts again in August. So, will we be able to use Teams with external access or not."
Guest access obviously must be delivered correctly to ensure customers aren't introducing security risks. Based on Rufo's comments, it appears the feature will arrive in a relatively short timeframe, but it's understandable that those who have made commitments feel left holding the bag.
As for the good news, Microsoft is extending the storage and file-sharing options for Teams, which, until now, required organizations to draw from OneDrive and SharePoint. Teams members can now also integrate with Box, Citrix ShareFile and Google Drive. Office 365 admins can configure the individual storage providers in the Office 365 Admin Center, according to the announcement posted by Katharine Dundas, a Microsoft senior product manager for Office 365.
"By bringing content from Box into Teams, organizations can share their files more easily and collaborate on projects in real time, all while keeping their content securely managed in Box," said Jon Fan, a senior director for product management at Box, in his post announcing the integration with Microsoft Teams.
Ross Piper, VP of enterprise at Dropbox, added that integrating Office 365 and Microsoft Teams with Dropbox, set to be available next month, will make it easier to find, share and gather feedback in a chat without having to leave the conversation. "Once the integration is authorized by an administrator, users will be able to add Dropbox folders to a channel," according to Piper's announcement. "From there, they'll be able to upload files to conversations, and create Office files directly on a shared Dropbox folder in Teams."
Posted by Jeffrey Schwartz on 06/26/2017 at 11:39 AM
While Amazon's surprise deal last week to acquire Whole Foods for $13.7 billion is poised to upend the entire grocery and retail industry, as well as how suppliers distribute their goods, it also could affect which cloud service providers large retailers, distributors and goods suppliers use. Microsoft is in the middle of a battle that has already emerged in the wake of last week's deal, with rivals concerned about supply chain visibility and loath to enrich a fierce competitor attacking their margins.
Whole Foods, which has 462 high-end grocery stores throughout North America and Europe, is already a large Microsoft Azure customer. Meanwhile, Amazon's most formidable rival in the retail industry, Wal-Mart, which has a storied history of using its IT purchasing clout to its advantage, is among several large retailers that have told IT providers they can't use solutions dependent on AWS if they want the retailers' business, according to a report in the Wall Street Journal this week.
Wal-Mart, which keeps the bulk of its data on premises, does use Azure and other providers when running some cloud-based services and acknowledged that there are cases when it pushes for alternatives to AWS, according to the report. Amazon reportedly responded that such conditions amount to bullying and are bad for business.
Wal-Mart reportedly did approach cloud service provider Snowflake Computing at one point. However, Snowflake's data warehousing service currently runs only in AWS. In an interview late last year, CEO Bob Muglia hadn't ruled out supporting Azure and other clouds in the future if there's a business case to do so, but at the time it didn't appear to be a priority.
Ironically, Muglia is a former longtime Microsoft executive and president of the company's server and tools business, and was on the team that launched Azure back in 2010. Even more ironic is Whole Foods' existing commitment to Azure, described in a November case study published on Microsoft's Web site. In the midst of a five-year plan to move all of its infrastructure and software to a SaaS model running in Microsoft Azure, Whole Foods has already deployed Microsoft's Enterprise Mobility + Security (EMS) service and has migrated 91,000 employees from Active Directory running on premises to Azure AD Premium, which gives all of them single sign-on access to 30 SaaS applications.
It goes without saying that should the deal go through, the role of Azure at Whole Foods will surely diminish. Microsoft has the most to lose, though IBM and Oracle have formidable retail clientele as well, many of which are using their emerging cloud offerings. Nevertheless, AWS and Microsoft Azure are considered to have by far the largest cloud infrastructure service portfolios, datacenter footprints and industry leadership, according to Gartner's latest annual Magic Quadrant report.
For its part, Microsoft has invested significantly in targeting solutions for retailers, distributors and consumer goods suppliers. Microsoft has its Retail Experience facility in Redmond, Wash., which I saw two years ago. The facility offers partners and customers who visit a glimpse of the broad advances and investments Microsoft has made in offering retailers and wholesalers new capabilities, showcasing what the company is working on in areas such as automation, IoT, machine learning and new ways of managing payments.
The deal to acquire Whole Foods, by far the largest Amazon has ever made, raises new questions about what retail experiences will look like in the coming years. Though not surprising, the move shows that Amazon CEO Jeff Bezos is willing to spend whatever he feels is necessary to win. And it's possible Amazon could find itself having to raise its bid: Whole Foods shares have traded slightly above the agreed-upon price of the all-cash deal on belief, or speculation, that a rival suitor might top Amazon's offer.
Whatever Amazon's determination to pull off the deal at any price, the reality is that the giant online retailer now has its sights on bricks-and-mortar distribution in a big way (it is already experimenting with several bookstores), which could bring a new wave of business disruption.
While it's impossible to predict how this will play out and what moves will follow, I think it's a reasonable bet that Microsoft won't try to one-up Amazon and acquire a retailer such as Wal-Mart. Certainly, let's hope not.
Posted by Jeffrey Schwartz on 06/23/2017 at 2:02 PM
Lenovo has rebooted its entire datacenter portfolio with what it described as its largest and broadest rollout of new and refreshed hardware, introducing 26 new servers, storage and networking products and a new line of engineered appliances and hyper-converged systems. At a one-day event in New York City for hundreds of customers, analysts and press, Lenovo's top brass declared that the company intends to extend its footprint in datacenters and become the leading supplier of high-performance and supercomputing systems.
The company's event was more than just a large rollout of datacenter, client and workstation products; it was an assertion that Lenovo is gunning to outpace rivals Cisco, Dell EMC and HPE. Such competitive chest thumping is of course common, and Lenovo's assertion is ambitious considering its rivals currently have much broader and more modern datacenter portfolios (and, consequently, greater market share).
Nevertheless, Lenovo officials at yesterday's event noted that the company will deliver its 20 millionth server next month. Best known for taking over IBM's struggling PC business 12 years ago and achieving market leadership, Lenovo is relatively new to the datacenter: Big Blue again decided to offload its commodity x86 server business to Lenovo in January 2014 for $2.3 billion, a deal that was completed a year later.
While Lenovo has rolled out various upgrades since and has signaled its plan to extend its datacenter footprint, yesterday's Lenovo Transform event was the kickoff of a strategy that brings together new products and revamped development, manufacturing, distribution, marketing and service capabilities. "We are going to disrupt the status quo and accelerate the pace of innovation -- not just in our legacy server solution but also in the software-defined market," said Kirk Skaugen, president of Lenovo's Data Center Group.
Skaugen, a former senior executive at Intel who was tapped late last year to lead Lenovo's datacenter business, believes Lenovo has some distinct advantages over Cisco, Dell EMC and HPE -- notably that it doesn't have those companies' legacy businesses. "We don't have a huge router business or a huge SAN business to protect," he said. "It's that lack of legacy that's enabling us to invest and get ahead of the curve on this next transition to software-defined. You are going to see us doing that through building our internal IP, through some significant joint ventures [and] also through some mergers and acquisitions through the next several quarters."
Another key advantage, he emphasized, is that Lenovo manufactures its own systems. Bold as Lenovo's statements yesterday might sound (the company also wants to be the top player in supercomputing within the next several years), it has the resources to disrupt the status quo if it can execute. "I've never seen a big, bold statement from Lenovo on the datacenter side," said Patrick Moorhead, principal analyst with Moor Insights and Strategy, during a brief interview right after the keynote presentation. Moorhead, who said he needs to drill deeper into the roadmap, noted that Lenovo has been building toward this focus for over a year. "They've thrown down the gauntlet and are definitely at the table," he said.
Moving away from IBM's System x brand, Lenovo launched two new brands: ThinkSystem and ThinkAgile. ThinkSystem is a broad portfolio of new servers, storage and network switches that will roll out with this summer's release of Intel's new Xeon Scalable platforms.
The new rack-based ThinkSystem offerings include the SR950, a 4U system targeted at mission-critical workloads such as transactional systems, ERP and in-memory databases, and scalable from two to eight processors. The denser SN850 blade server compute node is designed for the company's Flex System chassis, while the SD530, the company's high-performance computing entry in the 2U4N form factor, is designed for its new D2 chassis. Also added to the ThinkSystem line is the new DS Series, an entry-level to midrange storage offering available in all-flash and hybrid SAN configurations.
ThinkAgile is what Lenovo describes as its software-defined infrastructure line, consisting of engineered systems targeting modern hybrid cloud workloads, including hyper-converged systems based on platforms from Microsoft, Nutanix and VMware. Lenovo's planned Azure Stack appliance will fall under the ThinkAgile portfolio and will be called the ThinkAgile SX for Microsoft Azure Stack.
Both the ThinkSystem and ThinkAgile portfolios are based on Lenovo's new systems management platform, XClarity Controller, which the company said offers a modern and intuitive interface that can provide centralized configuration, policy and systems management across both.
Lenovo officials said that while the company plans to accelerate the release of new products and partnerships, the company has made some key operational changes over the past year that will give its datacenter group better focus. Among the changes, Skaugen said the company has moved to a dedicated sales and marketing organization. "In the past, we had people that were shared between PC and datacenter," he said. "Now thousands of sales people around the world are 100 percent dedicated end-to-end to our datacenter clients."
Skaugen added that Lenovo now has a dedicated supply chain and procurement organization, and has brought in new leadership that can focus on various technology and industry segments. Lenovo has also revamped its channel organization. A year ago, Lenovo's datacenter group had five different channel programs around the world. "We now have one simplified channel program for dealer registration," he said. "I think our channel is very, very energized to go out to market with Lenovo technology across the board. And we have a whole new set of system integrator relationships [and] a whole new set of partnerships to build solutions together with our system integrator partners."
Moorhead said the moves were long overdue. "While I think Lenovo should have done this two or three years ago, right after they chewed up IBM's xSeries business, these moves should help them become more successful," he said.
Unfortunately for Lenovo, despite some operational miscues in the past, it picked up the IBM xSeries business at the peak of the market, Charles King, principal analyst with Pund-IT, observed in a research note. Lenovo acquired IBM's x86 business in January 2015, when global server sales had grown just 2.3 percent, even though the x86 systems market had been growing at twice that rate, according to IDC. King noted that two years later, the global server market declined 4.6 percent to $14.6 billion in the fourth quarter of 2016.
"While Lenovo was working to integrate IBM's System x x86 systems and personnel with its own strategies, products and company culture, it was also navigating a notable decline in hardware sales and revenues," King noted.
Now that Lenovo has rebooted, King said that despite the company's posture that it lacks the legacy baggage of some rivals, its success in the PC business means it would be unwise to underestimate its ability to extend its datacenter footprint over time. "It would be a mistake to assume Lenovo isn't fully ready and able to take its efforts in datacenter solutions and sales to the next level."
Posted on 06/21/2017 at 1:35 PM
Accenture today demonstrated a prototype of a technology it developed with Microsoft that allows individuals to create digital identities based on the Ethereum blockchain protocol using biometric authentication. The prototype, demonstrated during the second ID2020 Summit at the United Nations, showed how an individual can create a digital identity on a blockchain tied to a biometric interface, such as a fingerprint or facial recognition, while maintaining control of it.
The demo was based on a recent effort by Accenture, which tapped Microsoft and their joint venture Avanade, to provide a biometric identity system for the UN High Commissioner for Refugees (UNHCR), which has already enrolled more than 1.3 million refugees from more than 75 countries and hopes to support more than 7 million by 2020.
David Treat, director of Accenture's blockchain practice, showed attendees of the ID2020 Summit how an undocumented refugee could create his identity and update it throughout his life with information such as birth, banking, health, education and career records, along with any information needed to authenticate a given transaction. The user enrolls his credentials using Accenture's Unique Identity Service Platform, which uses Microsoft's Azure Blockchain as a Service to provide and share identity attributes based on permissions defined by the user.
The Ethereum blockchain is suited to giving individuals control over their personal data because it's based on a "permissioned," distributed peer-to-peer architecture. Accenture and Microsoft recently showcased blockchain's potential to give the estimated 1.2 billion people throughout the world who now lack any form of identification a digital fingerprint, an example of how the ID2020 working group within the UN sees partnering with the IT community as a positive trend. The goal is to find a global identity solution by 2020 that could be implemented by 2030, as defined in 2015 by the UN's Sustainable Development Goals. "This is going to be a long haul. It's not something we are going to solve overnight," said Dakota Gruener, ID2020's executive director. The first ID2020 Summit gathered last year at the UN, and the group intends to continue working on its mission, Gruener said.
Treat said the technology is designed to connect with existing identity systems and is based on the recently announced Decentralized Identity Foundation, a consortium led by Accenture, Microsoft, RSA and a number of blockchain startups aiming to create "an open source decentralized identity ecosystem for people, organizations, apps and devices."
Accenture's biometric registration capability has been in the field for three years. "What we did was make sure that it's scalable and runs well on our cloud, and then add this consumer-owned identity piece," said Yorke Rhodes, Microsoft's global strategist for blockchain. Over time, Rhodes said, blockchain offers the potential to give users control over how their identities are used by Google, Facebook and LinkedIn, as well as in Active Directory.
"If you look at a lot of the problems associated with identity, there are hacks associated with honeypots," Rhodes said. "So, the ideal world is you can get away from that by not actually pulling together all this data into these large databases."
Accenture's Treat said the pilot discussed actually amounts to a simple use of blockchain. "It lets you leave the data where it is, and lets you use blockchain to capture that identifier, index that information, use it where it is and create that linkage between disparate sources."
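To make that idea concrete, here is a minimal Python sketch of the hash-and-pointer pattern Treat describes: the identity record stays in its source system, and only a content hash plus a location hint would be recorded on the chain. The record fields and registry URI are hypothetical; the hashing is standard SHA-256:

```python
import hashlib
import json

def identity_pointer(record: dict, location: str) -> dict:
    """Build an on-chain pointer for an off-chain identity record.

    Only the hash and the location hint would be written to the
    blockchain; the record itself stays where it is.
    """
    # Canonicalize the record so the same data always yields the same hash.
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
    return {"sha256": digest, "location": location}

# Hypothetical example: an enrollment record held by an aid agency.
record = {"given_name": "Asha", "attribute": "education", "issuer": "UNHCR"}
pointer = identity_pointer(record, "agency-registry://case/12345")
print(pointer)

# Later, anyone holding the record can verify it against the on-chain hash.
assert identity_pointer(record, pointer["location"])["sha256"] == pointer["sha256"]
```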
Posted by Jeffrey Schwartz on 06/19/2017 at 1:33 PM
Microsoft this week unveiled plans to bring Windows Server into its semi-annual update release cycle starting this fall, alongside a new Nano Server image configuration with a 50 percent smaller footprint. The company is stripping the infrastructure roles from Nano Server to optimize the headless configuration option for deploying container-based environments. Microsoft revealed the Nano Server changes during last month's Build conference in Seattle, along with other features coming to Windows Server, including Linux container support and plans to move the server OS to the new semi-annual cycle; on Thursday the company made those plans official and provided some additional details.
Only those opting for the newly minted semi-annual channel can implement any of the new technical features planned for Windows Server, which include the pending stripping down of Nano Server. In addition, organizations will be required to have Software Assurance coverage to access the semi-annual channel releases. The revamp of Nano Server is noteworthy because, leading up to its release last year, Microsoft had touted the minimal-footprint deployment option for Windows Server 2016 for its suitability for large clusters in Web-scale application and datacenter environments. However, Microsoft has since found that the vast majority of Nano Server deployments, from a workload perspective, are running container-based applications based on Docker, Kubernetes and others. Since container-based workloads do not require the infrastructure components, Microsoft determined that removing them would result in a more efficient server environment and advance the move toward containers.
"Nano Server will be optimized as a container runtime image and we will deprecate the infrastructure roles," said Chris Van Wesep, Microsoft's director of enterprise cloud product marketing, "so for anybody who had wanted to do smaller footprint compute and storage clusters, Server Core will be the right implementation to do that." By deprecating the infrastructure features in the Nano Server option, the removal of that code will make way for Microsoft's new .NET Core 2.0, "which enables customers to use more of their code in more places [and] make Nano Server the best option for new container-based development," said Erin Chapple, general manager for Windows Server, in a blog post announcing the new release options.
Microsoft is recommending Server Core for hosting virtual machines as well as containers, which Chapple said can run Nano Server or Linux container images. The Windows Server update this fall will support Linux workloads via extended Hyper-V isolation, which will allow Linux containers to run without having to deploy two separate container infrastructures for Linux- and Windows-based applications. As previously announced, Microsoft is also bringing the Windows Subsystem for Linux (a.k.a. the Windows Bash component) to Windows Server, allowing application administrators and developers to use common scripts and container images for both Linux and Windows Server container hosts, according to Chapple.
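For a sense of what that container-host model looks like in practice, here is a brief sketch using the Docker SDK for Python to run a Nano Server-based container. It assumes a Windows container host with Docker installed; the image name reflects Docker Hub naming in use at the time of writing, and tags vary by release:

```python
import docker  # pip install docker

client = docker.from_env()

# Image name is illustrative; the Nano Server base image and its tags
# vary by Windows Server release.
output = client.containers.run(
    "microsoft/nanoserver",          # Nano Server as a container runtime image
    ["cmd", "/c", "echo", "hello"],  # trivial command to run inside the container
    remove=True,                     # clean up the container afterward
)
print(output.decode())

# On a host with Hyper-V isolation enabled, a Linux image could run side
# by side; isolation is selected on the Windows Docker daemon, e.g.
# `docker run --isolation=hyperv ...` from the CLI.
```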
Collectively, these technical changes to Windows Server and the continuous release cycle option associated with it are part of Microsoft's strategy to bring more consistency to the server OS and Azure. The changes also promote the development of modern cloud apps and migration of legacy apps and systems to these environments using container images. "Many customers don't realize that Server Core is the base image that runs Azure and Azure Stack," Chapple noted. "This means the investments we make in Windows Server for Azure can be made available to customers to use in their own datacenters. This makes Server Core a great choice for Azure customers who like the idea of consistent technologies between their datacenter and the cloud. One example of this in the upcoming feature update is the cluster sets functionality for increased scale of hyper-converged deployments. We're also continuing to add security investments such as the ability to quickly encrypt network segments on their software-defined networking infrastructure per their security and compliance needs. You can expect new features to continue to optimize efficiency, performance and security for software-defined datacenters."
Server Core will also play a key role with the modernization of applications, Van Wesep emphasized. "One of our big pushes for next year is going to be around getting folks that have traditional .NET applications to drop those into containers running on Windows Server 2016, potentially even moving them into Azure," he said. The new features will only be available to those opting for the new semi-annual channel, which will require Microsoft Software Assurance or Azure cloud subscriptions.
Microsoft explained how the new semi-annual channel release update will work. The company will offer new feature updates every spring and fall, the same model it recently moved to for Windows 10, Office 365 ProPlus and System Center. Microsoft will offer Windows Server previews shortly before the final release via its Windows Insiders for Business program, which is now open to those who want to sign up. Each semi-annual release will come with a pilot availability period of three to four months, and once the software is deemed stable, Microsoft will lock it down into a "Broad" release. Those releases will carry the Windows Server name with no year attached, instead using the Windows 10 versioning model. The first release, scheduled for September, will be called Windows Server 1709, denoting the year and month. Chapple noted that the semi-annual channel feature updates are cumulative, containing previous updates. The semi-annual channel feature updates will get combined for Microsoft's long-term servicing channel releases, a servicing model conceived for environments that can't tolerate change.
The long-term servicing channel will include five years of mainstream support, five years of extended support and the option for six years of Premium Assurance. Van Wesep acknowledged that the long-term channel will be the most common in the near term. "I don't imagine that the vast majority of the people will come out of the gates and say this is our new model and we will wholeheartedly switch to this -- that would be naïve," he said. "I think there has been enough demand and feedback from customers that they want a more active way of consuming our software that there will certainly be a meaningful size of the installed base that will start looking at this and working to adopt it. I can see a scenario where every customer would find both channels compelling for different parts of their organization."
Indeed, many organizations may be resistant to such a model, and Van Wesep acknowledged that many existing applications and systems don't lend themselves to a continuous release update model. But as many organizations look to transform their business processes or models over time, that can change. "This is on us to do the education process," Van Wesep said. "People need to start thinking about the layers of abstraction between the app, the OS and the underlying hypervisor/fabric. It all can be thought of independently and should be."
Posted by Jeffrey Schwartz on 06/15/2017 at 12:27 PM
Microsoft is giving a significant boost to its support for Cloud Foundry, the popular open source platform for building and deploying cloud-based applications. The company announced yesterday that it has joined the Cloud Foundry Foundation and is offering extended tools and integration between Azure and the popular PaaS architecture.
Cloud Foundry is quickly emerging as a DevOps platform of choice among enterprises looking to develop applications in any language and deploy them in container images on any supported infrastructure. Conceived originally by SpringSource, which was acquired by VMware in 2009, the Java-oriented Cloud Foundry project was later spun off into Pivotal. Pivotal contributed to the Cloud Foundry open source project while offering its own commercial stack.
In addition to Pivotal, among its principal stakeholders are the rest of the Dell Technologies family including Dell EMC and VMware. Cisco, Google, IBM, SAP and Suse are also among those who have made strategic bets on Cloud Foundry. Microsoft first announced support for Cloud Foundry two years ago with the release of service broker integration and a cloud provider interface to provision and manage Cloud Foundry in Azure via an Azure Resource Manager template. Microsoft added Pivotal Cloud Foundry to the Azure Marketplace last year.
A growing number of enterprises have found Cloud Foundry appealing because of its ability to scale and support automation in multiple hybrid and public cloud environments. GE, Ford, Manulife and Merrill are among a number of large Microsoft customers using Cloud Foundry as their cloud application platform, noted Corey Sanders, Microsoft's director of Azure Compute. Sanders announced Microsoft's plan to become a Gold sponsor of Cloud Foundry at the annual Cloud Foundry Summit, taking place this week in Santa Clara, Calif. While Microsoft already offers some support for Cloud Foundry in Azure, the company is making a deeper commitment, Sanders explained yesterday in a blog post.
"Cloud Foundry on Azure has seen a lot of customer success, enabling cloud migration with application modernization while still offering an open, portable and hybrid platform," Sanders noted. Microsoft's extended Cloud Foundry support in Azure includes integration with Azure Database (PostgreSQL and MySQL) and cloud broker support for SQL Database, Service Bus and Cosmos DB. Microsoft has also added the Cloud Foundry CLI in its Azure Cloud Shell, which Sanders said will provide "easy CF management in seconds."
Sanders outlined some other key integration points and tools that will enable deeper support for Cloud Foundry running in Azure. Among those, as described by Sanders in his blog post:
- Azure Cloud Provider Interface - The Azure CPI provides integration between BOSH and the Azure infrastructure, including the VMs, virtual networks and other infrastructural elements required to run Cloud Foundry. The CPI is continually updated to take advantage of the latest Azure features, including support for Azure Stack.
- Azure Meta Service Broker - The Azure Meta Service Broker provides Cloud Foundry developers with an easy way to provision and bind their applications to some of the most popular services, including Azure SQL, Azure Service Bus and Azure Cosmos DB.
- Visual Studio Team Services plugin - The Cloud Foundry plugin for Visual Studio Team Services (VSTS) provides rich support for building continuous integration/continuous delivery (CI/CD) pipelines for CF, including the ability to deploy to a CF environment from a VSTS-hosted build agent, allowing teams to avoid managing build servers.
- Microsoft Operations Management Suite Log Analytics - Integration with Log Analytics in OMS allows users to collect system and application metrics and logs for monitoring CF applications.
Microsoft, Google and Red Hat are among those working to build support into their service brokers to provide a single and simple means of providing services to native cloud software and SaaS offerings based on Cloud Foundry (as well as OpenShift and Kubernetes). The resulting Open Service Broker API project, announced in December, aims to provide a single interface across multiple application and container service platforms. While Microsoft announced support for the API at the time, Sanders said it is formally joining that group, which includes Fujitsu, GE, IBM and SAP.
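To illustrate the shape of that interface, here is a minimal Python (Flask) sketch of the Open Service Broker API's catalog endpoint, the call a platform such as Cloud Foundry makes to discover a broker's services. A real broker also implements provision and bind endpoints and validates the API version header; all service names and IDs below are hypothetical:

```python
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

# A minimal Open Service Broker API catalog. A real broker also implements
# PUT /v2/service_instances/... for provisioning and binding, and checks
# the X-Broker-API-Version request header; this sketch shows only the catalog.
@app.route("/v2/catalog")
def catalog():
    return jsonify({
        "services": [{
            "id": "example-sql-service-id",   # hypothetical identifiers
            "name": "example-azure-sql",
            "description": "Illustrative database service entry",
            "bindable": True,
            "plans": [{
                "id": "example-basic-plan-id",
                "name": "basic",
                "description": "Single database, basic tier",
            }],
        }]
    })

if __name__ == "__main__":
    app.run(port=8080)
```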
"Working with this group, I hope we can accelerate the efforts to standardize the interface for connecting cloud native platforms," Sanders said, promising that will also ensure application portability across platforms and clouds. Microsoft will host a webinar on July 20 that describes how to create applications using Cloud Foundry on Azure, including examples of existing customer implementations.
Posted by Jeffrey Schwartz on 06/14/2017 at 12:34 PM
Microsoft today is showcasing its aspiration to broaden the reach of analytics and to bring new ways to integrate and visualize operational data. Kicking off the Microsoft Data Insights Summit in Seattle, the company announced the general availability of its Power BI Premium offering, unveiled last month. Microsoft also revealed forthcoming Power BI capabilities that it said will add more drill-down options and more advanced ways of querying data using conversational input.
Since the release of Power BI three years ago, Microsoft has fast-tracked the addition of new tools, integration capabilities and scale to its self-service visualization offering. Power BI is now used by 2 million people, who create 30,000 data models in the Power BI service every day, said James Phillips, corporate VP of Microsoft's Business Applications, Platform and Intelligence (BAPI) organization, speaking in the opening keynote.
Phillips said Power BI is integrated, in the form of "content packs," with thousands of software and SaaS offerings using its common API. "If you go back three or four years, there were reasonable questions about Microsoft's commitment to the business intelligence market," Phillips said. "I think if you look at the levels of investments that we've made, the progress that we've made, the growth of our user population, I think those questions are far behind us."
Microsoft believes it will change the economics of delivering visualized operational analytical data from any data source. The release of Power BI Premium comes with a new licensing model that allows for the distribution of reports enterprise wide, as well as externally to business partners and customers. The cost of embedding Power BI Premium into apps starts at $625 per month.
Power BI Premium is offered in a variety of capacity, virtual core and memory sizes, depending on the requirements of the application. Microsoft posted a Power BI Premium calculator to help customers estimate the configurations they'll need and the monthly costs. Microsoft touts that Power BI's scale comes from its deployment throughout the Azure footprint, while the new premium offering also includes an on-premises Power BI Report Server. In addition to Power BI Premium customers, those with SQL Server Enterprise Edition and Software Assurance (SA) can deploy the Power BI Report Server via their entitlements.
Microsoft Technical Fellow Amir Netz also took the stage during the opening keynote to announce new offerings coming to Power BI Premium this summer that will allow customers to embed Power BI into their applications. He also demonstrated how Power BI Premium allows key Microsoft applications to generate visualizations. Among them was the ability to embed Power BI reports into SharePoint collaboration projects "that are fully dynamic and fully updateable that you can distribute to everybody," Netz said. In addition to working with SharePoint, Netz said Power BI works with the new Microsoft Teams collaboration workspace application, allowing members of a team to add Power BI reports. "All you have to do is go to the channel, and pick one of the workspaces you want it to work with and just like that I have a Power BI report embedded into Microsoft Teams," he added. Also demonstrated was a forthcoming Microsoft Visio custom visual, which the company will release next month, though a preview is now available.
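For developers, the embedding story ultimately rests on the Power BI REST API. As a rough sketch, the Python snippet below lists the reports available to an account along with their embed URLs, which a host application such as SharePoint, Teams or a custom app loads in an embedded frame; acquiring the Azure AD access token is omitted here and assumed to be handled separately:

```python
import requests  # pip install requests

# An Azure AD access token for the Power BI API is assumed to have been
# acquired already (e.g., via an AAD app registration); details omitted.
ACCESS_TOKEN = "<aad-access-token>"

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/reports",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Each report carries an embedUrl that a host application can load
# in an embedded frame.
for report in resp.json().get("value", []):
    print(report["name"], report["embedUrl"])
```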
Netz also described new capabilities, enabled through Microsoft's Power Apps offering, that will allow users to act on data generated from Power BI Premium; users will be able to write back directly to Power BI. The keynote also showcased the new Quick Insights feature, which will allow advanced predictive analytics via Power BI Desktop, letting people gather answers to queries.
Posted by Jeffrey Schwartz on 06/12/2017 at 2:05 PM
Microsoft was among the numerous tech companies and leading businesses across all industries that spent the last few months urging key officials in the Trump administration not to withdraw from the Paris Climate Agreement. President Trump's decision last week to pull out was widely and sharply criticized by IT and business leaders, as well as elected officials at the federal, state and local levels.
For its part, Microsoft was a champion of the agreement from the outset and viewed last week's decision as a key setback. "We believe climate change is an urgent issue that demands global action," Microsoft CEO Satya Nadella tweeted on Thursday just after the president announced plans to withdraw. "We remain committed to doing our part."
Brad Smith, Microsoft's president and chief legal officer, delivered a strong rebuke of the president's announcement. "We are disappointed with today's decision by the White House to withdraw the United States from the landmark, globally supported Paris Agreement on climate change," Smith wrote in a LinkedIn Pulse blog post.
"Continued U.S. participation benefits U.S. businesses and the economy in important and multiple ways," Smith added. "A global framework strengthens competitiveness for American businesses. It creates new markets for innovative clean technologies, from green power to smart grids to cloud-enabled solutions. And by strengthening global action over time, the Agreement reduces future climate damage to people and organizations around the world. We remain steadfastly committed to the sustainability, carbon and energy goals that we have set as a company and to the Paris Agreement's ultimate success. Our experience shows us that these investments and innovations are good for our planet, our company, our customers and the economy."
Trump argued that the agreement would cost the U.S. economy $3 trillion in lost GDP and 6.5 million jobs, while slashing annual household incomes by $7,000 or more. "Not only does this deal subject our citizens to harsh economic restrictions, it fails to live up to our environmental ideals," Trump said. "As someone who cares deeply about the environment, which I do, I cannot in good conscience support a deal that punishes the United States -- which is what it does -- the world's leader in environmental protection, while imposing no meaningful obligations on the world's leading polluters."
A number of major news organizations reported that many of Trump's claims weren't accurate, notably the vast number of jobs that would be lost, the economic impact and the potential for blackouts and brownouts. Smith noted that Microsoft spent several months with administration officials, impressing upon them the importance of the agreement.
On the eve of the decision, Microsoft joined other key technology providers, including Adobe, Apple, Facebook, Google, Hewlett Packard Enterprise, Intel and Salesforce.com, in running full-page ads in the New York Times, New York Post and The Wall Street Journal. The ad, presented in the form of a letter to Trump, argued that sticking with the agreement would strengthen U.S. competitiveness, create new jobs with providers of clean energy and reduce long-term risks to businesses from environmental damage.
Indeed, Microsoft, now one of the largest datacenter operators in the world, has long endorsed efforts and participated in initiatives aimed at reducing the need for carbon-based energy. Microsoft said five years ago it was 100 percent carbon-neutral, and it implemented an internal carbon fee. Since imposing that fee, Microsoft reported in November 2016, it has reduced carbon dioxide emissions by 8.5 million metric tons, purchased more than 14 billion kilowatt hours of green power and supported more than 7 million people through its community projects globally, all of which align with the United Nations (UN) Sustainable Development Goals.
Microsoft was a strong proponent of the Paris Agreement, ratified in early November of last year, and within two weeks participated in the launch of the ICT industry's SMARTer2030 Action initiative, joining technology companies, businesses and policy makers to support the Paris Agreement goals "through public awareness, education and mobilization."
The company's position on the withdrawal from the Paris Agreement, as described by Smith, drew mixed reaction in the comments section of his post. Many praised Microsoft's defense of the agreement, while others criticized Smith, saying they shared Trump's view that it wasn't good for America. "Microsoft supports global socialism," Mark Elert, a senior account manager at U.S. Internet, commented. "Apparently Microsoft doesn't actually believe in real science," added Greg Renner, director of information systems at Covenant Services Worldwide.
Another critic, Pat Esposito, a SharePoint and Office 365 consultant who has contributed to Redmond magazine, offered a more measured response, offering Trump the benefit of the doubt. "Let's see what Trump's next move is," Esposito commented. "If he invests the $3B allocated for the accord back into U.S. green initiatives, perhaps Microsoft and others will follow suit. I say develop the models for success and then bring them to the rest of the world to follow."
I asked him via e-mail why he didn't support the agreement. "Economically, our money will go further spent internally than diluted across the many countries in the accord," he responded. "Lack of binding enforcement, non-guaranteed financing and the fact that several studies indicate even with the plan as configured, it will not have a positive impact. There are other ways we can and must demonstrate a commitment to this world crisis. Only if Mr. Trump chooses to do nothing should we start calling him out... he's the president, we have to give him a chance to perform."
Posted by Jeffrey Schwartz on 06/05/2017 at 12:02 PM
Microsoft is rebuilding its Skype client communications interface "from the ground up," and it's now available for those with Android phones and planned for iOS devices soon. Windows and Mac versions are slated for release over the next few months. Skype's new look is both cosmetic and functional as it offers a platform for intelligent bots that can let people use it to search for information, products and services.
The revamp applies to the free Skype consumer client that's also now included with Windows. The move is an ambitious push to pick up more mainstream user support by bringing modern communications services to the app. The wholesale rebuilding of Skype fits with Microsoft CEO Satya Nadella's goal to bring conversational computing to the mainstream. However, standing out among options from the likes of Amazon, Apple, Google and Facebook, and even apps that are popular with millennials such as Snapchat, will be challenging.
Images of the new Skype show an entirely new modern interface, which the company proclaims will deliver a vast improvement in how people communicate. The new Skype puts chat "front and center," Microsoft said Thursday when it announced the release of the new interface to Android users.
"We want to help you deepen connections within your personal network," read yesterday's announcement by Microsoft's Skype team. "There's only one of you in this world, so now you can show off your personal style by customizing Skype with your favorite colors. When in a conversation, you should always make sure your voice is heard, or more specifically, your emoticon is seen."
The new Skype interface lets users share their feedback during a chat session or video call by tapping on a reaction icon. Skype also offers a new Highlights section that lets users create a "highlight reel" of their day with photos and videos. Users can send a Highlight directly to select contacts or groups. Microsoft wants people to use Skype as their canvas to share experiences and communicate more expressively with friends, families and defined groups.
But Microsoft wants people to use Skype for more than simple chat, voice and video communications. The new Skype offers add-ins and bots to provide an "infinitely searchable" tool. In one example, a StubHub bot helps find seats for an event, including seating options and pricing. An Expedia bot lets users find travel options, with other bots and add-ins forthcoming.
Since its release a day ago, Microsoft has shared some known issues on its support page. Among them: incoming calls default to the speakerphone, so those wanting to switch to earpiece mode should tap on the speakerphone icon. Skype doesn't yet allow users to receive SMS messages within the app. The translator function isn't available, and voicemail is currently restricted to those with Skype phone numbers; voicemail for Skype-to-Skype calls isn't yet supported, Microsoft said, but users can sign in to other Skype clients to receive those messages.
Perhaps the biggest known complaint is that those using the new Skype client with existing phone numbers aren't showing up as being online for other Skype users, an issue Microsoft said on the support page it is resolving. In the meantime, contacts who search for you using your full name in Skype will find you. Until Microsoft resolves the issue, the support page offers a workaround.
It will be interesting to see if the new interface broadens the use of Skype in a crowded field of communications and chat options. But, for sure, this isn't your father's Skype. If and how these features are brought over to Skype for Business also remains to be seen, but the team will undoubtedly monitor the new consumer interface to see what works.
Posted by Jeffrey Schwartz on 06/02/2017 at 10:50 AM
Microsoft is building on its Azure Site Recovery (ASR) service with a new disaster recovery option intended to ensure customers can restore Azure virtual machines running in its public cloud IaaS offering. The new service, released today as a technical preview, will let customers replicate IaaS-based applications in Azure to a different region of their choice without having to deploy additional software or appliances on top of existing ASR subscriptions.
While Microsoft emphasizes Azure's high availability, compliance requirements stipulate the need for a disaster recovery solution that can provide business continuity. Ensuring data is adequately protected requires a simplified approach to replicating data to an alternate region, according to Microsoft Principal Program Manager Rochak Mittal.
"This new capability, along with Azure Backup for IaaS virtual machines, allows you to create a comprehensive business continuity and disaster recovery strategy for all your IaaS based applications running on Azure," Mittal noted in today's announcement.
Microsoft is adding new features to its disaster recovery service on the heels of forging tighter ties with key suppliers of data protection software. Among them is Commvault, which claims it was the first to offer a Windows-based data protection offering 17 years ago, and has built its own DR as a service based on Microsoft Azure.
Randy De Meno, chief technologist for Commvault's Microsoft partnership, said he's not concerned about Microsoft's ASR, noting his company offers a broad set of archiving, Office 365 protection, e-discovery and migration capabilities that go beyond core data replication. "We are going to drive more Azure consumption and give customers an enterprise-wide solution," he said during a meeting yesterday at the company's Tinton Falls, N.J. offices.
Key Microsoft execs have given a nod to some of these key partnerships. In a promotional video released today, Microsoft Corporate VP Steve Guggenheimer said: "We're partnering together to help those customers move to the cloud, in a way that helps take advantage [of what] Microsoft has built in Azure."
Earlier this month, a number of key Microsoft executives gave talks at Veeam's annual gathering of customers and partners in New Orleans, including a keynote by Azure CTO Mark Russinovich and a session by Principal Program Manager Jeff Woolsey. "Azure Site Recovery is just replicating virtual machines out to Azure," said Clint Wycoff, a technical marketing evangelist at Veeam, during an interview at the VeeamON conference. "It's somewhat complex to set up, and a lot of customers and partners have had challenges around its complexity and consistency."
Scott Woodgate, director of Microsoft Azure Management and Security, begged to differ. "This is the simplest disaster recovery or business continuity configuration ever, in particular because it's all within Azure; there's no need to worry about configuring the interface to your organization's firewalls," Woodgate said during a brief interview. "It's virtually a simple wizard. Many of the other vendors are offering applications running in virtual machines, where I as the end user still need to manage and patch and update and secure that application. Azure Site Recovery is actually a SaaS service, so Microsoft does all that work for you, which reduces the overall cost of ownership versus the VM-based solutions."
The new DR features added to Azure Site Recovery offer redundancy without requiring additional on-premises infrastructure, allowing administrators to enable cross-region DR simply by selecting the VMs and a target location, according to Mittal. By offering the DR function as a service, customers avoid the need to deploy, monitor, patch and maintain on-premises DR infrastructure, he added.
Administrators can enable cross-region DR by selecting a VM, choosing the target Azure region and creating replication settings. Customers can choose how to orchestrate failovers and determine the required recovery-time and recovery-point objectives. The cross-region preview is available in all regions where Microsoft offers ASR.
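For those curious what that looks like outside the portal wizard, here is a heavily simplified Python sketch of enabling Azure-to-Azure replication through the Azure Resource Manager REST interface. The vault path segments, API version and request body are illustrative placeholders rather than the exact ASR contract, which should be checked against Microsoft's REST reference:

```python
import requests  # pip install requests

# Illustrative sketch only: the request body is simplified, and the real
# contract should be verified against the ASR REST documentation.
TOKEN = "<arm-access-token>"  # Azure AD token for ARM; acquisition omitted

# Scope of a Recovery Services vault; all segment names are placeholders.
VAULT_SCOPE = ("/subscriptions/<sub-id>/resourceGroups/<rg>"
               "/providers/Microsoft.RecoveryServices/vaults/<vault>"
               "/replicationFabrics/<fabric>"
               "/replicationProtectionContainers/<container>")

def enable_cross_region_dr(vm_id: str, item_name: str, target_region: str):
    """Sketch of enabling replication of one Azure VM to a second region."""
    url = (f"https://management.azure.com{VAULT_SCOPE}"
           f"/replicationProtectedItems/{item_name}"
           "?api-version=2017-07-01")  # placeholder API version
    body = {"properties": {"providerSpecificDetails": {
        "instanceType": "A2A",          # Azure-to-Azure replication
        "fabricObjectId": vm_id,        # resource ID of the source VM
        "recoveryAzureRegion": target_region,
    }}}
    resp = requests.put(url, json=body,
                        headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return resp.json()
```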
Posted by Jeffrey Schwartz on 05/31/2017 at 7:56 AM
Former Secretary of State General Colin Powell is best known for his tenure as the nation's top diplomat and for his role as chairman of the Joint Chiefs of Staff during Operation Desert Storm a quarter century ago. But he told an audience of several thousand IT pros yesterday that he's no luddite when it comes to enterprise technology and cybersecurity.
During a one-hour keynote address yesterday closing out this week's Citrix Synergy conference in Orlando, Fla., Powell shared his IT credentials and his encounters with cybersecurity challenges over the years. Powell also emphasized the importance of strong leadership and the need to address issues of immigration and cultural diversity, issues quite germane to any IT organization.
While Powell steered clear of the current investigations into charges that the Russians hacked e-mail systems in an effort to steer the outcome of last year's presidential election, he did weigh in on the private e-mail server that Democratic nominee Hillary Clinton used when she served as Secretary of State, and how it brought attention to his own use of personal e-mail in the same top diplomatic role.
"Hillary had a problem," Powell said, which stirred extended laughter and applause. "They came after me. They said: 'If Hillary did it, Powell must have done it.' So, they chased me around. And they subpoenaed my AOL account and I said 'go ahead, be my guest.' And they looked at it and discovered there was nothing there."
While he didn't dwell on Clinton's e-mail server during the talk, Powell gave this view: "Hillary really didn't do anything wrong," he said, a position he has shared in the past. "It was not done well, but there was no criminal intent or criminal act there, and that's what ultimately was discovered. But what we have to make sure of from now on is that we manage these systems the proper way."
An 80-year-old decorated military general and former diplomat who served four U.S. presidents might seem an unlikely candidate to offer insights on the state of cybersecurity and IT management today, but Powell pointed to why the audience should take him seriously. Early in his military career, the U.S. Army told Powell it was sending him to graduate school, though not for a master's in foreign policy or political science. Instead, he was directed to get an MBA in data processing.
"I went there, and I graduated two years later, had a straight-A average, and of the 5,300 people here, this morning, I am probably the only one left, who knows how to program in Fortran, Cobol and to deal with 80-column punch card machines," he said.
"The reality is, as most of you I hope know, that a lot of what you have done and mastered over the years came out of the military," he added. "I was on the DARPAnet [Defense Advanced Research Projects Agency Network, the building block of today's modern Internet] 40 years ago, and information and technology and communications have always been an essential part of my military career of my life and any success that I have had always rested on the ability to move information around rapidly, effectively and make sure it gets to where it's supposed to be."
Powell said the highlight of that came more than two decades ago during the Persian Gulf War, when he and Field Commander General Norman Schwarzkopf Jr. had to move 500,000 troops from the U.S. to Saudi Arabia for what became known as Operation Desert Storm. "When it was time to issue the order to start the conflict, we had one of the most perfectly secure means of doing it," Powell recalled. "Not something you might think of now, but it was a fax machine. And by using that secure fax machine, I knew the order only went to one person. It wasn't a cable that could spread around all over the organization. So, cybersecurity was always a major part in war planning."
At the same time, Powell recognized that if security was too tight, critical information might not reach key people in the field, stymying an effective outcome. Or worse, it could have put soldiers in harm's way. "I didn't want our security to be too tight ... and those down at the lower level weren't going to get information they needed to get because it was secure, it was secret. And I found early in my career and during this period of Desert Storm that we have to always triage information. There's a lot of information to give out unclassified because no enemy can react fast enough to it. It was a tactical situation."
That would come into play a decade later, in 2001, when Powell became Secretary of State under President George W. Bush. "One of the challenges we are all facing now, and I faced when I became Secretary of State, is how to make sure you have an information system that is getting the information to where it has to be, when it's needed in order to be actionable. And make sure you are not cluttering the whole system by overclassifying things."
When Powell took over the State Department, one of his first actions was to make sure the 44,000 employees all had Internet-connected computers on their desks. While employees locally and in every embassy throughout the world now had connectivity, the systems were internally secured.
"That would allow you to send e-mails to the guy next door but you couldn't get the Internet outside to see what was going on in the rest of the world," he said. "I had to change that, so I got new software and hardware and then I had to change the brain-ware. I had to change the thinking within the State Department."
Powell did that by sending unclassified messages from his AOL account to State Department employees. "They knew Secretary Powell was liable to send an e-mail, so they said 'I better have my system on.' Now there [was the question] -- should you be using your AOL account that way? It was unclassified information [and] I had a secure system when I was dealing with secure material."
Nevertheless, he saw it as a critical step toward encouraging better communications at the time. "All I had to do was grease the wheels, grease the engine. I had to make sure these people understood the power of the network, the power of the information systems, and the only way to do that was for me to lead and set an example and get it going," Powell said.
Indeed, that worked, though it came back to haunt him when Hillary Clinton's use of personal e-mail dominated her entire presidential campaign.
Posted by Jeffrey Schwartz on 05/26/2017 at 1:12 PM0 comments
Citrix Systems is developing a secure browser that it will host in the Microsoft cloud. The new Citrix Secure Browser Essentials, set for release by year's end, will allow IT organizations to present desktop images to users regardless of whether they run any of the company's VDI or app virtualization offerings.
Citrix's new secure browser, designed to isolate corporate desktop images and data from personal information and apps, is among a number of new wares Citrix revealed at this week's annual Synergy conference, taking place in Orlando, Fla. Citrix and Microsoft are working together to deliver the new secure browser, which Citrix will make available in the Azure Marketplace. The secure browser will offer a version of Citrix Receiver and a new analytics service, and is the next step in the broad Microsoft-Citrix pact to build the Citrix Cloud on Microsoft Azure.
"The browser itself needs a lot of protection and we will be delivering it with Microsoft," said Citrix CEO Kirill Tatarinov, in the opening keynote session. Tatarinov is the former longtime Microsoft senior executive who took the reins of Citrix early last year and quickly reached out to his former employer to extend their work together.
PJ Hough, Citrix's senior VP for product and also a long-time Microsoft exec who had worked on the Office 365 team before joining Tatarinov last year, said at this year's Synergy that the Citrix Secure Browser Essentials will isolate public Internet browsing from access to enterprise applications and resources. "It's going to be a great isolated browsing experience for customers who want to separate the corporate browsing they do from other applications and experiences on the device," Hough said.
"It's not only separating corporate from personal on the device, it's actually taking the corporate image and putting it up in the cloud," said Brad Anderson, Microsoft's corporate VP for enterprise mobility and security management, who joined Hough on stage during the keynote session at Citrix Synergy.
Certainly, it's not the first time Citrix or others have released a secure browser, but the fact that it's Azure-hosted, and that it can isolate personal from corporate data and apps on any user-owned device, makes it a good way to introduce those who don't use Citrix or virtual clients to the company's offerings.
"The potential here is since it's hosted in Azure, there's opportunity to protect the apps and data even further," said Mark Bowker, an Enterprise Strategy Group analyst. "Microsoft is a big target from threat vectors, and having [the browser] on Azure can give it the opportunity to provide an even higher level of protection just due to what they see out on the Internet."
Hough said the new Citrix Secure Browser Essentials will arrive by year's end and will be available in the Azure Marketplace, with pricing starting at $180 per year (with a three-year subscription for a minimum of 50 subscribed users).
Citrix Receiver for Windows 10 S
Citrix and Microsoft are also working to deliver a release of the Citrix Receiver client for the new Windows 10 S operating system, aimed at rendering traditional Win32 desktop apps.
Windows 10 S, which Microsoft announced earlier this month, is a locked-down version of the OS, meaning it will run only Universal Windows Platform (UWP) apps and tools available in the Windows Store. The new Citrix Receiver for Windows "opens the door for the Win32 apps to run on Windows 10 S," according to a description posted by Vipin Borkar, a director of product management at Citrix. He added that it will provide a way for organizations that want the Windows 10 S UWP experience but also need specific Win32 apps or environments not likely to find their way into the Windows Store, such as Google's Chrome browser.
Hough said it should be available in the Windows Store "any day."
Citrix also said it is working with Microsoft to develop a threat analytics service that can pull all of the telemetry from its XenDesktop and XenApp solutions to address advanced security threats. The Citrix Analytics Service will offer continuous monitoring that uses the telemetry of users, devices, applications and networks to detect anomalies that may signal malicious activity, and will offer specific responses to prevent an attack.
The plan to offer the Citrix Analytics Service, which will run on Azure as part of the Citrix Cloud, comes as Microsoft is in the process of rolling out its own Windows Defender Advanced Threat Protection service. Since the Citrix Cloud runs in Microsoft Azure, it's reasonable to presume the two companies are exploring a number of integration points, including using Azure Machine Learning and the Microsoft Security Graph, as well as extending the work they're doing to tie the Citrix platform to Microsoft Enterprise Mobility + Security (EMS).
Meanwhile, Hough and Anderson pointed to the deliverables announced at last year's Synergy conference, among them the ability to run Citrix XenDesktop Essentials and XenApp Cloud Essentials in hybrid environments on Windows Server 2016 and Microsoft Azure, and the integration of Microsoft's EMS service and Intune mobile application management functions with Citrix XenMobile. "Citrix has taken all their apps and Intune-MAM-enabled them," Anderson said. "IT professionals get one common management paradigm for managing all of their apps. And that translates to a much easier user experience because users have all of this working underneath one policy as one. It just flows a lot easier for them."
Posted by Jeffrey Schwartz on 05/24/2017 at 1:56 PM0 comments
When data protection provider Veeam last week held its annual gathering of customers and partners, the company outlined how the next release of its availability suite will offer near-real-time recovery, provide integration with a much wider range of storage hardware and use Microsoft Azure and Amazon Web Services for disaster recovery services. Veeam also said it will provide extended support for Office 365.
As reported last week, Veeam Availability Suite v10 will offer continuous data protection (CDP) with recovery point objectives of 15 minutes. Co-CEO Peter McKay, speaking during the opening keynote of the company's VeeamOn conference in New Orleans, described v10 as a "major extension" of the platform, which will widen its ability to back up and recover physical systems -- which still account for an estimated 25 percent of datacenter resources -- as well as Linux and Windows endpoints, covering PCs, personal mobile devices and IoT-based hardware with embedded software. A new API will allow connectivity to substantially more NAS storage offerings, and support for native object storage will reduce long-term retention costs. But the other key focus at last week's conference was the rapid growth of cloud infrastructure, both hybrid and public services.
"Seven percent of the company's customers already have a cloud-first strategy when it comes to IT," said Paul Mattes, vice president of Veeam's global cloud group, according to a recent survey. Mattes, who gave a keynote in the second day of the conference, cited IDC's projection that 46 percent of IT spending will be in the cloud. "This is happening at a pace much greater than anything we've seen before in the history of IT," said Mattes, who joined Veeam last fall after a long tenure with Microsoft and its Azure team. "The cloud is being adopted at a rate that's much faster than anything we've seen, and that's because of the business opportunity and agility that cloud technology has afforded. So Veeam has embraced this aggressively [and] we are going to continue to deliver solutions that are cloud focused."
Several years after Veeam released its Cloud Connect interface, which supports Amazon Web Services and Microsoft Azure as targets, the v10 release will bring CDP technology that creates much deeper integration with the two companies' large public clouds, among others. While Veeam gave many of its key alliance partners ample airtime during the three-day gathering, including Microsoft, VMware, Cisco, NetApp, IBM and Hewlett Packard Enterprise, the company emphasized the deeper relationship it has formed with Microsoft. "Our Microsoft partnership has been one of the strongest and deepest partnerships over the years and we are going to take that further and deeper," McKay said.
As part of last week's new pact, Veeam and Microsoft said they are working together to offer Veeam Disaster Recovery in Microsoft Azure, which will provide automated availability of business-critical workloads by delivering DR as a service in Azure. The new offering will tie together Veeam's Direct Restore tool and the new Veeam Powered Network (PN), a tool that the company said will automate the setup of DR sites in Azure. "What the Veeam Powered Network is about is delivering a free lightweight tool that you can use to quickly set up a DR site in Azure using an easy-to-configure, software-defined networking tool," Mattes said. It doesn't require a VPN and will be offered to organizations of all sizes.
Also coming is native support for Microsoft's Azure Blob cloud object storage service. Through its Scale-Out Backup Repository (SOBR), the company said it will treat cloud object storage as an archive tier, allowing customers to retain Veeam backups in Blob storage, a lower-cost option for retaining data. The Veeam Management Pack will also tie in with Microsoft's Operations Management Suite (OMS), a Web-based service that provides views of system and multicloud resources.
Veeam also said it is upgrading its recently launched Veeam Backup for Office 365. The protection tool will provide extended support for Exchange Online and will bring SharePoint Online and OneDrive for Business into the mix, providing support for hybrid SharePoint environments. The new release will provide multi-tenancy and support for multiple repositories, which will allow its network of 15,000 cloud service providers to deliver more secure, scalable and broader options. It will also include added automation with a new PowerShell SDK and a RESTful API. "We will deliver full PowerShell support for creating jobs and modifying your organizations and adding infrastructure components to fully automating the recovery of e-mail items," said Mike Resseler, director of product management, who demonstrated the new Office 365 features. Resseler noted that the new RESTful API will interface with existing automation tools.
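Veeam hasn't yet published the final surface of that SDK and API, but job automation over a REST interface generally follows a familiar pattern: authenticate, enumerate jobs, then act on one. The Python sketch below illustrates that pattern; the host, port, token route and /jobs resource are placeholders for illustration, not Veeam's documented endpoints.

```python
import requests

BASE = "https://vbo-server.example.com:4443/api"  # placeholder host and base path

# Authenticate -- assumes an OAuth2-style password grant, a common REST pattern.
auth = requests.post(f"{BASE}/token",
                     data={"grant_type": "password",
                           "username": "admin@example.com",
                           "password": "<password>"},
                     verify=False)  # lab only; use a trusted certificate in production
headers = {"Authorization": f"Bearer {auth.json()['access_token']}"}

# List the configured backup jobs, then start the first one.
jobs = requests.get(f"{BASE}/jobs", headers=headers, verify=False).json()
job_id = jobs[0]["id"]
requests.post(f"{BASE}/jobs/{job_id}/start", headers=headers, verify=False)
print(f"Started job {job_id}")
```

The same operations could presumably be scripted through the PowerShell SDK Resseler describes; the REST route simply makes the automation reachable from whatever tooling a service provider already runs.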
In addition to Microsoft Azure, Veeam said it has invested in a company that will allow it to provide native and agentless protection of AWS EC2 instances, along with some of Amazon's other offerings, including Aurora, Redshift and RDS. The company inked a partnership with N2WS, which offers what it claims is the industry's first cloud-native, agentless backup, recovery and DR software that can recover AWS applications, moving them to other AWS accounts or regions, as well as into hybrid multicloud environments. During a session after the keynote, Rick Vanover, Veeam's director of product marketing, demonstrated what he described as a mock scenario (since the capability isn't available yet) in which data backed up in AWS was moved over to Azure. "I showed the functionality that the Connector will have," he noted in a follow-up e-mail. "It will pick up AWS snapshots and write in a repository of Veeam."
Posted by Jeffrey Schwartz on 05/22/2017 at 1:00 PM0 comments
Veeam this week explained how it plans to fulfill its goal of extending its popular virtual machine protection platform to offer near-real-time availability of mission-critical systems running in hybrid environments. The company outlined its ambitious plans, which center around the forthcoming Veeam Data Availability Platform 10.0, at its VeeamOn gathering of 3,000 customers and partners in New Orleans.
The deliverables, set to start rolling out toward the latter part of this year, will fill some key gaps in the company's systems availability portfolio, such as providing continuous data protection (CDP) and protection of physical servers, cloud instances, SaaS-based data including Office 365, and Linux and Windows endpoints. The 10-year-old company, which built its brand and a reputation as a leader in protecting VMs, signaled last summer its intent to make its data protection offering viable for large global enterprises. At this week's conference, the company introduced the Veeam Availability Suite 10.0 and supporting software aimed at building on its footprint.
The deliverables Veeam outlined this week include:
- Continuous data protection of tier-1 and mission-critical applications, with RPOs of 15 minutes, supporting hybrid, multicloud, hosted and SaaS environments.
- Disaster recovery in Microsoft's Azure public cloud, bringing Veeam's Direct Restore and the new Veeam Powered Network (PN) to automate the DR process.
- Agentless data protection of data running in Amazon Web Services (AWS).
- An agent that can protect Windows-based physical servers and endpoints, along with applications running in Microsoft Azure, AWS and other public clouds.
- Native object storage support using Amazon S3, Amazon Glacier, Microsoft Azure Blob and any S3/Swift compatible storage.
- Extended Office 365 protection including OneDrive for Business and SharePoint Online.
- A new Universal Storage API framework that will add IBM, Lenovo and Infinidat to Veeam's list of storage partners, joining Cisco, Dell EMC, Hewlett Packard Enterprise (HPE), NetApp and hyper-converged systems provider Nutanix.
In addition to broadening beyond its core competency of protecting VMs, the company said it is transforming itself with a new brand and new software licensing in line with the growing trend toward subscription-based pricing, while providing closer integration with key cloud providers. "We created a great business in the last 10 years. But now the game is changing completely because customers are moving to a multicloud world, so we have to adapt," said Ratmir Timashev, the company's cofounder, during an interview at VeeamOn. The company, whose revenue last year reached $607 million, up from $474 million in 2015, aims to take in $800 million this year and exceed $1 billion in 2018.
Timashev last year determined that to reach those lofty growth goals, the company needed to move beyond protecting VMs and change its business model of offering perpetual software licenses, as customers increasingly demand cloud-based subscription models from IT providers. To do so, he brought in Peter McKay, a senior executive from VMware's client virtualization group, as co-CEO, putting him in charge of operations and growing the company. McKay, who joined Veeam last July, has quickly jumped into filling some of those technology gaps and broadening development efforts to focus on the shift to cloud-based infrastructure and application services.
"We dominate the virtualized world, but we needed to do physical and we needed to be more aggressive in the cloud space and so a lot of these announcements do just that," McKay said in an interview at VeeamOn. In order to become more aggressive in the cloud, and challenge the established data protection providers, McKay said he quickly determined that providing near-real-time recovery with CDP was a key priority. "That is huge," McKay said. "It's probably one of the most exciting announcements we have, if not the most exciting, and that changes the game in the market for us."
Posted by Jeffrey Schwartz on 05/19/2017 at 1:25 PM0 comments
VMware is bringing its new Horizon Cloud virtual app and desktop as a service to Microsoft Azure later this year. Azure will be only the second public cloud available to Horizon Cloud customers to date, and by far the largest.
With Horizon Cloud on Azure, which VMware announced on Tuesday, customers will have another option for running managed apps and desktops in Microsoft's public cloud when the service launches in the second half of this year. Rival Citrix Systems, which is set to hold its annual Synergy technology conference in Orlando, Fla. next week, launched a virtual client Windows-as-a-service offering hosted on Microsoft Azure nearly a year ago as part of a broad partnership between those two companies.
While VMware last year announced plans to offer cross-cloud management services on Microsoft Azure and other large public clouds, the plan to add Horizon Cloud to the mix gives the burgeoning service broader infrastructure flexibility, said Courtney Burry, VMware's senior director of marketing.
"We often see customers requiring fully cloud-hosted desktops and, while they can take advantage of what we have available today with Horizon Cloud and IBM SoftLayer, obviously lots of customers are moving toward Azure," Burry said. "We want to make sure we support those customers with the ability to manage that Azure infrastructure and manage those applications through that common cloud control plane and take advantage of those different deployment models."
Burry said among the notably appealing benefits of Microsoft Azure are the availability of sub-hourly billing and its global datacenter footprint. Customers running Horizon Cloud in Azure will also be able to use some of its other attributes, such as federated access into Azure Active Directory and a common management interface.
A customer will have the option to connect Azure infrastructure with the Horizon Cloud control plane, letting VMware manage desktops, apps and the entire infrastructure through that control plane. "Unlike our IBM model, in which a customer would come and buy the infrastructure and the desktops and apps through VMware, this provides customers with the flexibility that they have when using Azure today," said Burry.
Horizon Cloud, the outgrowth of the company's Horizon Air virtual desktop service, was launched earlier this year as part of VMware's Workspace One portfolio, initially supporting IBM's SoftLayer as its only public cloud provider. Customers can also run Horizon Cloud on the VxRail hyper-converged infrastructure from VMware's parent company Dell EMC, as well as hyper-converged infrastructure from Quanta and Hitachi Data Systems.
VMware launched Horizon Cloud with the goal of upping the performance and functionality of virtual clients by offering them on a common backplane. Horizon Cloud provides a common platform for managing virtual clients, devices and system images with common monitoring and reporting and service updates. In addition to the Horizon DaaS, Horizon Cloud includes the new Horizon Apps, which delivers published SaaS and mobile apps to the workspace.
Horizon Cloud's new Just-in-Time Management Platform (JMP) can offer real-time app delivery, rapid provisioning of virtual desktops and contextual policy management, according to the company. VMware also has touted Horizon Cloud's new Digital Workspace Experience with BEAT (Blast Extreme Adaptive Transport), its new UDP-based network link designed to optimize user experiences regardless of network conditions, making it suitable for low-bandwidth, high-latency and high-packet-loss networks.
Horizon Cloud is designed to let organizations provision fully featured PC desktops and applications running either solely in the public cloud, or on hyper-converged infrastructure that can scale to the public cloud.
Initially, the Azure-hosted Horizon Cloud service will support the virtual desktop offering, with the app service set to follow. Burry also said the release of the Skype for Business option for Horizon Cloud, announced last fall, is imminent.
Posted by Jeffrey Schwartz on 05/17/2017 at 5:12 PM0 comments
As the list of major apps joining the Windows Store continues to grow, albeit incrementally, Microsoft scored another coup this week, announcing that Apple's iTunes, including its music and app stores, will be available in the Windows Store by year's end.
Microsoft has struggled to get big names into the Windows Store but a number have jumped on board including Facebook and Spotify. In addition to iTunes, Microsoft announced that SAP's Digital Boardroom and the popular Autodesk Stingray 3D gaming and real-time rendering engine were being added to the Windows Store. Stingray isn't the first Autodesk modern app to join the Windows Store. Autodesk Sketchbook was added last year and Microsoft reported it's seeing 35 percent sales growth each month for the app this calendar year so far.
The release last month of Windows 10 Creators Update, announced last fall, appears to have improved prospects for Microsoft's Universal Windows Platform (UWP), though it's far from achieving critical mass. Microsoft is also bringing Office 365 to UWP. The cover story for this month's issue of Redmond magazine offers some analysis of the impetus for the latest Windows 10 release and the company's "mixed reality" tooling to motivate a growing number of developers to build UWP apps and make them available in the Windows Store.
In his monthly Redmond Windows Insider column, Ed Bott questioned whether there's enough incentive for the broad cross-section of software publishers and developers to dedicate resources to UWP. "It doesn't help that Microsoft has pivoted its mobile platform more often than the New England Patriots backfield," Bott writes. "Just ask a longtime Microsoft developer about Silverlight. And be prepared to duck."
Terry Myerson, Microsoft's executive VP for Windows and Devices, announced at Build this week a number of efforts to motivate developers. Among them are .NET Standard 2.0 for UWP and XAML Standard, slated for release later this year, which Myerson said will provide a more simplified and modernized code base. Myerson remains encouraged.
"There has never been a better time to bring your apps to the Windows Store," he wrote in his blog post, where he claimed monthly in-app purchases in the Windows Store have doubled. Myerson also noted that Microsoft will deliver complete UWP functionality to Visual Studio Mobile Center this fall, which will include automated build support and an extended number of Windows devices in its test cloud.
Posted by Jeffrey Schwartz on 05/12/2017 at 12:41 PM0 comments
Dell EMC is refreshing its compute, storage and networking offerings with a broad portfolio of modernized infrastructure designed to underpin hybrid cloud datacenters of all sizes. At the core of the company's new lineup of datacenter offerings, outlined this week at Dell EMC World in Las Vegas, is an upgraded version of the flagship Dell EMC PowerEdge servers, the first developed by the newly merged company.
The company kicked off the datacenter portion of the conference with the launch of its PowerEdge 14G servers (due out this summer), which are tied to the release of Intel's next-generation Xeon processors, code-named "Skylake Purley." It's the first refresh of the PowerEdge server line in three years and, in keeping with any refresh, the new systems offer the typical boosts in feeds and speeds. And while the PowerEdge refresh will appeal to anyone looking for the latest servers, the release is also the key component of the entire Dell EMC converged and hyper-converged systems portfolio, as well as new purpose-built appliances and engineered systems.
In addition to a new line of tower and rack-based servers, the PowerEdge 14G will be the core compute platform for the forthcoming Azure Stack system and a new portfolio of datacenter tools, including a new release of its NetWorker data protection offering and upgrades to the VxRail 4.5, VxRack and XC Series engineered systems (supporting Windows Server, Linux and VMware, among others). "This is our 14th generation of servers, which is actually the bedrock of the modern datacenter," said David Goulden, president of Dell EMC, during the opening keynote session.
The new PowerEdge 14G servers will be available for traditional datacenter applications as well as Web-scale, cloud-native workloads. Among the key upgrades Dell EMC will deliver in the new PowerEdge server line are increased app performance and response times. The company claims the servers will offer a 19x boost in Non-Volatile Memory Express (NVMe) low-latency flash storage, single-click BIOS tuning that will allow for simplified and faster deployment of CPU-intensive workloads, and the ability to choose from a variety of software-defined storage (SDS) options. "We knew we had to accelerate the workloads. We had to reduce the latency to make sure we have handled the performance to transform peoples' businesses," said Ashley Gorakhpurwalla, president of the Server Solutions division at Dell EMC. The server's new automatic multi-vectoring cooling allows a greater number of GPU accelerators, which the company claims can increase the number of VDI users by 50 percent.
In addition to the performance boost, company officials are touting a more simplified management environment. The servers will support the new OpenManage Enterprise console and an expanded set of APIs, which Dell EMC said will deliver intelligent automation. The company described the new OpenManage Enterprise as a virtualized enterprise system management console with a simple user interface that supports application plugins and customizable reporting. A new Quick Sync feature offers server configuration and monitoring on mobile devices. The console boasts a 4x improvement in systems management performance over the prior version and can offer faster remediation with ProSupport Plus and SupportAssist, which the company claims will reduce the time to resolve failures by up to 90 percent.
Dell EMC has also added noteworthy new security capabilities embedded in the hardware, which the company said offer new defenses including SecureBoot, BIOS recovery, signed firmware and an iDRAC RESTful API that conforms to the Redfish standard. The servers also offer better protection from unauthorized access-control changes, with a new System Lockdown feature, and a new System Erase function ensures all data is wiped from a machine when it's taken out of commission.
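Because Redfish is an open DMTF standard, a conformant iDRAC can be queried the way any Redfish service is: start at the standardized service root and walk the Systems collection. Here's a minimal Python sketch of that generic pattern; the management address and credentials are placeholders, and the exact properties returned will vary by implementation.

```python
import requests

IDRAC = "https://192.0.2.10"   # placeholder management address
AUTH = ("root", "<password>")  # placeholder credentials

# The Redfish service root is standardized at /redfish/v1/.
root = requests.get(f"{IDRAC}/redfish/v1/", auth=AUTH, verify=False).json()

# Follow the standard Systems collection and print basic health and state.
systems = requests.get(f"{IDRAC}{root['Systems']['@odata.id']}",
                       auth=AUTH, verify=False).json()
for member in systems["Members"]:
    system = requests.get(f"{IDRAC}{member['@odata.id']}",
                          auth=AUTH, verify=False).json()
    print(system.get("Model"),
          system.get("PowerState"),
          system.get("Status", {}).get("Health"))
```

The appeal of a standards-conformant API is exactly this: the same script works against any vendor's Redfish-compliant management controller, not just Dell EMC's.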
The new PowerEdge servers underpin a number of other key datacenter offerings announced by the company this week. "Our new 14G servers will be built into our full Dell EMC product portfolio, bringing out our seventh generation of storage and data protection as well," Goulden said. The servers will be offered with a variety of the company's new software-defined enterprise storage systems, including a new version of the Dell EMC ScaleIO software-defined storage (SDS); upgrades to the company's Elastic Cloud Storage (ECS) platform, including the ECS Dedicated Cloud Service for hybrid deployments and ECS.Next, which will offer upgraded data protection and analytics; and its new Project Nautilus SDS offering for storing and streaming IoT data. The servers will also power Dell EMC's new Ready Node portfolio, designed to transition traditional datacenters into cloud-scale infrastructure.
In addition to storage, Dell EMC said the PowerEdge 14G will power the company's new Open Networking switches, including what the company claims is a top-of-rack switch offering more than 2x the in-rack throughput of traditional 10GbE switches, a unified platform for network switching, and a new line for small and midsize organizations.
It will also run Dell EMC's forthcoming Azure Stack appliance, announced last week and on display on the show floor. I will file a separate post with more information on that offering, which isn't set to ship until the second half of this year.
Posted by Jeffrey Schwartz on 05/10/2017 at 2:14 PM0 comments
When Microsoft introduced its Surface Laptop last week, the company boldly promised it would "reset" the mobile PC category. Besides some innovative mechanical engineering, an impressive high-resolution PixelSense display that renders 3.4 million pixels and a lightweight, thin form factor, company officials were especially proud of the battery life the Surface Laptop is poised to achieve: 14.5 hours when continuously running video.
Most people take such battery life claims with a grain of salt, a point I raised with the lead engineer for the Surface Laptop at last week's launch event in New York City. The engineer, who requested his name not be used, seemed to take exception to my skepticism of such best-case claims. With the device a year in the making in collaboration with Intel, the engineer was emphatic that the Surface Laptop's battery life will prove impressive.
First off, he emphasized improvements in Intel's 7th Generation Core processor and the work the two companies have done to ensure that the new Windows operating system and the engineering applied to the Surface Laptop deliver long battery life. Second, the team looked at its previous efforts, where engineers used telemetry from previous versions of the operating system, the Edge browser and Office.
"Architecturally we took a slightly different approach to developing the Surface Laptop in that we deliberately load-switched almost all of the subsystems to optimize those subsystems for when we need to bring them up and power them down," the engineer explained. "From the beginning there was a conscious effort to prolong battery life, increase connected standby time and off state power to minimize it."
The fact that the battery pack has four equal cells is also a key factor, he added. "What's nice about that, is there are two serial and two parallel cells, which optimizes battery life because every cell works exactly the same way. You don't leave a lot of capacity on the table, and over the life of the battery, you have less aging."
As with the Surface Pro hybrid laptop PCs, the battery in the new laptop can't be swapped out. Many users of the Surface Pro 3 reported poor battery life on certain models, a problem attributed to some bad batteries. Customers with expired or no extended warranties were out of luck. Will those who purchase the new Surface Laptop have better luck? The engineer was pretty confident that they will, and that Microsoft has learned a lot about optimizing battery life since then.
"I do understand the issue that you may have had with the claims versus reality," he admitted. "We've done a lot of work over the last couple of years to make sure that the claims match the experience a lot more so. A lot of work went into getting to the 14.5 hours. We wouldn't have claimed it if we hadn't validated it with numerous SKUs and multiple lots, and a substantial number of devices."
Besides the battery life, I asked what else will help the Surface Laptop justify its premium price over top-tier Ultrabooks from key OEMs. "Look at the thickness of this device, the fitting of the motherboard underneath that keyset and key travel, the vapor chamber design, the heat pipes underneath and the spreading of that heat," he said as he showed me the system. "And then we vent all of the heat out of the back, so there's no exposure. And if the fan drives all of the exhaust air out the back, we actually have a real challenge in that we hold the touch temperatures to a really low temperature."
It all sounds impressive, but the heat will be on Microsoft if the Surface Laptop doesn't deliver the superior battery life promised.
Posted by Jeffrey Schwartz on 05/08/2017 at 10:29 AM0 comments
Today marks the two-year anniversary of Microsoft's first Ignite conference in Chicago, where the company revealed plans to offer Azure Stack -- the same software that runs its public cloud -- and unveiled the technical specifications allowing customers and service providers to run iterations of Azure in their own datacenters. While the company's vision for Azure Stack changed last year after the release of the first technical preview, Microsoft has signaled the software will appear in the latter half of this year, and there are now signs it will soon see the light of day.
Dell EMC offered a key indicator today that Microsoft is on track, introducing a single-node server for developers and a 4-node converged system aimed at dev and test, both of which will appear when Microsoft officially releases the software later this year. Dell EMC is one of four top datacenter infrastructure providers with which Microsoft has engineering and co-development pacts to offer Azure Stack-certified systems. In addition to Dell EMC, Microsoft is working with Cisco, Hewlett Packard Enterprise (HPE) and Lenovo, all of which have indicated they'll offer Azure Stack-based systems and have demonstrated prototypes over the past few months.
Following an interview with Dell EMC officials, it's clear that the company is taking a different approach to its Azure Stack systems than the Cloud Platform System running Windows Azure Pack (WAP). The CPS was introduced by Dell prior to its merger with EMC. Now that the two companies are combined, EMC's influence on the Azure Stack platform is apparent. Today's announcement comes in advance of its Dell EMC World gathering, which will take place next week in Las Vegas, where officials plan to emphasize hybrid cloud.
Since the completion of the Dell-EMC merger last summer, the server, network and storage infrastructure product groups have consolidated at EMC's headquarters in Hopkinton, Mass. The development of its Azure Stack appliances is now based on EMC's approach to hybrid cloud hardware, said Kevin Gray, Dell EMC's director of product marketing for hybrid cloud platforms.
"Our first enterprise hybrid cloud was based on the VMware vRealize Suite and all their software defined datacenter components and the virtualization," Gray said. "We build integrations to IaaS layer, things like backup and encryption as a service, and we're extending that approach and model to Azure Stack. We are leveraging the partners we've had with Microsoft and the expertise we both have with hybrid cloud."
In addition to its vRealize-based offering, Dell EMC offers its Pivotal Cloud Foundry native hybrid cloud platform, which Gray said focused on enterprises and partners building cloud-native analytics modules. "We are moving this model to Azure Stack," he said.
Gray said the company isn't revealing hardware specs at this time, other than that the entry-level 1-node system doesn't come with the entire infrastructure and tooling planned for deployment, as it's only intended for skilled developers. It will carry a list price of $20,000, while the 4-node system will carry a list price of $300,000.
However, Gray said where Dell EMC believes it will have a differentiated Azure Stack offering is via its backup-as-a-service capability, based on Dell EMC NetWorker and Data Domain storage. "We back up not just the data produced by the applications, we actually protect that underlying infrastructure of the Azure environment," he said. "So, all of the bits that are created at the IaaS and PaaS layer are protected, as well as the underlying infrastructure, making sure we back up the full environment."
Dell EMC's Azure Stack Platform will also offer the company's CloudLink SecureVM encryption offering. This secure VM tool is available in the Azure catalog and enables encryption of virtual machines such that they're portable and remain secure as they move between hosts. "That really ensures that workloads remain secure, wherever they are running in the datacenter, whether it's in public cloud or if it's the on-premises instance of Azure Stack," Gray said.
While Gray emphasized that the 4-node system will be targeted at development as well, he indicated deployment-ready systems will also arrive by launch time.
Posted on 05/04/2017 at 1:27 PM0 comments
Microsoft wants to see Windows PCs, Office 365 and its forthcoming mixed reality wares in every classroom from kindergarten through high school and college. The company has taken the wraps off perhaps its broadest effort yet to accomplish that goal.
At an event in New York today, Microsoft unveiled Windows 10 S, a version of Office 365 optimized for educational environments, and a new Surface Laptop that company officials said will exceed the capabilities of existing mobile PCs, Chromebooks and Apple MacBooks. The company also released a version of its Intune configuration and management service customized for educational environments.
It's not lost on any provider of technology that capturing the student demographic is critical, since that's when students form preferences and allegiances to specific platforms and applications. Likewise, making it easier for students to learn and collaborate with teachers, parents and each other is critical, said Microsoft CEO Satya Nadella, who discussed how he's spent the past two years visiting classrooms all over the world.
"Technology should make teachers' lives simpler and spark students' creativity, not distract from it," Nadella said, in remarks kicking off today's MicrosoftEDU event. "This is a top priority we are focused on at Microsoft. Today we are delivering an accessible streamlined platform, readily available to all classrooms so teachers spend less time focused on technology and more time doing what they love doing: inspiring students."
While speculation about Microsoft's plans to release new hardware has mounted for months, perhaps the biggest surprise was the launch of Windows 10 S, a version of the operating system optimized for classroom environments. It will support the forthcoming View Mixed Reality learning experiences, as well as various new teaching applications and STEM-based lesson plans and apps, such as Lego's WeDo 2.0 tools, along with headsets, interactive whiteboards and accessibility features.
Terry Myerson, executive VP of Microsoft's Windows and devices group, described Windows 10 S -- which can run on everything from partner devices starting at $189 up to Microsoft's high-end Surface Book -- as a streamlined version of the OS that's secure and able to maintain consistent performance over years of usage.
Windows 10 S will also test the appetite for Microsoft's Universal Windows Platform (UWP) in a big way because it will not run classic Win32 software -- only apps available in the Windows Store. This restriction will ensure consistent performance and better security, Myerson explained.
"Everything that runs on Windows 10 S is downloaded from the Windows Store, which means first it's verified for security and performance," Myerson said. "When it's downloaded to the device, it runs in a safe container to ensure that the execution of applications don't impact the overall performance of the rest of the system, allowing the performance of the device to be the same on day one as day 1,000."
Still lacking in the Windows Store is the complete desktop suite of Office applications consisting of Word, Excel, PowerPoint and Outlook, which Myerson said "will be coming soon." Another limitation that might raise some eyebrows, but with the same goal of ensuring consistent performance and security, is the fact that Windows 10 S will only run Microsoft's Edge browser. Also, Windows 10 S won't support domain joins to on-premises Active Directory -- only to Azure AD.
Windows 10 S is slated for release this summer and can be deployed on existing Windows 10 Pro systems free of charge. New PCs sold for educational use will also include free subscriptions to Minecraft: Education Edition. Microsoft is also offering Office 365 for Education, including use of the new Microsoft Teams, free of charge to students and educators. The company also released Intune for Education, which is now available.
Surface Laptop: 'Resets the Category'
The other big news at today's MicrosoftEDU event was the launch of the Surface Laptop, a thin and lightweight device with a 13.5-inch display available in high-end configurations that aims to offer a viable alternative to Apple MacBooks. While this is not the first Surface introduction to make such a claim -- in fact, most have -- it may have made the strongest argument yet, though it's too early to draw any conclusions since the device isn't shipping.
"This is the laptop that resets the category," said Panos Panay, Microsoft's corporate VP for devices. While it runs Windows 10 S, this system will clearly not just appeal to students, though clearly the company wants to grab the attention of those who want to go back to school this fall with a MacBook. Panay emphasized the engineering of the device, which is made of anodized metal, an Alcantara-covered textured backlit keyboard with keys that are 1.5 mm and a .2mm pitch which is has a maximum of 0.57 inches and weighs just 2.76 pounds. Microsoft officials claim the Surface Laptop will get 14.5 hours of battery life, but typically systems never achieve maximum power estimates.
Although I only spent a few minutes with the new Surface Laptop, it felt quite light given its size, and its 3.4 million-pixel display renders high-resolution visuals. The systems are priced in the same general range as Microsoft's current Surface Pro 4 line. An entry-level Surface Laptop costs $999, configured with an Intel 7th Generation Core i5 processor, a 128GB SSD and 4GB of RAM. The company is taking orders for i5-based systems today, which are slated to ship June 15. Surface Laptops based on the Intel Core i7 processor are scheduled to ship in August. A Surface Laptop with an i7 processor, 16GB of RAM and a 512GB SSD is priced at $2,199.
While the Surface Laptop will run the new Windows 10 S operating system, it will also support Windows 10 Pro, and presumably enterprise editions. Microsoft is initially only selling the Surface Laptop through its own retail and online stores. Asked about plans for other retailers, resellers or channel partners to offer the Surface Laptop, a Microsoft spokeswoman said the company has no information to share at this time.
Posted by Jeffrey Schwartz on 05/02/2017 at 11:48 AM0 comments
Microsoft this week showcased customers that are investing in the company's newest cloud-based offerings, such as Internet of Things (IoT) and machine learning services that use predictive analytics.
At an event held in New York called Microsoft's Digital Experience, the company showcased more than a dozen companies that have committed in some form to piloting or implementing a number of new wares aimed at rapidly accelerating existing processes, enabling new revenue opportunities, or both.
Companies that Microsoft said are using such Azure-based technologies include Bank of America, Hershey, Fruit of the Loom, Geico, Maersk and UBS. For example, Hershey is using Azure Machine Learning, Azure IoT and Power BI to better predict vat temperatures based on sensor feeds, reducing waste in the process of manufacturing Twizzlers.
"This is a beautiful correlation plot that we run in Power BI, said George Lenhart III, senior manager of IS innovation and disruptive technologies at Hershey. "We turned on the machine learning, and by adjusting every five minutes with the prediction of whether it was going to be heavy or light, we were able make changes accordingly."
Maersk Transport and Logistics, one of the world's largest logistics providers, plans to use Microsoft's technology to automate its supply chain, with the goal of shaving tens of millions of dollars from its costs by bringing information to management and customers and predicting activities that may be the source of delays. "It's all about speed -- how do we go to market faster, how do we do much more with less and how do we increase our return on investments," said Ibrahim Gokcen, Maersk's chief digital officer.
Gokcen said Maersk started using Azure Machine Learning, the IoT suite and other services about a year ago and has decided to engage in a long-term effort to build out an Azure-based digital marketplace. Maersk is also considering the potential of Azure Stack, running it on the company's more than 1,000 ships, which could be anywhere in the world.
"It will be great because as we build applications in the cloud, we will be able to just drop them on premises and do some streaming analytics on data that we generate on the vessels that can't be transmitted to the cloud when they're in the middle of the ocean. Then when they reach a port they can replicate and synchronize the data."
Patrick Moorhead, principal of Moor Insights & Strategy, said the event is a sign that Microsoft is making more progress with customers than it often gets credit for. "I think they're making a lot more progress with their customers than people realize. And if there's one thing I can fault Microsoft for, it's that I think they need to be telling this story more," Moorhead said. "They have infinitely more customers using Google cloud and AWS."
Microsoft's talk at Digital Experience included results from a Harvard Business Review study that Microsoft had commissioned. Based on a global survey of 783 people with various roles in business operations and management, the study found only a handful, 16 percent, considered themselves fully digital operations, though 61 percent said they have started going down that path. Only 23 percent said their businesses rely on many digital technologies.
Posted by Jeffrey Schwartz on 04/28/2017 at 12:59 PM0 comments
Microsoft's announcement last week that it has combined the release cycles of Windows, Office 365 ProPlus and System Center Configuration Manager (SCCM) should be welcome news to IT managers. Even better is the fact that Microsoft has designated that those releases will come out twice per year -- in March and September.
Given Microsoft's legacy of missing deadlines by many months (and often longer for many of its major products), it's reasonable to be skeptical that the company can keep to the regimen it committed to last week. More recently, though, Microsoft has pretty much kept to its release plans.
It always seemed incongruous that the different products came out at different times, though with new versions coming out every three years or so, it was certainly more understandable, especially given Microsoft's organizational makeup in the past. Now that Microsoft has moved to more frequent release cycles, the company says customers have asked that the servicing models become more consistent and predictable.
As IT plans system upgrades with the knowledge that Windows, Office 365 ProPlus and SCCM update releases are now aligned, the change should bring better clarity and consistency to those for whom that matters. And for IT pros who are more ambivalent about the change, there's little downside.
Microsoft said it will be fielding more questions on the changes next week during a Windows as a service AMA, scheduled for May 4 at 9 a.m. PST.
Posted by Jeffrey Schwartz on 04/26/2017 at 10:04 AM0 comments
Microsoft has unleashed numerous new offerings to build out its extensive suite of Internet of Things (IoT) technology. Looking to extend the reach of IoT to novices, the company is planning to release a new Microsoft IoT Central SaaS-based offering.
Unlike the rest of its IoT services and tools, Microsoft IoT Central is a managed SaaS solution for those who don't have experience building automation and data-gathering capabilities. Microsoft IoT Central aims to significantly accelerate those customers' ability to deploy a variety of such capabilities using Windows 10 IoT Core and to integrate them with existing applications and systems.
Microsoft said in last week's announcement that it will roll out the new service in the next few months. Although the company didn't reveal much about the new SaaS offering, it appears Microsoft IoT Central is aimed at customers and partners who know little about IoT, want to build applications that incorporate sensors and intelligent components, and don't want to use its PaaS-based offering.
The Microsoft IoT Central announcement was one of a number of new offerings introduced last week and earlier this month. The company is also showcasing its Microsoft Azure IoT Suite PaaS offering this week at the annual Hannover Messe conference, and is using the event to unveil its new Connected Factory offering. Microsoft said Connected Factory is designed to simplify connecting on-premises environments to Microsoft Azure -- both devices based on the OPC Foundation's platform-independent Unified Architecture (UA) standard and older, Windows-specific OPC Classic devices -- with the service providing operational management information. IT managers can use the Azure-based Connected Factory to view and configure embedded factory devices.
Partners that offer IoT gateways that are designed to bridge data gathered from IoT-based sensors and endpoint devices to the Microsoft Azure IoT Suite include Hewlett Packard Enterprise, Softing and Unified Automation. Microsoft indicated that its IoT software is integrated into those partner gateways, which limits the configuration work needed for Azure IoT Hub.
Also adding to its IoT portfolio, Microsoft last week launched its new Azure Time Series Insights, which the company described as a managed analytics, storage and visualization service allowing users to analyze billions of events from an IoT solution interactively and on-demand. The service offers a global view of data over different event sources, allowing organizations to validate IoT products.
Microsoft said the tool is designed to uncover trends, find anomalies and perform root-cause analysis in near real time, and its user interface is simple enough that lines of business can build capabilities without requiring dev teams to write any code. Microsoft is also offering APIs that it said will allow customers to integrate the functionality into existing applications. The new Azure Time Series Insights service, available now in preview, is already built into the above-mentioned Microsoft IoT Central and the existing Microsoft Azure IoT Suite.
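Microsoft hasn't detailed the final API, but a time-series query service of this kind is typically driven by a search span plus a filter predicate. The following Python sketch shows that shape; the environment FQDN, token, api-version and request body are assumptions drawn from the preview's general pattern, not a documented contract.

```python
import datetime
import requests

ENV = "https://<environment-id>.env.timeseries.azure.com"  # placeholder environment FQDN
HEADERS = {"Authorization": "Bearer <azure-ad-token>"}      # placeholder AAD token

# Query matching events from the last hour -- a search span plus a predicate.
now = datetime.datetime.utcnow()
body = {
    "searchSpan": {
        "from": (now - datetime.timedelta(hours=1)).isoformat() + "Z",
        "to": now.isoformat() + "Z",
    },
    "predicate": {"predicateString": "temperature > 75"},  # illustrative filter
    "top": {
        "sort": [{"input": {"builtInProperty": "$ts"}, "order": "Desc"}],
        "count": 100,
    },
}
resp = requests.post(f"{ENV}/events?api-version=2016-12-12",
                     headers=HEADERS, json=body).json()
print(len(resp.get("events", [])), "matching events in the last hour")
```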
Microsoft already has an extensive IoT portfolio with its Azure IoT offerings and Windows 10 IoT Core. The company earlier this month made available the Azure IoT Hub device management capabilities announced last fall, as well as the Azure DM Client Library, an open source library that lets developers build device management capabilities into Windows IoT Core devices connected to Azure. Microsoft says the new client library uses the same approach to device management as its enterprise Windows management tools.
The new Windows IoT Azure DM Client Library addresses such functions as device restart, certificate and app management, as well as many other capabilities introduced with the new Azure IoT Hub device management. The DM Client Library is designed to address the resource restrictions of sensors and other devices with embedded components and allows Azure IoT to remotely manage those devices.
Addressing scenarios where connectivity to the cloud is an issue, Microsoft last week also announced a preview of Azure Stream Analytics for edge devices. Azure Stream Analytics on edge devices uses the Azure IoT Gateway SDK, which runs on either Windows or Linux endpoints and supports hardware ranging from small components and single-board computers to full PCs, servers and dedicated field gateway devices, Santosh Balasubramanian, principal program manager for Azure Stream Analytics, explained in a separate blog post. It uses Azure IoT Hub to provide secured bi-directional communications between gateways and Azure, he noted.
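Stream Analytics jobs are written in a SQL-like query language, but the core idea at the edge -- reducing a raw sensor stream to fixed-window aggregates before anything leaves the device -- is easy to illustrate. This Python snippet is a conceptual sketch of that reduction, not the Gateway SDK's actual API:

```python
from collections import defaultdict

WINDOW_SECONDS = 30  # tumbling-window size

def tumbling_average(events):
    """Group (timestamp, value) readings into fixed 30-second windows and
    emit one average per window -- the kind of edge-side reduction a
    Stream Analytics query performs before forwarding results upstream."""
    windows = defaultdict(list)
    for ts, value in events:
        windows[int(ts) // WINDOW_SECONDS].append(value)
    return {w * WINDOW_SECONDS: sum(vals) / len(vals)
            for w, vals in sorted(windows.items())}

readings = [(0, 70.1), (12, 70.5), (29, 71.0), (31, 74.2), (45, 74.8)]
print(tumbling_average(readings))  # {0: 70.53..., 30: 74.5}
```

An equivalent job query would be on the order of SELECT AVG(value) FROM input GROUP BY TumblingWindow(second, 30); only the per-window averages, rather than every raw reading, would then travel to Azure IoT Hub.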
Finally, on the IoT front, Microsoft said it has bolstered security with support for key industry standards and has partnered with several component makers at the silicon layer. Microsoft said Azure IoT now supports the Device Identity Composition Engine (DICE), which allows silicon manufacturers to put unique identification on every device, and Hardware Security Modules (HSMs) to secure device identities. New partners Micron and STMicro will enable the HSM and DICE security technologies for Microsoft, while Spyrus will support HSM as part of Secure Digital (SD) and USB storage devices.
Posted by Jeffrey Schwartz on 04/24/2017 at 2:05 PM0 comments
Most organizations have either deployed their first hybrid cloud or intend to do so within the next 12 months. But many who plan to do so may not realize they already have a hybrid cloud, according to a new report published last week by Microsoft.
The Microsoft State of the Hybrid Cloud 2017 report revealed that 63 percent of mid- and large-size enterprises already have implemented a hybrid cloud. However, Microsoft discovered that of the 37 percent claiming they haven't yet implemented their first hybrid cloud (but intend to within the next 12 months), 48 percent already unknowingly have deployed one.
Microsoft came to that conclusion, which the report called a surprising figure, based on the infrastructure specified by the IT pros surveyed in December and January. "Odds are you've got a hybrid cloud strategy and have already started implementing this approach," said Julia White, corporate VP for the Microsoft cloud platform, in a blog post prefacing the report. "We know this because nine in ten IT workers report that hybrid cloud will be the approach for their organizations five years from now."
Overall, the 1,175 IT pros who responded to Microsoft's survey had similar definitions of hybrid cloud -- defined as some type of infrastructure that integrates an on-premises environment and a public cloud. Yet Microsoft was surprised by the disparity between that definition and reality. "It seems that while the conceptual definition of hybrid cloud is not challenging for people, identifying hybrid cloud in practice is more difficult," the report stated.
That raises the question: what are the deployed solutions that customers didn't recognize as a hybrid cloud? A spokeswoman for Microsoft referred to the report's appendix, where it spelled out the use-case definitions. They're broken down into four categories:
- Hybrid Applications: Any application that shares common APIs and end-user experiences with an on-premises implementation and IaaS or PaaS services. Typically, an organization can extend an application by combining on-premises data with on-demand public cloud resources when workload requirements spike.
- Data: It's become increasingly common to use some form of public cloud to back up and/or archive data. Likewise, many mobile and Web-based front-end analytics tools may run in a public cloud, even if they're querying on-premises data.
- User: Organizations running Office 365 or using cloud-based file-sharing services such as OneDrive, Box or Dropbox are examples of those that likely have a hybrid cloud. Cloud-based directories federated with on-premises user account information, most notably Active Directory, have become a popular way of providing single sign-on to enterprise resources, including SaaS-based applications and tools. Many organizations are using either Azure Active Directory or one of several third-party identity-management-as-a-service (IDMaaS) offerings; a minimal token-acquisition sketch follows this list.
- Infrastructure: The ability to extend the compute capacity of an on-premises datacenter by connecting to a public cloud-based IaaS or PaaS via a dedicated gateway, network or VPN makes up a basic hybrid cloud. It's what enables the above-mentioned hybrid cloud applications. But anyone who said they are using a monitoring tool that offers visibility and/or control over both systems on-premises and in a public cloud also has a hybrid cloud, according to Microsoft. Likewise, many organizations that require near real-time recovery of on-premises data are using new disaster recovery-as-a-service offerings, which are hybrid cloud-based. These DRaaS offerings are public cloud-based services capable of providing nearly synchronous replication between the two environments. An organization that has deployed advanced security analytics tools that provide unified views and protection against threats to any endpoint, whether on-premises or a cloud-based resource, also has a hybrid cloud infrastructure.
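To make the "User" category above concrete, here's a minimal sketch of the kind of cloud-directory token acquisition that underpins such single sign-on, using the ADAL .NET library; the tenant, client ID, secret and resource are placeholders, and a real deployment would use certificate credentials or interactive user flows as appropriate.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL .NET (NuGet)

class CloudDirectoryToken
{
    static async Task Main()
    {
        // Placeholders: an Azure AD tenant and an app registered within it.
        var authority = "https://login.microsoftonline.com/contoso.onmicrosoft.com";
        var resource = "https://graph.microsoft.com"; // API the token will be used against
        var credential = new ClientCredential("<client-id>", "<client-secret>");

        // One cloud directory issues tokens for SaaS and federated on-premises apps alike.
        var context = new AuthenticationContext(authority);
        AuthenticationResult result = await context.AcquireTokenAsync(resource, credential);

        Console.WriteLine("Token type: " + result.AccessTokenType);
        Console.WriteLine("Expires on: " + result.ExpiresOn);
    }
}
```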
Is it a big deal that some might not realize they already have a hybrid cloud? Not really. IT pros are focused on building and running infrastructure and understanding how everything works. But the report underscores Microsoft's emphasis on hybrid clouds, which is evident in almost everything that it, and most other companies, are developing these days. From that standpoint, the report provides a baseline for Microsoft's view of hybrid clouds, with some notable data points.
For example, it shows that the most popular use cases for hybrid clouds were (in order): productivity/collaboration, high-performance networking, SSO, DRaaS, cloud front ends to on-premises data, archiving, analytics, unified monitoring, dev/test in the cloud with production on-premises, advanced security and global applications.
Not surprisingly, there are notable variations depending on the industry and country of the respondents. The 1,175 IT pros who responded were from the U.S., U.K., India or Germany. Those in regulated industries such as banking and healthcare were among the top adopters, and retailers ranked high as well. Adoption in Germany was notably low due to data sovereignty laws, while India showed high adoption, likely because of latency issues there and the need to keep data accessible.
Deployments also varied based on the age of a company: older companies with on-premises infrastructure are more likely to have hybrid clouds than younger ones. More than half (52 percent) of companies in business for less than 10 years have hybrid clouds, while 91 percent of companies 25 to 49 years old have one. Curiously, the percentage drops slightly to 84 percent among companies 50 years old or older.
Let's not forget, as Microsoft promotes hybrid clouds, its eye is toward the future. And that future is Azure, which White underscored last week when she pointed to Microsoft's claim that organizations can save up to 40 percent on Windows Server VMs run in Azure by using existing licenses with Software Assurance. The company outlined how with its new Azure Hybrid Use Benefit tool and a free migration assessment tool.
Posted by Jeffrey Schwartz on 04/20/2017 at 12:46 PM
Now that everyone has filed their tax returns, many may wonder where that money goes. A new effort spearheaded by former Microsoft CEO Steve Ballmer can help. Ballmer has recently started talking about the new Web site, USAFacts.org, a massive database of federal, state and local government information presented much like the 10-K filings publicly traded companies must release.
Ballmer, who is funding the site as a nonprofit, nonpartisan project, has assembled a team of people, many of whom helped create Microsoft's financial reports, and started talking up the effort in February during a TEDxPennsylvaniaAvenue talk. When he went looking for information about how government money is spent, he found no single source where people could easily find everything from how Medicaid is used to how many people work for the government.
To get a sense of how much information is available, check out this 2017 report, which provides performance information for a wide spectrum of expenditures, from the Medicare Trust Financials to spending on national defense, veterans' affairs and child safety. And that's just at the federal level. Ballmer noted there are more than 90,000 organizations overall in federal, state and local governments, as well as school, fire and water districts.
Recalling how, as CEO of Microsoft, he needed to know how every business unit and product was performing and why, Ballmer set out to create the same benchmark for how the government is performing. "I'm a numbers guy," Ballmer said during his February presentation. "For me, numbers are a tool to tell a story and to bring things together in ways that are more precise [and] that has more context. There has to be more clarity than you'll find in any other way. Numbers add to the discussion. They take a collage of mess and turn it into something that helps you in the playing field of a complex platform."
According to the site, USAFacts provides a data-driven portrait of the American population, government, its finances and the impact it has on society. It bills itself as a comprehensive, integrated database that brings together information from federal, state and local government sources, while providing logically organized and contextual information. It draws only on the most recently reported government data.
"What we want people to be able to do is to go to work and say what do my tax dollars go for, what is the money going to get used for and what kind of outcome," he said. Ballmer emphasized that the information doesn't give forecasts and is not biased. "We don't do any forecasting, no prediction that's not factual. We are just reporting on the history," he said. "We'll let people do their own forecasting and prediction. We also don't try to take a position on any issue."
Not surprisingly, the site runs in Microsoft Azure, Ballmer said in an interview published by Geekwire yesterday. The application is .NET-based, using REST APIs to access the system. The front-end interfaces are JavaScript-based. Plans call for enabling users to create their own visualizations with Power BI.
Posted by Jeffrey Schwartz on 04/19/2017 at 1:48 PM
The results of what is believed to be the first independent survey to analyze the makeup of hybrid SharePoint environments are in and the findings reveal that nearly a third of organizations have SharePoint hybrid users, while nearly half (46 percent) still have deployments that are entirely on-premises and 22 percent use the SharePoint Online service offered via Office 365.
More than half have cloud-based implementations of SharePoint, yet claim they don't have plans to transition entirely to the online version, according to the survey, which was fielded last month by graduate students at the Marriott School of Management at Brigham Young University and spearheaded by Christian Buckley's CollabTalk. The survey was sponsored by Microsoft, a number of ISVs and several media partners, including Redmond magazine (and is available for download here).
The survey's aim was to determine the makeup of hybrid SharePoint deployments and the extent to which organizations plan to maintain those environments as they move more applications and infrastructure to the cloud and as Microsoft continues to emphasize Office 365 as the future of its collaboration strategy. More than half of those with on-premises deployments plan to have hybrid solutions by 2020, according to the survey, which validates what proponents have long emphasized: that most organizations with a long history of using SharePoint either already have hybrid deployments or plan to move part of its functionality to the cloud while maintaining hybrid implementations.
"While Microsoft's messaging continues to focus on cloud-first, mobile-first when it comes to product innovation, the company has realized the need to bring hybrid features and messaging more to the forefront in recent years," Buckley said in the introduction to a report based on the findings of the results released today. "Office 365 may be the future of collaboration, but Microsoft has softened their tone in regard to on-prem customers needing to move to the cloud -- not only reassuring them that on-prem will be supported as long as customers are using it, but acknowledging hybrid as a valid strategy for some organizations."
While 32 percent of organizations of all sizes claimed that they have hybrid SharePoint deployments, nearly half, 49 percent, have hybrid licenses, while 35 percent said they have on-premises licenses and 17 percent used SharePoint Online.
The overall makeup of SharePoint licenses, according to the survey, shows that 63 percent were on-premises and 37 percent were online. Given the overlap and the fact that it's not unusual for Microsoft customers to have more licenses than actual usage, the report based on the survey stated: "We know that the number of SharePoint users are fewer than the number of licenses, and consequently, the percentage of licenses coming from companies using hybrid solutions will be greater than the number of users they have."
Among the 626 respondents representing 510 different organizations, the findings not surprisingly showed that mid- and large-size enterprises are more partial to hybrid and on-premises implementations of SharePoint and less likely to move everything online. Small businesses are more likely to already have or plan to use SharePoint entirely online. More than half of those with on-premises SharePoint implementations today will have hybrid deployments by 2020.
Small businesses with 51 to 200 employees accounted for 16.7 percent of respondents, while 21.4 percent had 1,001 to 5,000 employees and 17.5 percent had more than 10,000. Respondents came from all over the world, with the U.S. making up 35 percent of the sample.
The findings underscored the overlap between licenses and users. To that point, the survey noted: "In many cases, we assume that a single user possesses a single license, which, in many cases may be true, but not in a hybrid environment, where a single user may have two licenses: SharePoint Online and on-premises SharePoint. Recognizing this might not always be true, with some on-premises users not having SharePoint Online licenses and some online users without access to the on-prem environment, because the scope of our analysis focused specifically on SharePoint, we found that respondents overwhelmingly owned or planned to acquire Office 365 e-licenses where SharePoint is included. In other words, SharePoint on-prem users were generally given appropriate online licenses."
Organizations least likely to move from SharePoint on-premises to Office 365 are those with workloads that require SQL Server Reporting Services Integrated Mode, PowerPivot for SharePoint or PerformancePoint Services, John White, SharePoint MVP and CTO of UnlimitedViz, noted in the report.
"Companies that have made significant investments here cannot move these assets, making a complete move to the cloud impossible. Hybrid is the only cloud option," White stated. "Combine this with the prevalence of third-party solutions (Nintex, K2, etc.) and custom solutions, and it is easy to see why some on-premises presence will be with us for quite some time."
Such issues, and a vocal SharePoint community, are among the reasons Microsoft has shifted its emphasis toward hybrid deployments. Speaking to that issue in the report, Jared Shockley, senior service engineer at Microsoft, said: "Migration of customizations used in on-premises installations is the biggest blocker to cloud migrations of SharePoint. Many companies do not want to rethink or redevelop these solutions as that is an additional expense to the migration. There are tools and frameworks, like Cloud App Model, to accomplish this work. But they are not the same as on-prem tools and frameworks. This training for the development teams can be one of the main blockers for migrations to SharePoint Online."
While the report notes Microsoft is closing this gap, "the challenge of reducing functionality to end users is not trivial," said Ed Senez, president of UnlimitedViz, an ISV that provides an analytics solution for SharePoint, Office 365 and Yammer. "A simple example is that SQL Server Reporting Services (SSRS) does not currently have a cloud solution. It would be a hard sell to tell employees that you can no longer get the reports that you need to do your job. Again, this gap is closing, but it remains an issue."
Posted by Jeffrey Schwartz on 04/17/2017 at 3:43 PM
McAfee is once again a freestanding provider of security software, following last week's completion of its divestiture from Intel, which was announced last fall. Private equity firm TPG acquired a majority 51 percent stake in the McAfee spinoff for $3.1 billion, though Intel keeps a strong vested interest with its remaining 49 percent ownership. Now free from Intel's control, the new McAfee is no longer beholden to the interests of the chip provider, giving it a freer hand to compete with the likes of IBM, Symantec, Sophos and Trend Micro, among others.
Chris Young, who ran Intel Security, is now McAfee's CEO. TPG has suggested further acquisitions are likely, saying in its strategy statement that it intends to "build and create one of the largest, independent, pure-play cybersecurity companies in the industry." As many have noted, Intel's $7.7 billion acquisition of McAfee back in 2011 didn't live up to its promise. Now McAfee hopes to gain ground in a much different IT security landscape.
Nevertheless, McAfee has a formidable and wide range of cybersecurity offerings, including its flagship endpoint security software, intrusion detection and prevention tools, its Enterprise Security Manager SIEM offering, and e-mail security, Web security and vulnerability scanning tools. While it exited the next-generation firewall (NGFW) business, ePolicy Orchestrator had become an "anchor" platform for Intel Security, and now McAfee, according to ESG Senior Principal Analyst Jon Oltsik in a Network World blog post. Oltsik, who has followed McAfee for decades, going back to when it was known as Network Associates, said McAfee's challenge is to regain its leadership in endpoint security, become less product-focused, emphasize the C-suite and focus on cloud security, an area the company hasn't adequately addressed.
One area McAfee has invested in heavily is threat intelligence, with ePolicy Orchestrator tied to its Threat Intelligence Exchange (TIE), whose wide gamut of partners supports its Data Exchange Layer (DXL), which the company recently made available as open source in the hope of extending adoption.
In the first McAfee Labs Threat Report following the spinoff, the company identified five critical challenges to handling threat intelligence: volume, validation, quality, speed and correlation. The 49-page report is available for download, though here's an edited synopsis of the five threats McAfee Labs believes the industry must address:
- Volume: The Internet of Things has led to the deployment of millions of security sensors, creating high volumes of data fed into threat intelligence tools, which include streaming analytics and machine-learning software that process and analyze the data. While these tools have improved internal threat detection, they have created a massive, as-yet-unsolved signal-to-noise problem. Vendors are tackling this in various ways, such as building access monitors that scan sensitive data, and sophisticated sandboxes and traps that can resolve contextual clues about a potential attack or suspicious event.
- Validation: Given the ability for threat actors to issue false threat reports designed to mislead or overwhelm threat intelligence systems, it's essential to validate the sources of shared threat intelligence.
- Quality: Vendors need to rearchitect security sensors to capture and communicate richer trace data to help decision-support systems identify key structural elements of a persistent attack. Filters, tags and deduplication are critical; a minimal deduplication sketch follows this list. McAfee is among six founding members of the new Cyber Threat Alliance (CTA), launched in February during the RSA Conference, which is looking to address the quality issue. Joined by Check Point, Cisco, Fortinet, Palo Alto Networks and Symantec, the CTA will automatically score the quality of threat intelligence data, but its members can only get useful information out if they supply quality input.
- Speed: The latency between a threat detection and the reception of critical intelligence remains an issue. Open and standardized communication protocols, optimized for sharing threat intelligence, are essential for successful threat intelligence operations. Advanced persistent threats and sophisticated, targeted campaigns often hit multiple organizations in specific vertical industries, meaning communications through an intermediary or exchange must occur within hours of the first indication of an attack.
- Correlation: As threat intelligence is received, correlating the information -- while looking for patterns and key data points relevant to the organization -- is critical. Vendors must find improved ways to share threat intelligence among different products and improve methods to automatically identify relationships between the intelligence collected and ultimately to employ machine assistance to simplify triage.
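The report doesn't prescribe implementations, but the deduplication and correlation problems are easy to picture in code. Here's a minimal, hedged sketch (our own illustration, not McAfee's; the record fields are hypothetical) that deduplicates incoming indicators by hash and tallies how many distinct sources corroborate each one.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical indicator-of-compromise record; the field names are our own.
record Indicator(string Sha256, string Source, DateTime SeenAt);

class ThreatFeedAggregator
{
    // Maps an indicator hash to the distinct feeds that have reported it.
    readonly Dictionary<string, HashSet<string>> corroboration = new();

    public void Ingest(Indicator ioc)
    {
        if (!corroboration.TryGetValue(ioc.Sha256, out var sources))
            corroboration[ioc.Sha256] = sources = new HashSet<string>();

        // HashSet membership deduplicates repeat reports from the same feed;
        // a second distinct source is a crude correlation signal worth triaging.
        if (sources.Add(ioc.Source) && sources.Count >= 2)
            Console.WriteLine($"{ioc.Sha256} corroborated by {sources.Count} sources");
    }
}
```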
While the report points to an industry call to action, it also gives a synopsis of McAfee's priorities regarding threat intelligence, an emphasis kicked off back in 2014 with the launch of its DXL threat exchange. Oltsik noted the DXL platform is effectively security middleware. The TIE includes products from dozens of exchange members who offer network management, application and database security, incident response, forensics, endpoint and mobile device management platforms, authentication, encryption, data loss prevention and cloud security.
Posted by Jeffrey Schwartz on 04/14/2017 at 11:42 AM
PC shipments have increased for the first time in five years, according to the latest quarterly report issued by market researcher IDC. The first-quarter PC Device Tracker report, which also showed HP regaining the top position from Lenovo, found that the 60.3 million units shipped worldwide during the period represented year-over-year growth of 0.6 percent.
While the increase may appear negligible, it was a surprise versus a projected 1.8 percent decline, according to IDC, which released the quarterly report yesterday (on the same day Microsoft released its new Windows 10 "creators update"). The growth spurt is particularly surprising given that the beginning of the year is historically slow for PC sales.
Jay Chou, research manager of IDC's PC Tracker reports, noted in a statement that competition from tablets and smartphones, along with longer PC lifecycles, has pushed PC shipments down 30 percent since their peak at the end of 2011, though he disputed the notion that those devices are displacing PCs outright.
"Users have generally delayed PC replacements rather than giving up PCs for other devices," Chou stated. "The commercial market is beginning a replacement cycle that should drive growth throughout the forecast. Consumer demand will remain under pressure, although growth in segments like PC gaming as well as rising saturation of tablets and smartphones will move the consumer market toward stabilization as well."
Despite the first uptick in global PC shipments since 2012, the U.S. market wasn't a contributor to that growth. According to IDC, the overall PC market in the U.S. declined slightly with shipments of 13.3 million units. The report noted strong demand for Chromebooks as well.
HP Back on Top
It appears HP was a key driver of last quarter's surprise surge. Its shipments for the quarter rose 13.1 percent, making it the market-leading supplier of PCs for the first time since 2013, when it ceded the top spot to Lenovo, which now drops to the No. 2 position. HP shipped 13.1 million units in the first quarter, compared with 11.6 million during the same period last year, giving it 21.8 percent of the market, up from a 19.4 percent share a year earlier. Lenovo shipped 12.3 million units, giving it a 20.4 percent share of the market.
Dell, the No. 3 supplier, saw 6.2 percent growth with 9.6 million units shipped. Apple came in fourth with 4.2 million shipments, up 4.1 percent, and Acer's 4.1 million systems shipped represented a 2.9 percent increase. Outside the top five players, the rest, lumped together as "other," saw an 11.4 percent decline.
Despite IDC's optimistic report, rival Gartner's quarterly findings, also released yesterday, contradicted those findings, reporting a 2.4 percent overall decline. According to its report, it was the first time global PC shipments fell below the 63 million-unit threshold. Likewise, while HP took the top spot in U.S. shipments, Lenovo retained its leadership globally, though its growth was considerably lower.
It's not the first time the two firms have issued contradictory reports; they use different tracking methods and metrics. The two did agree that the top three players -- Dell, HP and Lenovo -- will battle it out among enterprises. "The market has extremely limited opportunities for vendors below the top three, with the exception of Apple, which has a solid customer base in specific verticals," said Gartner Principal Analyst Mikako Kitagawa in a statement.
Depending on whose numbers you buy into, the PC business in 2017 has either gotten off to a surprisingly good start or has yet to hit rock bottom. In either case, despite this week's release of the Windows 10 creators update, it may be premature to say we've entered the post-post-PC era.
Posted by Jeffrey Schwartz on 04/12/2017 at 2:29 PM
As Microsoft gets set to roll out the next major release of Windows 10, the company is also priming the pump for developers to take advantage of the latest new features coming to the OS -- 3D, mixed reality, improved natural language interaction with Cortana, enhanced inking capabilities and support for the new Surface Dial -- with this week's release of a new SDK.
Microsoft is set to start rolling out the latest Windows 10 upgrade, called the "creators update," this coming Tuesday. It's the second major upgrade to Windows 10 since Microsoft launched the OS nearly two years ago. The first came last summer with the release of the Windows 10 Anniversary Update, which offered better stability and improved security. There are noteworthy additional security improvements in the creators update as well, but Microsoft has a big stake in the new usability features designed to make the OS more attractive to end users.
The new creators update SDK, which was made available for download on Wednesday, includes a broad set of tooling for developers. In addition to the SDK, the download includes Visual Studio 2017 UWP Tooling. The Windows Store is also accepting applications built around the Windows 10 creators update. "We expect users to once again move rapidly to the latest and best version of Windows," said Kevin Gallo, Microsoft's corporate VP for Windows developers, in a post announcing the release of the creators update SDK. "For developers, this is the time to get ready for the next wave."
The SDK lets developers create a new Universal Windows Platform (UWP) app or build one from existing app code on Windows. Microsoft posted a list of new and improved features for developers, as well as a list of new namespaces added to the Windows SDK.
A laundry list of new capabilities in the new SDK outlined by Microsoft includes:
- Desktop to UWP Bridge, to help convert existing applications to UWP apps and integrate them with other apps
- Ink improvements that offer expanded input options and ink analysis, which analyzes ink stroke input for Windows Ink apps, covering shape detection and recognition, handwriting recognition, and layout interpretation and classification (see the sketch after this list). It also includes a new Ink toolbar, support for input injection to programmatically generate and automate input from a variety of devices, and the ability for developers to specify inking apps via the new Ink Workspace.
- Windows IoT Core updates with support for Cortana, a spruced-up IoT Dashboard, Azure Device Management and Device Guard for IoT, among other new updates.
- UWP App Streaming Install, which lets users launch an app before it's fully installed for faster access.
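The ink analysis called out in the list above surfaces in the SDK's Windows.UI.Input.Inking.Analysis namespace. Here's a minimal, hedged C# sketch of a method in a XAML page's code-behind that recognizes handwriting from an InkCanvas named inkCanvas (the page and control names are our placeholders):

```csharp
using System.Threading.Tasks;
using Windows.UI.Input.Inking;          // InkCanvas stroke container
using Windows.UI.Input.Inking.Analysis; // InkAnalyzer, new in the creators update SDK
using Windows.UI.Xaml.Controls;

public sealed partial class InkPage : Page
{
    // Recognize handwriting from the strokes drawn so far.
    async Task RecognizeHandwritingAsync(InkCanvas inkCanvas)
    {
        var analyzer = new InkAnalyzer();
        analyzer.AddDataForStrokes(inkCanvas.InkPresenter.StrokeContainer.GetStrokes());

        InkAnalysisResult result = await analyzer.AnalyzeAsync();
        if (result.Status == InkAnalysisStatus.Updated)
        {
            // InkWord nodes carry recognized text; other node kinds cover shapes and layout.
            foreach (IInkAnalysisNode node in analyzer.AnalysisRoot.FindNodes(InkAnalysisNodeKind.InkWord))
                System.Diagnostics.Debug.WriteLine(((InkAnalysisInkWord)node).RecognizedText);
        }
    }
}
```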
Gallo pointed to other features in the SDK including APIs for the Surface Dial, "significant Bluetooth improvements with Bluetooth LE GATT Server, peripheral mode for easier discovery of Windows Devices and support for loosely coupled Bluetooth devices (those low energy devices that do not have to be explicitly paired)" and he pointed to the recently released Android SDK for Project Rome.
While Microsoft has emphasized that the creators update will allow users to generate new 3D images, the support for mixed reality headsets priced in the $300 range will test user interest in holograms. Microsoft also recently rolled out its Mixed Reality toolkit. Little is known about the release dates and other specifics around the various headsets in the pipeline, though Mashable's Lance Ulanoff yesterday was the first to publish a review of the forthcoming Acer Mixed Reality Headset.
Posted by Jeffrey Schwartz on 04/07/2017 at 12:54 PM
A Microsoft Store may not be the first place one would look for an Android phone, but the company's retail locations are taking orders for the new Samsung Galaxy S8, launched last week.
Even though Microsoft stores already sell other competing devices, such as Facebook's Oculus virtual reality headsets, and the company itself now supports Apple's iOS and Android in numerous ways, the idea of it selling an Android phone -- and one from the largest supplier -- is somewhat striking.
Certainly, it's not remarkable considering Microsoft's support of other platforms and that Samsung has offered Microsoft apps on its previous phones, the Galaxy S6 and S7 (the Galaxy Note 7, meanwhile, was removed from the market last year after some units started catching fire). Indeed, 11 device manufacturers agreed to preinstall Office on their Android devices more than two years ago.
In addition to Office, the Samsung Galaxy S8 Microsoft Edition can be customized with OneDrive, Cortana and Outlook, among other Microsoft offerings. Presuming Samsung has resolved the battery issues that forced it to pull the Note 7 off the market, the S8 is expected to be a hit, considering Android is the most widely used mobile platform and Samsung makes the most popular Android-based phones.
The Galaxy S8, which will offer improved video recording and rendering and introduce new biometric authentication options, is the first to use Qualcomm's new Snapdragon 835, which ultimately will introduce support for new high-speed Gigabit LTE-class connectivity that carriers are expected to roll out later this year (here's a listing of specs).
Microsoft is only selling the new phones in its stores, not online. They're due to arrive April 21.
Posted by Jeffrey Schwartz on 04/03/2017 at 12:19 PM
Pam Edstrom, who many say played a key role in shaping the image of Microsoft and its cofounder Bill Gates, passed away last week at the age of 71 following a four-month battle with cancer.
Microsoft hired Edstrom in 1982 as its first director of public relations, where she crafted the company's communications strategy, which many believe helped bring visibility to what was then an obscure startup. Two years later, she joined Melissa Waggener Zorkin, who at the time had a boutique PR agency. The two later formed Waggener Edstrom, now known as WE Communications.
Initially, Edstrom balked at Waggener Zorkin's overtures to join her, until Waggener Zorkin convinced Gates and then-Microsoft President Jon Shirley of what the two could collectively do for Microsoft. In those early years, Edstrom cultivated relationships with influential business and technology reporters, helping spread Gates' and Microsoft's mission and values and the oft-described goal of bringing PCs to every user. "We spent time very directly designing our roles, and spent endless hours simply working side by side," Waggener Zorkin said in a blog post published on the agency's Web site as a tribute to Edstrom.
Many veterans of the agency have landed key communications roles at companies such as Expedia, Lenovo, Starbucks and T-Mobile, according to Geekwire, which was the first to report on Edstrom's death. Microsoft's current VP of communications, Frank X. Shaw, is a former president of the agency, the report noted.
In response to her death, Gates told The New York Times that Edstrom "defined new ways of doing PR that made a huge mark on Microsoft and the entire industry."
Posted by Jeffrey Schwartz on 04/03/2017 at 12:18 PM
Looking to protect sites running in its public cloud from malicious attacks, Microsoft this week released its new Web Application Firewall (WAF) option for its Azure Application Gateway and HTTP load-balancing service.
Microsoft said its new centralized WAF service, announced last fall at Microsoft's Ignite conference, will protect Web apps running with the Azure Application Gateway from common exploits such as SQL injections and cross-site scripting attacks.
Preventing Layer-7 app-level attacks is difficult, requiring laborious maintenance, patching and monitoring throughout the application tiers, according to Yousef Khalidi, Microsoft corporate VP for Azure Networking. "A centralized Web application firewall (WAF) protects against Web attacks and simplifies security management without requiring any application changes," Khalidi said in a blog post this week announcing the release of the Azure WAF service. "Application and compliance administrators get better assurance against threats and intrusions."
Microsoft's Azure Application Gateway is the company's Application Delivery Controller (ADC) Layer-7 network service, which includes SSL termination, load distribution and URL path-based routing and can host multiple sites, according to Khalidi. The new ADC service in Azure also offers SSL policy control and end-to-end SSL encryption and logging.
"Web Application Firewall integrated with Application Gateway's core offerings further strengthens the security portfolio and posture of applications protecting them from many of the most common Web vulnerabilities, as identified by Open Web Application Security Project's (OWASP) top 10 vulnerabilities," Khalidi noted. The WAF comes with OWASP ModSecurity Core Rule Set (3.0 or 2.2.9), designed to protect against these common threats, he added.
Besides SQL injection and cross-site scripting, Khalidi noted the WAF offering protects against command injection, HTTP request smuggling, HTTP response splitting and remote file inclusion attacks. It also addresses HTTP protocol violations, bots, crawlers, scanners and common misconfiguration of application infrastructures, notably in IIS and Apache.
As one would expect from a WAF, Microsoft's new service is designed to fend off denial-of-service attacks occurring simultaneously against multiple Web apps. Azure Application Gateway can currently host up to 20 sites behind each gateway, all of which can defend against such attacks. The service is offered with the medium and large Azure Application Gateway types, which cost $94 and $333 per month, respectively.
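As a quick way to see the protection in action, the sketch below (our own smoke test, assuming a WAF configured in prevention mode and a placeholder URL pointing at a gateway-fronted site) sends a classic cross-site scripting probe and prints the response; a 403 Forbidden is the typical result when an OWASP rule matches.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class WafSmokeTest
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // Placeholder site fronted by the Application Gateway WAF.
        var probe = "https://www.example.com/search?q=<script>alert(1)</script>";

        // In prevention mode the WAF should refuse the request outright;
        // 403 Forbidden is the typical response when an OWASP rule matches.
        HttpResponseMessage response = await http.GetAsync(probe);
        Console.WriteLine($"Status: {(int)response.StatusCode} {response.StatusCode}");
    }
}
```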
Microsoft said it intends to surface the new WAF service through Azure Security Center, which scans cloud-based subscriptions for vulnerabilities and recommends ways to remediate issues that are discovered. That service currently doesn't include protection of Web apps that aren't scanned by a WAF, though it does offer third-party firewalls from Barracuda Networks Inc., Check Point Software Technologies Inc., Cisco, CloudFlare, F5, Fortinet Inc., Imperva Inc. and Trend Micro, among others.
Posted by Jeffrey Schwartz on 03/31/2017 at 11:48 AM
Among the slew of improvements to Microsoft's Skype for Business and Cloud PBX offering announced at this week's annual Enterprise Connect conference in Orlando, Fla., one that stood out was Polycom's new RealConnect, which will allow Office 365 users with Skype for Business to add Cisco devices to meetings.
It's noteworthy because it's an important step by Microsoft and Polycom that extends the reach of Skype for Business Online, permitting users to connect using other vendors' conferencing equipment. Given that Cisco is the leading provider of VoIP phones and PBXs, with a formidable videoconferencing systems business, the integration promises to widen the reach of Skype for Business.
"We want to make sure you have a great video experience regardless of what platform your users are on, and regardless of what platform you develop on, that extends across a number of different platforms," said Ron Markezich, Office 365 corporate VP, during his keynote address today at the conference.
Delanda Coleman, a Microsoft product marketing manager, joined Markezich on stage to demonstrate the interoperability capability that Polycom's RealConnect will offer when it's released next month. "Now any legacy Cisco VTC [videoconferencing system] can join the Skype for Business meeting without any problems," Coleman said.
Polycom is the leading provider of Skype for Business handsets and videoconferencing systems. While connecting to Cisco devices is an important step, it also suggests Polycom will look to connect with other devices, software and services. "Polycom RealConnect for Office 365 simplifies the video world by connecting Skype for Business online users with those using other video systems," Mary McDowell, Polycom's CEO, stated in Markezich's blog post announcing the Cloud PBX and SfB upgrades. "This cloud service protects customers' investments in existing video systems as it allows these users to join a Skype for Business meeting with a single click."
It's reasonable to presume Microsoft will certify bridging solutions from other partners, which could lead to further usage of Skype for Business over the long haul, even if it means organizations will hold onto those existing systems longer.
Posted by Jeffrey Schwartz on 03/29/2017 at 1:39 PM
Microsoft's Enterprise Mobility + Security (EMS) service has come a long way over the past year, with added integration and new capabilities, as organizations grapple with what role it will play. If Microsoft has its way, EMS, bundled with Office 365 and Windows 10, will ensure customers won't choose third-party tools to secure access to data, apps and cloud services or for authentication and policy management. But despite Microsoft's declaration that EMS is the most "seamless" enterprise information protection offering, the company is also taking a pragmatic view with the recent release of the Intune APIs and partnerships with providers of rival solutions, including VMware, SailPoint and Ping Identity.
Nevertheless, Corporate VP Brad Anderson has long argued the case for EMS, claiming it's the most widely used enterprise mobility, application and device management offering. Anderson released a 35-minute video last week called "Everything You Want to, Need to, and/or Should Know About EMS in 2017," in which he made the case for EMS and gave demonstrations showcasing the new EMS portal and features such as conditional access, ties to the new Microsoft Security Graph, integration with Azure Information Protection and the recently released Windows Information Protection in Windows 10, and the release of the mobile application management (MAM) SDKs, which allow for the embedding of EMS controls into apps.
The slickly produced video came on the heels of a post by Anderson two weeks earlier that highlighted the coming together of PC and device management using mobile device management (MDM) approaches. It is indeed a trend that is gaining notice. It was a key topic of discussion during last December's TechMentor track at the annual Live! 360 conference in Orlando, Fla., produced by Redmond publisher 1105 Media. The application of MDM to PC and device management is also the focus of this month's Redmond magazine cover story, "Breaking with Tradition: Microsoft's New MDM Approach."
Anderson earlier this month pointed to the results of analyst firm CCS Insight's recent survey of 400 IT decision makers responsible for mobility, which found that 83 percent plan to converge PC and mobility operations into a single team within the next three years and 44 percent will do so this year. Worth noting is that 86 percent reported plans to upgrade their PCs to Windows 10 within three to four years, and nearly half (47 percent) planned to do so this year.
Microsoft has reported that more than 41,000 different customers use EMS. Anderson last week argued that's more than double the size of VMware's AirWatch installed base and triple that of MobileIron. Anderson also is a strong proponent that Azure Active Directory (AAD), the identity management solution offered for both EMS and Office 365, obviates the need for third-party identity management-as-a-service (IDMaaS) offerings.
"There are more than 85 million monthly access of office 365, just shy of 84 million of them use the Microsoft solution to manage and synchronize all of their identities into the cloud." Anderson said in reference to CCS Insight's annual survey. "What that means, just a little over 1 percent of all of the monthly active users of Office 365 use competing identity protection solutions. EMS is the solution that you need to empower your users to be productive how, where and when they want, and give them that rich, engaging experience."
Asked if he agrees with Anderson's strong assertion, CCS Insight analyst Nick McQuire responded that Intune and EMS have had quite a large impact on the market over the past year, fueled by interest in Windows 10 and Office 365's growth. "Perhaps the biggest impact is the pause that it has generated with existing AirWatch, Blackberry and MobileIron customers," McQuire said. "The EMM market is slowing down and penetration rates of EMM into their customer bases is low and this is a challenge they need to address. Microsoft has contributed to this slowdown in the past 12 months, without question."
That said, McQuire isn't saying it's game over for the other providers. "At the moment, there is a real mix," he said. "Some customers are making the switch to Microsoft. Others may not have made the switch but are absolutely kicking the tires on the product and waiting to see if Intune and EMS becomes the real deal, given that it arrived late to the market and is playing catch up."
McQuire also noted that switching EMM products is not straightforward and churn rates in the industry, although unreported, are very low. "This is evidenced in the renewal rates across all the long-standing EMM players which are high (average 80 to 90 percent range) indicating that when EMM is deployed, it sticks and it becomes very hard to ask customers to rip and replace," he said.
The release of the Microsoft Graph and Intune APIs for Office 365 will help customers who don't want to move to EMS, he noted. Because EMS is offered with Microsoft Enterprise Agreements, using it alongside other tools will become more practical, making more customers open to running it in concert with those offerings.
"At the moment, we don't see many customers with a production environment under the coexistence model but we do see this growing rapidly this year," McQuire noted. "Microsoft's strategy here is not to concede these accounts but to land and expand."
Why does it make sense for rivals such as VMware's AirWatch or MobileIron to use the APIs? Ojas Rege, MobileIron's VP of strategy, said there are two sides to the EMS equation: the EMS-Intune console on the front end and a set of middleware services on the back end based on the Microsoft Graph.
"If other consoles like MobileIron want to leverage them, they can," Rege said. "What does matter are these additional proprietary Microsoft features. It doesn't make sense for us to use the Graph API to activate an Intune function to lock an iOS device because we just lock the iOS device directly, but it does make sense to use the Graph API, to set a security control on Office 365."
Adam Rykowski, VMware's VP of UEM Product Management, agrees that traditional desktop PC management and MDM are coalescing and that it's fueling growth. "We are actually seeing some pretty major customers ramp up even sooner than we had expected," Rykowski said.
Andrew Conway, general manager for EMS marketing at Microsoft, posted a brief update last week on EMS and the Microsoft Graph APIs, describing them as a gateway to offerings spanning Azure AD, Outlook, OneDrive, SharePoint and Intune, among others. "The Microsoft Graph API can send detailed device and application information to other IT asset management or reporting systems," Conway noted. "You could build custom experiences which call our APIs to configure Intune and Azure AD controls and policies and unify workflows across multiple services."
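Conway's description maps to ordinary HTTPS calls against the Graph endpoints. The sketch below is our own illustration (the token is a placeholder, and the Intune data sat under Graph's beta endpoint at the time); it pulls the list of Intune-managed devices as raw JSON.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class IntuneDeviceQuery
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // Placeholder token, acquired from Azure AD with a device-management read permission.
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Intune management data surfaced under the Graph beta endpoint at the time.
        string json = await http.GetStringAsync(
            "https://graph.microsoft.com/beta/deviceManagement/managedDevices");

        Console.WriteLine(json); // raw JSON listing enrolled devices
    }
}
```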
Posted by Jeffrey Schwartz on 03/27/2017 at 1:09 PM
Google has accused Symantec of improperly issuing 30,000 SSL/TLS certificates and will immediately start distrusting them in its widely used Chrome browser.
The move, a stinging indictment of the giant Certificate Authority (CA) and security provider, means SSL/TLS certs issued by Symantec will be invalidated. Users who visit affected HTTPS Web sites will receive warnings that a site's cert isn't valid, but will still have access if they choose to ignore the warning. It could force Web site operators to move to other CAs. Symantec is disputing Google's charge that the certs were improperly validated.
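Web site operators wondering whether their own sites are affected can inspect a server's chain programmatically. Here's a minimal .NET sketch (the hostname is a placeholder) that completes a TLS handshake and prints each issuer in the certificate chain the server presents:

```csharp
using System;
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Cryptography.X509Certificates;

class CertChainInspector
{
    static void Main()
    {
        var host = "www.example.com"; // placeholder site to check

        using var tcp = new TcpClient(host, 443);
        using var tls = new SslStream(tcp.GetStream(), false,
            (sender, cert, chain, errors) =>
            {
                // Print every issuer in the presented chain, e.g. to spot a Symantec root.
                foreach (X509ChainElement element in chain.ChainElements)
                    Console.WriteLine(element.Certificate.Issuer);
                return true; // inspection only; real clients should enforce policy here
            });

        tls.AuthenticateAsClient(host); // triggers the callback above
    }
}
```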
An investigation by the Google Chrome team, initiated on Jan. 17, centered at first around just 127 certificates. But the company has now found more than 30,000 certificates issued over a period of several years that don't fully meet Chrome's Root Certificate Policy. Ryan Sleevi, a software engineer on the Google Chrome team, announced the move yesterday in a newsgroup post.
"The Google Chrome team has been investigating a series of failures by Symantec Corporation to properly validate certificates," Sleevi noted. "Over the course of this investigation, the explanations provided by Symantec have revealed a continually increasing scope of misissuance with each set of questions from members of the Google Chrome team. This is also coupled with a series of failures following the previous set of misissued certificates from Symantec, causing us to no longer have confidence in the certificate issuance policies and practices of Symantec over the past several years."
As of two years ago, certificates issued by Symantec accounted for more than 30 percent of valid certificates by volume, according to Sleevi. Chrome's Root Certificate Policy requires all CAs to validate domain control, audit logs for signs of unauthorized cert issuance and protect their infrastructures to prevent the issuance of fraudulent certs. Symantec has failed to meet those requirements, Sleevi stated.
"On the basis of the details publicly provided by Symantec, we do not believe that they have properly upheld these principles, and as such, have created significant risk for Google Chrome users," said Sleevi. "Symantec allowed at least four parties access to their infrastructure in a way to cause certificate issuance, did not sufficiently oversee these capabilities as required and expected, and when presented with evidence of these organizations' failure to abide to the appropriate standard of care, failed to disclose such information in a timely manner or to identify the significance of the issues reported to them."
Despite the ongoing investigation, Symantec was surprised by Google's move. In a statement sent via e-mail by a Symantec spokesman, the company disputed Sleevi's accusations. "We strongly object to the action Google has taken to target Symantec SSL/TLS certificates in the Chrome browser. This action was unexpected, and we believe the blog post was irresponsible. We hope it was not calculated to create uncertainty and doubt within the Internet community about our SSL/TLS certificates."
For now, Sleevi noted Google is taking the following steps:
- Newly issued Symantec certificates will be valid for nine months or less, to reduce the impact on Google Chrome users of any future certs that aren't issued properly.
- All existing Symantec-issued certs must be revalidated and replaced over the course of successive Chrome releases.
- The Extended Validation status of Symantec-issued certificates will be removed for at least one year and cannot be reinstated until Symantec meets Google's guidelines.
Symantec argued the certs in question caused no harm to Web site visitors and believes Google is singling it out. Google's criticism of Symantec's certificate issuance policies is "exaggerated and misleading," according to its statement, which noted that the company discontinued its third-party registration authority program to reduce concerns about trust in its SSL/TLS certs.
"This control enhancement is an important move that other public certificate authorities (CAs) have not yet followed," according to Symantec. "While all major CAs have experienced SSL/TLS certificate misissuance events, Google has singled out the Symantec Certificate Authority in its proposal even though the misissuance event identified in Google's blog post involved several CAs."
Will the two companies iron out their differences? Symantec insists it maintains extensive controls over how it issues SSL/TLS certs and says it will discuss the matter further with Google in hopes of resolving the dispute.
Posted by Jeffrey Schwartz on 03/24/2017 at 12:26 PM
Looking to make Outlook the center of the digital workspace, Harmon.ie has launched a new tool that brings commonly used business apps and services together in one place, using AI and machine learning to gather information from disparate apps and cloud services and apply context to a given task. Harmon.ie, which claims its namesake Outlook plugin is used in 1,200 organizations as a common interface for interacting with information stored in SharePoint and Office 365 files, has widened its scope by supporting additional tools such as Yammer, IBM Connections, ZenDesk and Salesforce.com's Chatter and CRM offerings.
The new tool, called Collage, is among the first enterprise applications to make use of the Microsoft Graph APIs, released last year, according to Harmon.ie CEO Yaacov Cohen. "It's a very important API," Cohen said in an interview. "During the last six months, Microsoft has released more and more of Graph APIs, which give not just information but insights such as who I am working with. Based on these APIs, we can deliver an experience that can deduce things for you."
Cohen indicated that Collage will work with numerous other popular workplace tools. Collage seeks to make Outlook the center of an employee's workspace. But with the AI and machine learning APIs of the Microsoft Graph, Collage also provides more context with its ability to recognize keywords used across different apps, according to Cohen.
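Graph's relevance-ranked People API is one example of the kind of "who I am working with" insight Cohen describes. A minimal sketch using the Microsoft Graph .NET client library (the access token is a placeholder) might look like this:

```csharp
using System;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.Graph; // Microsoft Graph .NET client library (NuGet)

class WhoIWorkWith
{
    static async Task Main()
    {
        // Placeholder: supply a delegated Azure AD access token (People.Read scope).
        var graph = new GraphServiceClient(new DelegateAuthenticationProvider(req =>
        {
            req.Headers.Authorization = new AuthenticationHeaderValue("Bearer", "<access-token>");
            return Task.CompletedTask;
        }));

        // /me/people ranks contacts by their relevance to the signed-in user.
        var people = await graph.Me.People.Request().GetAsync();
        foreach (Person person in people)
            Console.WriteLine(person.DisplayName);
    }
}
```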
During a demo, Cohen showed how Collage recognizes topics people are working on and associates them with related relevant information among different apps. Collage lets users access information from Outlook in their native app experiences and brings documents from SharePoint and Office 365 as links rather than attachments to ensure information is current, Cohen explained. While it works with SharePoint Server (on-premises), it requires organizations to use OneDrive for Business to store data, Cohen said, noting its dependence on the Graph and Office 365.
"The tool is great for organizations who use Office 365 as well as other services that Collage can connect to," SharePoint MVP Vlad Catrinescu, who is president of vNext Solutions, said via e-mail. "By showcasing information 'in-context' from multiple services directly in Outlook, it allows users to be more productive, really get work done and make the right decisions because they have all the information available to them. I think Outlook is a great place to be the 'hub' of this information because, let's be honest, that's where most of the work of the classic information worker is."
In a review posted on his Web site, Catrinescu illustrated how Collage connects users with SharePoint sites relevant to a given topic, letting users view, open and drag and drop documents into SharePoint. Users can also drag and drop documents from the sidebar in Outlook and create an e-mail with a link to the document. As users with access to that shared document make changes, it ensures everyone has the most recent version, he noted.
"The big challenge for Harmon.ie is really to make sure their AI engine and Machine Learning will do an amazing job in tagging content from so many systems, so it becomes useful for the users," Catrinescu said, adding that it performed well in his recent tests.
Harmon.ie first introduced Collage last fall at Microsoft's Ignite conference in Atlanta, offering the free preview as a separate tool outside of Outlook. Now it's available to enterprise customers at a price of $6 per user per month and is unified with its flagship Harmon.ie client.
Posted by Jeffrey Schwartz on 03/22/2017 at 5:12 PM
Microsoft is expected to share what's next for SharePoint during an event scheduled for May 16 and it should come as no surprise the announcements involve further integration with Office 365, OneDrive, Yammer, Windows, PowerApps, Flow and the new Microsoft Teams chat service.
Corporate VP Jeff Teper, who oversees SharePoint and Office 365, is scheduled to lead the virtual event, joined by Corporate VPs James Phillips and Chuck Friedman, who will explain how Office 365, connected with Windows and Azure, "is reinventing productivity for you, your teams and your organization," according to the invite posted by Microsoft. Beyond saying it will introduce new products, the company at this point has only indicated that attendees will "learn how to create a connected workplace." Vague as that is, the teaser is consistent with Microsoft's focus on bringing a more social and connected user experience to SharePoint and Office 365.
The roadmap update is scheduled slightly more than one year after Microsoft held its Future of SharePoint event in San Francisco, where Teper introduced the new SharePoint Framework, marking the official launch of SharePoint Server 2016. At last year's launch, Microsoft promised to release upgrades to the on-premises server edition in 2017 but said the company will continue to emphasize Office 365 and hybrid cloud implementations. The forthcoming event isn't being promoted as a major launch but as an informational update that will include some release news.
It continues to raise the question: to what extent are SharePoint shops moving the collaboration suite to the cloud, either hybrid or purely online? As noted last week, a survey conducted by a team of graduate students at the Marriott School of Management at Brigham Young University and Christian Buckley's CollabTalk -- sponsored by Microsoft, a number of ISVs and several media partners including Redmond magazine -- aims to shed light on the extent to which SharePoint shops are moving to either hybrid or all-cloud deployments. According to preliminary results:
- 31 percent are using hybrid SharePoint solutions, with almost 50 percent remaining entirely on-premises.
- More than 60 percent of those with hybrid in place are managing SharePoint on-premises.
- Of those entirely on-prem, almost half plan to shift to hybrid within the next one to three years.
The survey is much deeper than that, and today's the last day to weigh in here. We look forward to sharing the results next month in advance of Microsoft's event.
Posted by Jeffrey Schwartz on 03/22/2017 at 11:53 AM
VMware has extended the Windows 10 endpoint management and security options offered in Workspace One, the digital workspace platform released last year that brought together the company's AirWatch mobile device management (MDM) tooling and Horizon application delivery offerings.
The upgrade, released this week, is the first major update to Workspace One. In addition to adding new Windows 10 security and management controls to its AirWatch 9.1 MDM offering, which the company describes as unified endpoint management (UEM), the update adds new support for Android devices, real-time threat detection, and an advanced rules engine for ruggedized computers and devices used in specific vertical industries.
Workspace One and AirWatch 9.1 add more granular controls for patching the OS on off-network Windows 10 endpoints within the restrictions imposed by Microsoft's new Windows Update as a service model, and include a new dashboard to track patch compliance and perform audits of Windows updates. AirWatch 9.1 also now provides advanced BitLocker configurations, which the company says eliminate the need for encryption management tools from Microsoft or other third-party providers.
Adam Rykowski, VMware's VP of UEM Product Management, said in an interview that the upgrade lets Windows administrators encrypt an entire disk or system partition, or use a device's built-in TPM chip to eliminate the need for USB-based flash drives for Secure Boot or startup keys. At the same time, it enables the enforcement of PINs in conjunction with the TPM chip to keep the OS from starting up or resuming until a user is authenticated. It also offers various controls for key rotation policies, recovery controls and the ability to suspend BitLocker enforcement policies when deploying critical maintenance updates.
AirWatch 9.1 also now supports Microsoft Board Support Packages (BSPs), the sets of components designed for specific hardware, and the Windows Store for Business. Rykowski said that will ease the deployment of applications from Microsoft's Windows Store via VMware's Workspace One company store catalog.
VMware has also added real-time threat detection and access control remediation for Windows by integrating with its TrustPoint endpoint security tool. The company added TrustPoint as an option to AirWatch through an OEM agreement inked last year with Tanium, a provider of real-time security and endpoint management tools whose technology both companies claim can query millions of endpoints in seconds to detect and remediate threats. VMware said TrustPoint offers accelerated compliance and threat management.
Many of the key updates are the result of feedback from early customers of last year's Workspace One release (which included AirWatch 9.0) who wanted to bring more MDM-like management to Windows 10, according to Rykowski. "It was interesting to see how Windows management required another deeper level of control that's very different from mobile," he said. "But the way we're doing it is in this cloud way, where you can apply updates and patches in real time with devices that are not on the network, regardless of where they are. We can provide the same level of deep capabilities but do it in that more modern management style."
The new Android support added to the update includes new onboarding support for managed devices, which now offers configuration of devices via a QR code or an e-mail to the user. It supports automatic app permissions and configurations, enforces app-level passcode policies for work applications without requiring an SDK and adds improved integration of Google Play and App Config setups. VMware's update also adds support for Apple's forthcoming iOS 10.3 and macOS 10.12.4 releases with a new SDK, app wrapping engine and support for productivity applications.
A newly added browser plugin now offers single sign-on access to nonfederated software-as-a-service Web apps that don't support SAML. VMware is also slashing the price of the various packaging options: the standard license was reduced from $4.33 per device to $3.50, and the advanced license from $6 to $5.50.
Posted by Jeffrey Schwartz on 03/17/2017 at 1:14 PM0 comments
While SharePoint Server now lends itself well toward hybrid deployments with Office 365, it's unclear to what extent organizations are taking advantage of that. A research project now underway hopes to measure how many organizations are actively pursuing hybrid SharePoint deployments, and what's motivating them or holding them back.
Graduate students at the Marriott School of Management at Brigham Young University and Christian Buckley's CollabTalk are conducting the research, which is sponsored by Microsoft, a number of ISVs and several media partners including Redmond magazine. The research includes an online survey of SharePoint and Office 365 customers and MVPs, and customer interviews with contributions from outside research firms.
When Buckley, a Redmond contributing author and well-known MVP in the SharePoint community, reached out to me about our editorial cosponsorship of this research project, he noted that the size of the hybrid SharePoint market remains unclear, as does why organizations opt to remain on-premises. In addition to getting a better measure of hybrid deployments, the research aims to understand what fears many organizations still have about moving SharePoint to the cloud, as well as what technical issues are making it difficult to migrate existing server infrastructure, custom apps and third-party solutions.
A report based on the research will also explain how customers that have implemented hybrid SharePoint deployments are addressing those various challenges, including management and security considerations. Buckley said there's no accurate data on how many customers are pursuing hybrid SharePoint as a long-term strategy versus as a stepping stone toward moving their on-premises deployments to the cloud.
"Whether you are already well into your cloud journey or just getting started, this study will help us understand where you're at and what we can do to help," Bill Baer, Microsoft's senior technical product manager focused on hybrid SharePoint deployments said in a statement.
The report will be released next month in various forms, including webcasts by some of the sponsors. In addition to Microsoft, the corporate sponsors include PixelMill, B&R Business Solutions, Rencore, Crow Canyon Systems, tyGraph, Focal Point Solutions and AvePoint.
Of course, we'll share the findings and analysis as well. If you'd like to weigh in, the deadline to participate in the survey is March 22.
Posted by Jeffrey Schwartz on 03/16/2017 at 10:30 AM0 comments
Okta, one of the largest identity-as-a-service (IDaaS) providers, is finally going public. The company, rumored for years to have its eyes on an initial public offering (IPO), this week registered plans with the Securities and Exchange Commission (SEC) to offer Class A common stock to be traded on the Nasdaq market.
The move is the latest sign that the company is undeterred by claims from Microsoft that organizations no longer need third-party IDaaS offerings (such as the Okta Identity Cloud) if they use Azure Active Directory Premium and the rest of its Enterprise Mobility + Security (EMS) services. Despite such claims, Okta and its customers point to the larger array of integrations offered with its cloud-based SSO service. Okta says it now offers 5,000 connectors to legacy and software-as-a-service (SaaS) applications.
Okta's S-1 filing reveals that its revenues have soared in recent years. During its last fiscal year, revenue grew 109%, from $41 million to $85.9 million, and for the first nine months of calendar years 2015 and 2016, revenue grew from $58.8 million to $111.5 million. Despite the revenue growth, however, Okta posted steep losses: $59.1 million in FY 2015, $76.3 million in FY 2016, and $54.9 million and $65.3 million for the first nine months of calendar years 2015 and 2016, respectively.
However, the company also touted its base of 2,900 enterprise customers including 20th Century Fox, Adobe, Engie, Flex, GitHub, LinkedIn, MassMutual, MGM Resorts, Pitney Bowes and Twilio, and key partnerships with Amazon Web Services, Box, Google Cloud, Microsoft, NetSuite, SAP, ServiceNow and Workday.
Indeed, many organizations are looking at third-party IDaaS options, said Forrester Research analyst Merritt Maxim in a blog post commenting on Okta's IPO. "Over the past 18 months, Forrester has had a steadily increasing number of IDaaS-related inquiries from enterprise clients looking to deliver identity and access management (IAM) capabilities to their employees via a SaaS subscription model," Maxim noted. "Okta's revenue growth aligns with the strong growth in demand we see from our clients."
While Okta's cloud-based single sign-on platform is among the most widely used IDaaS offerings, the company has added complementary services including a new mobility management tool and last week's acquisition of API management provider Stormpath.
"The need for a unified identity across every app, service and device is exploding as every company transforms their business with digital services," said Okta CEO Todd McKinnon, in a post announcing the Stormpath deal. "Additionally, developers are becoming major buying centers and decision makers within organizations. And with no signs of that trend slowing, the need for secure application integration is growing."
Okta also faces some formidable competitors. In addition to Microsoft, VMware now offers VMware Identity Manager, bundled with its Workspace One and AirWatch suites, and MobileIron, a leading supplier of enterprise mobility management (EMM) services, also recently added a single sign-on tool to its offering. Then there's a slew of other rival providers including Centrify, OneLogin, Ping Identity, Quest and SailPoint, among others.
While Okta and Microsoft are partners, the two companies are also in a heated battle. In an interview last year, McKinnon argued Okta is winning a lot of enterprise IDaaS deals from Microsoft. "They're losing and they don't like losing," McKinnon told me at the time. "We beat them eight out of 10 times, deal-for-deal, and that's even with them bundling their products into their Enterprise Agreements for a pretty low price."
Microsoft argues it now has 41,000 customers using its EMS service. And, ironically, a key pillar of Okta's revenue growth has come from the rise of Microsoft Office 365 subscriptions, which automatically results in an uptick of Azure AD accounts.
Posted by Jeffrey Schwartz on 03/15/2017 at 1:12 PM0 comments
This week marks the 31st anniversary of Microsoft's initial public offering. Since the company went public on March 13, 1986, its stock has grown 89,000%, though the vast majority of that gain came on Cofounder Bill Gates' watch.
Reporting on the anniversary was CNBC's Josh Lipton, who noted an investment of $1,000 at the time of the IPO would now be worth $900,000, presuming you held on to the shares. "Not a bad gain for those early employees," Lipton said.
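Those two figures square with each other: an 89,000 percent gain multiplies the original stake by 891, as a quick back-of-the-envelope check confirms.

```python
# An 89,000% gain means the stake grows to (1 + 89,000/100) = 891x its size.
initial = 1_000
gain_pct = 89_000
final = initial * (1 + gain_pct / 100)
print(f"${final:,.0f}")  # $891,000 -- roughly the $900,000 CNBC cited
```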
Despite incrementally liquidating his Microsoft shares over the years, Gates' fortune is still reportedly worth $87 billion, while Cofounder Paul Allen's is valued at $20 billion.
Of course, the bulk of Microsoft's growth came while Gates was CEO, when shares rose 73,000%. Microsoft fell on harder times once Steve Ballmer took the reins: during his tenure between 2000 and 2014, shares fell 34%. Though some of that decline came on key missteps, many tech stocks took a beating over the same period -- first with the dotcom crash and later with the financial crisis of 2008. Gates also remained active at Microsoft as chief software architect during the first eight years of Ballmer's tenure.
Since Satya Nadella took over as Microsoft's third CEO in 2014, Lipton noted, the company's stock has risen 78% from where it stood in the Ballmer days.
Posted by Jeffrey Schwartz on 03/13/2017 at 12:57 PM0 comments
It's a good thing Outlook.com isn't my main e-mail account. Otherwise it would have been a long month -- that's how long I was locked out of my Outlook.com account. Strangely, I had no problem accessing my Microsoft and Office 365 accounts, which are all tied to the same credentials as Outlook.com.
Though I've had an Outlook.com account for years, I rarely use it and never give it out, since I have a personal e-mail domain and use Yahoo and Gmail as backup accounts. But given the recent upgrades to the service and its ability to synchronize schedules and contacts, I decided to check it out to consider my options. Yet when I merely tried to access Outlook.com while already logged into my personal Microsoft Office 365 account, I was redirected to a page that read: "Sorry Something Went Wrong." Trying directly from Outlook.com didn't work either. After trying with a different browser, then via another computer and finally with an iPad, I presumed Outlook.com was experiencing an outage. Since it wasn't critical, I tried again the next day but still was unable to access my Outlook.com account.
Realizing it must have something to do with my account, I connected with the Outlook.com support desk via an online chat session. (One can't expect telephone support with a free service, after all.) First the technician said I should change my password in case my credentials were compromised. That seemed unlikely -- if my credentials had been compromised, I wouldn't have been able to access my other Microsoft account services -- but it's hard to argue with the wisdom of changing your password these days. Nevertheless, that didn't fix anything and the rep was unable to help.
During a visit to my local Microsoft Store the next day, the technician there said he'd reach out to the company's Global Escalation support team. Upon receiving a call from an Outlook.com technician, I explained the situation. I gave him remote access to my PC, though I asserted that I doubted the problem originated on the client.
The admin didn't see anything obviously wrong on my end but said he'd get back to me in a few days. We went back and forth a few times, and he ultimately suggested I capture my session traffic using Telerik's FiddlerCap Web recorder. The team apparently wanted to review the session traffic (HTTP, cookies and other information) that could point to a possible problem.
A few days later he got back to me, apparently having found the solution. He instructed me to log in to my Microsoft account and view my settings. Under "Your Info," he asked me to click on "Manage How to Sign-In to Microsoft," the section that lets you view all of your account aliases. Asked if my Outlook.com username was my primary alias, I replied no. The primary alias for my Microsoft account was the e-mail address for my personal domain, which I have always used to log in to all of my Microsoft accounts. The admin instructed me to specify my Outlook.com address as my primary alias. And in one click, the problem was solved. The good news is I can still use my personal domain e-mail address to log in -- I just can't specify it as my primary alias (which is no big deal).
I asked the support rep at Global Escalation Services why it took a month to get to the bottom of this. (I should note that he was determined to get to the bottom of the problem and dutifully called me every few days to check in.) As I reported late last month, with the launch of Outlook.com Premium, Microsoft has transitioned the Outlook.com service over the past year from its old Web-based system to Microsoft Exchange Server. The move to Exchange Server came with some caveats. In this case, those who use personal domains can no longer use Exchange ActiveSync (EAS) with those aliases.
That doesn't answer the question of why it took so long, especially when this January community support thread showed a handful of others with similar problems, noting that EAS is no longer supported with personal domain addresses. Perhaps if accessing the account had been more urgent rather than a background task, I'd have found that myself.
If you also use a personal domain e-mail address to access your Outlook.com account (or create a new one), now you know that all you have to do is change your primary username.
Of course, if you do want to tie your personal domain to Outlook.com, you could sign up for the premium service, which allows you to link your personal domain and four others for $10 per year on top of the existing annual fee. And if you sign up by month's end, Microsoft is offering Outlook.com Premium for $19.95 a year. Microsoft says the regular price is $49.99, but we'll see if the company sticks with that price.
In hindsight, the addition of that option would explain why those who used their domain names as their primary credentials were suddenly locked out.
Posted by Jeffrey Schwartz on 03/10/2017 at 12:40 PM0 comments
Cybersecurity experts are mourning the loss of Howard Schmidt, the nation's first cybersecurity czar, who died last week at the age of 67 after a long battle with brain cancer. Schmidt, who served two U.S. presidents, was a onetime chief security officer (CSO) at Microsoft and played a key role in shaping the company's Trustworthy Computing initiative.
Schmidt was recruited from Microsoft by President George W. Bush in April of 2002, in the wake of the Sept. 11 terrorist attacks, to serve as vice chairman of the President's Critical Infrastructure Protection Board. President Barack Obama later tapped Schmidt as the nation's first cybersecurity czar -- his actual title was Special Assistant to the President, Cyber Security Coordinator.
In that role, Schmidt is credited with driving Obama's efforts to foster private- and public-sector cooperation, shaping a more coordinated policy that promotes sharing attack and threat information in the common interest of protecting the nation's critical infrastructure.
In an interview with DarkReading in 2011, Schmidt said his key efforts centered on the need for the government and private sector to share attack intelligence. "We are able to coalesce intelligence … the government has information that comes from its unique position, so part of this is taking that information and [showing] we care about putting the bad guys in jail," he said at the time. "We want to make sure we are sharing with our private sector partners."
Schmidt was instrumental in numerous White House initiatives. Of note was the National Strategy for Trusted Identities in Cyberspace (NSTIC), which the White House has since removed from its Web site; he also helped create a strategy for how the U.S. would defend itself from a major international cyberattack. As noted by DarkReading, Schmidt once warned of the "cascading effects" of targeted malware attacks against nation states. Schmidt left the Obama administration toward the end of its first term.
While he was well known for his government service, Schmidt's cybersecurity career spanned 40 years, with roles in the military and commercial sectors as well. In addition to serving as Microsoft's CSO, he spent two years as eBay's chief information security officer (CISO), was chairman of Codenomicon and held numerous board, director and other non-executive roles at various security companies, including Fortify, RatePoint, Neohapsis, BigFix and HAS Security.
Schmidt also served as international president of the Information Systems Security Association (ISSA) and as president and CEO of the Information Security Forum (ISF). After leaving the White House in 2012, Schmidt served as executive director of the Software Assurance Forum for Excellence in Code (SAFECode), a non-profit organization focused on promoting best practices in developing secure code. When Schmidt became too ill to continue at SAFECode, his longtime Microsoft protégé Steve Lipner took the reins of the association last fall.
Indeed, Schmidt left his imprint on Microsoft as well, having served during the period that led up to Bill Gates' Trustworthy Computing initiative and cofounding what ultimately became the company's Trustworthy Computing Group, as recalled by Threatpost, a blog produced by Kaspersky Lab.
Along with Lipner, who ran the Microsoft Security Response Center at the time, Schmidt helped create the team whose work led up to Gates' famous Trustworthy Computing e-mail. Lipner and Schmidt worked closely together on the response to some of the largest Internet cyberattacks of the era, including Code Red.
Lipner last week told Threatpost that Bush recruited Schmidt from Microsoft just three months before Gates launched the Trustworthy Computing initiative. "Howard always felt a higher calling to service to the government of the United States," Lipner told Threatpost. "There's no better demonstration of that than the fact that, in late 2001, after the 9/11 attacks, he left Microsoft to join the White House cybersecurity policy office. His departure meant that he was no longer at Microsoft when the Trustworthy Computing e-mail -- which reflected a lot of effort on his part -- was released."
Posted by Jeffrey Schwartz on 03/06/2017 at 11:46 AM0 comments
Looking to bolster the container-based software it has helped standardize, Docker has launched a new enterprise edition with the goal of helping DevOps teams build, deploy and manage business-critical applications that it said can run in production at scale across various infrastructure and cloud services.
The new Docker Enterprise Edition (EE), announced today, is what the company calls its open and modular container platform for building and maintaining cloud-scale applications that are portable across operating systems and cloud services. Windows Server 2016, released last fall, includes the Docker runtime engine free of charge. In addition to Windows and Hyper-V, Docker is certified to run on a wide array of Linux distributions, VMware's hypervisor platforms and cloud services including Microsoft Azure, Amazon Web Services and Google Cloud Platform.
Docker EE is available in three configurations: Basic, Standard and Advanced. Basic includes the Docker platform designed to run on certified infrastructure, with support for Certified Containers and Plugins available in the Docker Store. The Standard configuration adds support for multi-tenancy, advanced image and container management, and integration with Active Directory and other LDAP-based directories. The Advanced version adds Docker Security Scanning and continuous vulnerability monitoring.
The launch of Docker EE comes just over a year after the company launched Docker Datacenter, its DevOps platform built to enable the deployment of Containers-as-a-Service (CaaS) on premises or in virtual managed private clouds. In addition to the Docker Engine container runtime, it brought together the Docker Trusted Registry for image management, security and collaboration. Docker Datacenter also introduced the company's Universal Control Plane (UCP) with embedded Swarm for integrated management tooling.
Docker Datacenter is now a component of Docker EE, available with the Standard and Advanced versions. All of the free Docker versions will be renamed Community Edition (CE).
With the release of Docker EE, the company has also broadened its certification program, with a wider array of third-party partner offerings available in the Docker Store that are compatible with Docker EE. In addition to the infrastructure the company's container platform supports, the new program will allow ISVs to distribute and support their software as containers that are reviewed and tested before they're made available in the Docker Store.
The program also looks to create an ecosystem of third-party plugins that must pass API compliance testing to ensure compatibility across apps, infrastructure and services. "Apps are portable across different network and storage infrastructure and work with new plugins without recoding," said Dan Powers, Docker's head of partner and technology management, in a blog post announcing the new certification program.
Docker also plans to accelerate its release cycles, issuing new releases of the community and enterprise editions each quarter. The company will support each release of Docker EE for a year and each community edition for four months, though Docker CE users will have a one-month window to update from one release to the next. The Docker API version will remain independent of the Docker platform, said Docker Product Manager Michael Friis in a blog post announcing Docker EE.
"Even with the faster release pace, Docker will continue to maintain careful API backwards compatibility and deprecate APIs and features only slowly and conservatively," Friis noted. "And Docker 1.13 introduces improved interoperability between clients and servers using different API versions, including dynamic feature negotiation."
Posted by Jeffrey Schwartz on 03/02/2017 at 3:43 PM0 comments
Microsoft today released the third and final technical preview of Azure Stack, the software that will be offered on integrated appliances, letting enterprises and hosting providers run Redmond's public cloud platform in their own datacenters. The company also revealed it will offer consumption-based pricing for Azure Stack.
Signaling that Azure Stack is on target to become generally available this summer, the new TP3 comes nearly eight months after the second preview and more than a year after the first. While the company has shifted course a number of times, most notably last summer, Microsoft says it has firmed up its roadmap for Azure Stack and, with the release of TP3, is promising continuous refreshes right through the final release this summer.
Shortly after today's TP3 release, Microsoft will add support for Azure Functions, followed by Blockchain, Cloud Foundry and Mesos templates. Also in the coming months, Microsoft will release a refresh of TP3 that will offer multi-tenant support. "Our goal is to ensure that organizations choosing hybrid cloud environments have this same flexibility and innovation capability to match their business objectives and application designs," said Microsoft Technical Evangelist Jeffrey Snover, in a blog post published today revealing the latest Azure Stack news.
Snover said TP3 will let customers:
- Deploy with ADFS for disconnected scenarios
- Start using Azure Virtual Machine Scale Sets for scale out workloads
- Syndicate content from the Azure Marketplace to make available in Azure Stack
- Use Azure D-Series VM sizes
- Deploy and create templates with Temp Disks that are consistent with Azure
- Take comfort in the enhanced security of an isolated administrator portal
- Take advantage of improvements to IaaS and PaaS functionality
- Use enhanced infrastructure management functionality, such as improved alerting
Microsoft is also letting testers keep their preview deployments. Upon release, the company will rename its proof-of-concept (PoC) deployment the Azure Stack Development Kit, which customers can use as a "single-server dev/test tool to prototype and validate hybrid applications," according to Snover. After Azure Stack is commercially available, he said, customers should expect frequent release updates focused on application modernization, improved management and scale.
Snover emphasized that the goal for Azure Stack is to offer a consistent, hybrid application development platform that ensures apps can run in either the public or private versions of Azure IaaS and PaaS services. "Individuals with Azure skills can move projects, teams, DevOps processes or organizations with ease," he noted. The APIs, portal, PowerShell cmdlets and Visual Studio experiences are all the same.
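As a rough illustration of that API consistency -- not an official example -- the sketch below uses the Azure SDK for Python: the same ResourceManagementClient works against public Azure or an Azure Stack deployment, with only the management endpoint changing. The credential values and endpoint URL are placeholders, and Azure Stack sign-in details (such as the ADFS authority used in disconnected scenarios) are simplified away.

```python
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient

# Placeholder service-principal values -- supply your own.
credentials = ServicePrincipalCredentials(
    client_id="<app-id>",
    secret="<app-secret>",
    tenant="<tenant-id>",
)

# Omit base_url to target public Azure; point it at an Azure Stack
# deployment's management endpoint to run the same code on-premises.
client = ResourceManagementClient(
    credentials,
    "<subscription-id>",
    base_url="https://management.local.azurestack.external",
)

# The identical resource-group call works in either environment.
for group in client.resource_groups.list():
    print(group.name, group.location)
```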
Microsoft also said it is looking to apply the same "cloud economics" to Azure Stack as offered through Azure. As such, customers will pay on a consumption basis, though somewhat less since Microsoft isn't incurring the entire infrastructure cost, according to a brief video explaining the usage-based pricing model. Hardware providers Microsoft has partnered with have indicated they will also offer subscription-based pricing plans.
The first systems will come from Dell EMC, HPE and Lenovo, and later in the year from newly added partner Cisco, which will offer Azure Stack systems jointly engineered among the respective suppliers and Microsoft. "From what we have seen of it so far, it is starting to take on the fit and finish of a completed product," said Jeff Deverter, chief technologist for the Microsoft Practice at Rackspace, in an e-mail. "This announcement has really ignited the productization work here at Rackspace as we strive for being a launch-day partner."
Rand Morimoto, a Microsoft Azure MVP and president of Walnut Creek, Calif.-based Convergent Computing, which provides IT development and deployment services, has been building POCs with Azure Stack for a number of large enterprises since Microsoft released the first technical preview early last year, and said in an e-mail he has been awaiting TP3. "This is a huge solution for a lot of our banking and healthcare customers," Morimoto said. "We are anxiously waiting for the final release of Azure Stack and corresponding hardware solutions around it."
Posted by Jeffrey Schwartz on 03/01/2017 at 12:54 PM0 comments
The premium, ad-free version of Microsoft's Outlook.com e-mail service, which offers personal domain addresses that can be shared by up to five people, is now generally available. Outlook.com Premium will once again test the appetite among users to pay for personal e-mail services they've received for free for decades -- if they've ever paid for them at all. The new Outlook.com Premium service, which costs $49.99 per year, is available at a promotional rate of $19.95 until March 31.
Microsoft released Outlook.com Premium last year, first as a private, invitation-only pilot, and then launched a public preview in October, as documented by Mary Jo Foley in her ZDNet blog. Microsoft recently removed the "preview" designation from the service, though the company didn't say when, according to a Feb. 14 update. A few weeks earlier, Microsoft told Foley that the migration of the 400 million Outlook.com mailboxes to the Office 365 infrastructure, which is powered by Microsoft Exchange, was 99.9 percent complete.
All of the major e-mail providers, including Google, Yahoo and Microsoft, have long offered premium ad-free e-mail services and other perks. What could give the new Outlook.com Premium service added appeal is the ability to tie new or existing personal e-mail addresses to an account. Furthermore, a user can allow four additional people to use the domain name of the e-mail address, allowing them to also share calendars, contacts and files. The service also lets customers either register any domain that's available or tie in one they already have.
Customers who sign up for the promotional $19.95 rate will get use of a personal domain free for the first year. While they can renew the service for the same price, each personal domain thereafter costs an additional $10 per year. The service automatically synchronizes Outlook.com mailboxes, along with hotmail.com, live.com and msn.com accounts, into one mailbox.
Personal e-mail domains require an Outlook.com Premium subscription, according to Microsoft's FAQ about the Outlook.com Premium service, which implies that cancelling the latter means you must give up your domain name. You can create up to 10 aliases per Microsoft account and are permitted to change them up to 10 times per calendar year.
Outlook.com Premium users can use the same password to sign in to an account with any alias tied to that Microsoft account, according to the FAQ. The company describes how to add or remove aliases in Outlook.com Premium here. If you bring your own domain, the registration process requires that you verify ownership and update the domain's mail exchanger (MX) records by following the instructions in the "bring your own domain" setting.
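One quick way to confirm an MX update took effect is to query the records yourself. Here is a minimal sketch using the dnspython library, with example.com standing in for your own domain:

```python
import dns.resolver

# Substitute the personal domain you linked to Outlook.com Premium.
domain = "example.com"

# List the domain's mail exchanger (MX) records to confirm they
# point where the setup instructions said they should.
for record in dns.resolver.query(domain, "MX"):
    print(record.preference, record.exchange)
```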
Users who don't want e-mails from their different accounts to appear in the same inbox can create rules to automatically move messages from a specific account to different e-mail folders. Outlook.com Premium also allows users to send and receive e-mail from AIM Mail, Gmail and Yahoo Mail accounts, with Outlook.com Premium serving as the primary inbox for those other accounts. As with Microsoft's free e-mail services, attachment size limits are 10MB -- or 300GB when linked via OneDrive.
Outlook.com Premium, according to the FAQ, does not "currently" support e-mail auto-forwarding, e-mail groups, DomainKeys Identified Mail (DKIM) or Domain-based Message Authentication, Reporting and Conformance (DMARC). Perhaps the use of "currently" suggests those features may appear in the future. For now, Outlook.com Premium is only available to customers in the U.S.
Posted by Jeffrey Schwartz on 02/27/2017 at 12:47 PM0 comments
Amazon's entry last week into the crowded unified communications fray with the launch of its Chime service could pose a challenge to the likes of Microsoft's Skype for Business, Google Voice and a swath of fast-growing players such as Twilio, Bandwidth and BlueJeans, among numerous others. However, a quick look at the initial service suggests it has no obvious technical advantage over its rivals, so it's not immediately likely to attract enterprises that use VoIP and UC offerings from Cisco, Avaya or Microsoft.
Anyone who has followed Amazon's meteoric growth over the past two decades knows not to turn a blind eye to the company's track record of upsetting most -- though not all -- of the industries it has invested heavily in. Chime comes out of the company's AWS cloud business, though technically it's available to both consumers and businesses. I was able to create an account using my Amazon Prime credentials. And if you look at the success the company's retail business has had with its Echo and Dash intelligent devices that respond to voice commands, one can imagine the possibilities Amazon might have up its sleeve.
The company has released three packages: the free version offers chatrooms and allows video calls between two parties; the "Plus" plan, priced at $2.50 per user per month, adds screen sharing and access to a corporate directory; and the "Pro" package, which costs $15 per user per month, allows meetings with up to 100 participants. Scott Gode, VP of products at Seattle-based Unify Square, a longtime provider of management services for large Microsoft Skype for Business customers, said Amazon's pricing is comparable with the cost of services offered by other UC providers.
"It's very much akin to unified communications-as-a-service providers that are out there and similar to the pricing that Microsoft offers as well," Gode said. "The big difference Microsoft has is it's bundled together with Office 365, whereas the Amazon stuff -- even though its backend is on AWS -- has a standalone pricing model."
Telecommunications service provider analyst Irwin Lazar, a VP at Chicago-based Nemertes Research, agreed that Chime lacks the integration Microsoft's Skype for Business offers with Office 365. But he believes over time that could change. Likewise, Lazar believes Amazon is looking to challenge Google's G Suite. "Ultimately I think their goal is to compete with Office 365 and G Suite. But they have a long way to go," Lazar said. "The biggest immediate impact is potentially downward price pressure on the Web conferencing providers," including PGi, GoToMeeting, Zoom, BlueJeans and Cisco's WebEx, among others.
While AWS runs the world's largest cloud infrastructure, the company hasn't focused on communications and networking services. To boost its entry into the Unified Communications-as-a-Service (UCaaS) market, AWS has partnered with network backbone provider Level 3 Communications and with Vonage. Amazon signaled its interest in the market late last year when it acquired San Francisco-based Biba, a provider of chat, video and audio conferencing tools, a year after buying video signal processing company Elemental for just under $300 million.
Lazar noted that many of the UCaaS providers Chime will now compete with run their services on AWS. "But Cisco, Google, and Microsoft have pretty robust cloud infrastructures of their own, so I don't think AWS gives them any real advantage," he said.
Posted by Jeffrey Schwartz on 02/24/2017 at 1:21 PM0 comments
Veritas is migrating its Enterprise Vault.cloud (EV.cloud) data archiving service from its own managed datacenters to Microsoft Azure. The move to outsource EV.cloud is part of a global, multiyear partnership announced today by Veritas to optimize its portfolio of data protection offerings for hybrid cloud environments, using Microsoft Azure public cloud storage as a backup and archive target.
In addition to EV.cloud and its on-premises EV, Veritas is best known for its Backup Exec for Windows and NetBackup enterprise data protection software, which now support Microsoft Azure as a target, as promised last fall. Veritas said it is announcing the Microsoft pact as employees today celebrate the one-year anniversary of its spinoff from Symantec. Private equity firm Carlyle Group officially closed the $7.4 billion buyout from Symantec in late January of last year and Veritas has since pledged to accelerate making its data protection wares more ready for the cloud. Prior to its pact with Microsoft, Veritas last week said it will integrate its Veritas 360 Data Management tool with Amazon Web Services cloud offerings.
While Veritas' decision to move EV.cloud to a global service provider stemmed from its determination to focus on its software development priorities, Microsoft Azure was a logical contender given that EV.cloud is a popular archiving service for Exchange, SharePoint, Skype for Business, Lync messaging and Office 365 data. "There is a lot of potential in having archival technology living and breathing within the same cloud framework as Office 365," said Jason Buffington, a principal analyst at Enterprise Strategy Group.
Nevertheless, Veritas' decision to move EV.cloud to Azure represents an important endorsement of Microsoft's public cloud infrastructure. Despite the obvious connections between Office 365, Microsoft's on-premises software and Azure, Buffington said that Veritas could just as easily have chosen Amazon Web Services (AWS) or Google Cloud Platform (GCP), which are also partners and have comparable global-scale clouds.
"When you think about what Enterprise Vault is about, cloud or no cloud, it's your archive copy of last resort. It's the copy the auditor requires and, in many cases, the copy that means you won't have absorbent fines or go to jail," Buffington said. "Veritas has decided that they trust the Azure platform. When they looked at what the underlying frameworks were like and their potential to innovate on top of that platform, and yet an assurance of geopolitical boundaries and a whole bunch of considerations, Veritas' relocation of their cloud is a huge testament to the Azure platform."
Veritas decided to outsource the hosting of EV.cloud so it could focus on adding new capabilities to the platform to support a growing number of industry and government regulations that are impacting data retention and availability requirements. Alex Sakaguchi, director of solutions marketing at Veritas, said the growth of Office 365 and hosted Exchange, SharePoint and Skype for Business is contingent on the ability to provide the necessary protections and governance.
"The primary driver of that is moving from Exchange on premises to Exchange Online," Sakaguchi said. "And then there's some unique capabilities for customers to move to Office 365 to employ Azure Storage. From a technology standpoint, this will help facilitate that movement and ensure customers have their data management capabilities, visibility and the information that they need as they make that transition."
Veritas also said as part of its partnership, the company's NetBackup 8.0 enterprise data protection software now supports storage tiering to the Azure cloud for those looking to use it for information lifecycle management. Likewise, Veritas Backup Exec, the company's data protection software for small- and mid-size organizations, supports Azure as a target. Veritas last fall had announced plans to support Azure storage with common connectors it calls Open Storage Technology (OST), which also provide connections to Enterprise Vault, the on-premises solution, and EV.cloud.
It would stand to reason that EV.cloud's move to the Microsoft Azure public cloud might also extend to Azure Stack, the forthcoming converged systems that will provide Microsoft's cloud infrastructure and platform services on-premises, set for release later this year. For now, though, Azure Stack is not part of this agreement, said Tad Brockway, Microsoft's partner director for Azure Storage. But Veritas' decision to move EV.cloud to Azure will allow the company to lower the cost of running the service.
"Backup and archival are natural scenarios for the public cloud," Brockway said. "The public cloud gives customers for archival and backup the cost and cloud economics, which make sense for those workloads. Especially, if you look at the exponential growth in data, the requirements can only be met via public cloud."
Veritas' Sakaguchi said the migration will be staged over time, though the company isn't providing any timelines at this point. However, he said customers will be informed.
Posted by Jeffrey Schwartz on 02/22/2017 at 1:11 PM0 comments
An overwhelming number of organizations appear to lack mature best practices when it comes to identity and access management for their systems, making them more vulnerable to breaches, according to a survey of 203 IT decision makers conducted by Forrester Consulting.
Results of the survey, commissioned by IAM provider Centrify, were shared this week at the RSA Conference in San Francisco, where Centrify CEO Tom Kemp presented the findings Monday during the Cloud Security Alliance event. A report based on the survey's findings determined that the least mature organizations experienced twice the number of breaches as the most mature ones.
That's not to say those who have adequately addressed authentication are immune to breaches: the most mature organizations reported an average of 5.7 incidents per year, while those with lax identity and access management policies reported an average of 12.5. Across the board, two-thirds said they have experienced five or more breaches during the past two years, with misuse of identities and passwords the key causes.
Nevertheless, most IT and information security managers aren't ignoring authentication and identity management, acknowledged Corey Williams, senior director of product management at Centrify. "It's a more piecemeal approach. They do a few tactical things but [are] not looking at things holistically," Williams said. The Forrester report emphasized issues stemming from privileged access as a common cause of breaches.
During the RSA Conference, Centrify polled another 100 security managers and found that 68 percent enforce single sign-on and 43 percent have implemented multi-factor authentication in their organizations. Only 36 percent responded that they don't allow sharing of privileged accounts, while just 13 percent have session recording, 12 percent implement granular deprovisioning of access across server and application accounts and only 8 percent have privilege elevation management.
Posted by Jeffrey Schwartz on 02/17/2017 at 12:02 PM0 comments
Microsoft President and Chief Legal Officer Brad Smith's call for a "Digital Geneva Convention" would seek to forge ties with international governments and the tech sector, committing them to nonproliferation of cyberweapons and to making "the Internet a safer place," with the goal of putting an end to nation-state attacks. Smith proposed the neutral organization, comparable to the International Atomic Energy Agency and its focus on non-proliferation of nuclear weapons, during his Tuesday RSA Conference keynote address.
Smith's "Digital Geneva Convention" would have the scope of a global convention that would, among other things, call on the world's governments to disengage from nation-state attacks against targets in the private sector, pledge to respond to vulnerabilities and put an end to the spread of vulnerabilities by sharing them with appropriate tech providers. Acknowledging the rise of nationalism, the "global technology sector needs to become a trusted and neutral digital Switzerland," Smith told RSAC 2017 attendees.
"We need governments to take the page out of the 1949 Geneva Convention, and other instruments that have followed," Smith continued. "We need an agency that brings together the best of the best and the brightest in the private sector, in academia and public sector. We need an agency that has the international credibility to not only observe what is happening but to call into question and even identify attackers when nation-state attacks will happen."
Smith emphasized that this organization must be global and neutral. This would help ensure that the group would focus on providing 100 percent defense and would not support offense-based counterattacks. Technology providers must also make clear that they will assist and protect all customers irrespective of the country they're from and commit to refusing to aid government-sponsored attacks on customers.
"These two principals have been at the heart and soul of what we've been doing. We need to stay on that path," he said. "We need to make the case to the world that the world needs to retain its trust in technology. And regardless of a government's politics, or policies or individual issues at any moment in time, we need to persuade every government that it needs a national and global IT infrastructure that it can trust."
In his remarks, echoed in a blog post, Smith emphasized that the Internet is based on infrastructure provided by private companies, making them the "first-responders" to nation-state attacks and obliging them to commit resources accordingly. Smith gave a plug to Microsoft's own efforts, which include the $1 billion per year it spends on security, last year's release of Advanced Threat Protection for Exchange Online, which identifies malware and suspicious patterns within the content of messages, and, more recently, the addition of Office 365 Threat Intelligence. Smith also mentioned Microsoft's new Office 365 Secure Score, a tool launched last week that helps administrators assess risk factors.
Where this will go remains to be seen, but Smith's choice of his opening keynote at RSAC 2017 as the podium to propose this "Digital Geneva Convention" suggests he wanted it to be heard. RSAC is the largest gathering of information security professionals, policy makers and law enforcement officials, and is perhaps the most viable vehicle to get talks started. It will require a strong show of force and a willingness of governments with conflicting interests to come together, which, of course, is no easy task. Despite the obstacles, however, there's too much at stake to remain idle.
Posted by Jeffrey Schwartz on 02/15/2017 at 12:36 PM0 comments
Microsoft has tapped Cisco Systems as the fourth partner to offer the forthcoming Azure Stack cloud platform. Cisco will offer Azure Stack on its UCS converged server and network appliance, which Microsoft today said the two companies will jointly engineer.
The datacenter and networking giant was conspicuously absent from the list of Azure Stack partners last summer, when Microsoft first announced that Dell, HPE and Lenovo would offer the software on their respective systems. Microsoft is co-engineering those Azure Stack solutions on the three players' converged systems as well.
Despite Cisco's absence from last summer's list, it has a number of partnerships with Microsoft tying together wares from both companies, including one inked in 2014 to integrate Cisco Nexus switches and UCS Manager with System Center integration modules, and Cisco PowerTools with Windows Server 2012 R2, System Center 2012 R2, PowerShell and Microsoft Azure. The two had also announced plans to integrate Microsoft's wares with Cisco ACI and Cisco Intercloud.
Cisco's cloud strategy is going through a transition. The company is now emphasizing its hybrid cloud portfolio on UCS as it prepares to shut down Intercloud, its network of interconnected public clouds, at the end of March. Cisco had launched Intercloud, an ambitious multi-cloud project, in 2014. Based on OpenStack, the $1 billion effort was, the company said at the time, to become the largest public network of interconnected clouds. Azure Stack, which brings into a converged system the same software and engineering design Microsoft runs in its public cloud, appears to fit into Cisco's hybrid cloud portfolio.
Microsoft first revealed plans to offer Azure Stack in May 2015 and had targeted the end of last year to release the software. But after the first technical preview, the company decided to limit the initial offering to three OEM partnerships providing systems co-engineered between Microsoft and the respective hardware suppliers. While the decision angered a number of partners and service providers who were already testing the software on their own hardware, Microsoft defended the move, saying it wanted to ensure that the first Azure Stack implementations provided a consistent experience with the company's public cloud.
Nevertheless, Microsoft officials have indicated the possibility of offering Azure Stack through other partners over time. Besides Dell, HPE and Lenovo, Cisco is among the largest providers of datacenter infrastructure and the leading supplier of networking hardware and software. Among the four developing Azure Stack-based systems, Cisco is newer to the server business. The company launched UCS, which consists of virtualized servers, switches and management software, in 2009.
It appears the company is targeting its offering, called the Cisco Integrated System for Microsoft Azure Stack, at service and hosting providers, though large enterprises have expressed interest in deploying the private instantiations of the Microsoft Azure Cloud.
"The Cisco Integrated System for Microsoft Azure Stack reinforces Cisco's complete approach to cloud, offering businesses the freedom to choose the best environment and consumption model for their traditional and cloud-native applications," wrote, Liz Centoni, senior VP and general manager of Cisco's Computing Systems Product Group, in a blog post announcing its Azure Stack offering.
Centoni noted that the Cisco Integrated System for Microsoft Azure Stack, slated for release in the third quarter of this year, will offer both IaaS and PaaS services integrated in UCS, with high-performance networking switches and the company's Virtual Interface Card (VIC) optimized for Azure Stack.
Cisco's Azure Stack offering will also provide a common interface for managing compute, an Azure Stack-optimized adapter, and networking with templates to support policies for multi-tenant environments. The Azure Stack offering on UCS will also include the Cisco ONE Enterprise Cloud Suite, which automates the deployment and management of applications and is offered with more than 20 hybrid cloud platforms.
Posted by Jeffrey Schwartz on 02/10/2017 at 3:29 PM0 comments
Microsoft's Enterprise Mobility + Security (EMS) service is getting a facelift. A unified EMS console will roll out in the coming months to customers of Microsoft's cloud-based platform for enrolling, configuring and securing devices and services, including Office 365.
The upgrade comes as Microsoft makes an aggressive push to accelerate growth of its combined enterprise mobile device management and identity management-as-a-service offering, announced less than three years ago as a bundle of Intune for device configuration and management, Azure Active Directory Premium and Azure Information Protection (aka Azure Rights Management).
More than 41,000 organizations have paid subscriptions to Microsoft's EMS with 4,000 signing up in the last quarter, according to a tweet by Corporate VP for Mobility Management Brad Anderson. Paid seats last quarter grew 135 percent, his tweet added.
The new EMS console will provide one common system for mobile device management and user policies, Anderson underscored in a blog post late last month, announcing the planned rollout. "This means that you no longer have to go to one console to set identity policies, and then another console to set device/app policies. It's all together," Anderson noted.
Customers will be advised when their existing EMS tenants will change, a transition Anderson said should be complete over the next several months. The new EMS console will be part of a Web-based portal that won't be dependent on the currently used Silverlight-based approach. Any new subscribers and those signing up for trials will automatically have access to the new EMS console, and existing customers can sign up for free trials if they want access to it right away.
"What we are delivering with this new EMS console is an integrated administrative experience that makes the end-to-end scenarios we've enabled far simpler, much more powerful, and even more flexible," Anderson noted. In an example of what the integrated administrative experience offers, Anderson's post described how admins can set conditional access policies.
"Conditional Access enables IT to define the rules under which they will allow access to corporate data -- which EMS then enforces in real time," Anderson explained. "With an integrated EMS console, we can now bring together all the different areas where IT wants to define risk polices that govern access -- this allows you to define a complete and comprehensive set of rules."
The EMS console lets IT managers define their own risk policies and set rules for access, such as whether certain log-in attempts should be deemed suspicious or whether a device meets an organization's mobile device management (MDM) policies. "We will now evaluate in real time the risk in each of those areas and only grant access to a service/application if the risk is within the constraints you define," he noted.
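To make the idea concrete, here is a hypothetical sketch -- not Microsoft's actual engine -- of how a conditional access evaluation might combine sign-in risk and device compliance into a single decision:

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    # Hypothetical inputs an admin-defined policy might weigh.
    sign_in_risk: str        # "low", "medium" or "high"
    device_compliant: bool   # meets the organization's MDM policies
    mfa_completed: bool

def evaluate_access(ctx: SignInContext) -> str:
    """Toy conditional-access evaluation; not Microsoft's algorithm."""
    if ctx.sign_in_risk == "high" or not ctx.device_compliant:
        return "block"
    # Medium-risk sign-ins are allowed only after multi-factor auth.
    if ctx.sign_in_risk == "medium" and not ctx.mfa_completed:
        return "require_mfa"
    return "grant"

# A medium-risk sign-in from a compliant device gets stepped up to MFA.
print(evaluate_access(SignInContext("medium", True, False)))  # require_mfa
```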
In addition to managing devices with the EMS console, EMS customers can apply policies to more than 3,000 third-party SaaS offerings, as well as applications running on premises.
Posted by Jeffrey Schwartz on 02/10/2017 at 12:33 PM0 comments
VMware is upgrading its Horizon virtual digital workspace portfolio to make desktop-as-a-service (DaaS) more viable over low-bandwidth networks and for those requiring high performance. The company said it is implementing new protocols and deployment options, including several different hyperconverged appliances and support for IBM's SoftLayer cloud.
The company announced three new additions to its end user computing lineup: the new Horizon Cloud Service, Horizon Apps and the Horizon 7.1 desktop and app delivery platform. The new offerings take advantage of protocols VMware has acquired and optimized to boost performance, making them more suitable for various virtual client and app environments, including Microsoft's Skype for Business. These improvements to Horizon are currently rolling out as technology previews, according to VMware.
VMware has extended its Blast virtual client protocol, introduced last year, with technology designed to reduce latency by adapting bit rates, reducing packet loss and providing better forward error correction. The company claims its new Blast Extreme Adaptive Transport (BEAT) offers 6x faster transfers over intercontinental networks and 4x when the connection is cross-continental, while reducing bandwidth consumption by 50 percent. These findings are based on the company's own testing, but officials say the protocol is also suited for use on public Wi-Fi networks. "It really opens up a lot of use cases, where now we can really deliver that great user experience," said Sheldon D'Paiva, a director of product marketing at VMware.
The company is also offering new integrated management controls. The new Just-in-Time Management Platform (JMP), which the company said provides an integrated approach to managing virtual desktops and apps, enables real-time app delivery, contextual policy management and improved configuration and provisioning. The initial JMP release will let customers take advantage of VMware's Instant Clones, previously only available for virtual desktops, extending support to published apps as well as cloud deployments.
"We feel bringing these best of breed technologies together allows customers to dynamically scale desktops and apps on demand both on premises and in the cloud to best meet their use case," said Courtney Burry, VMware's senior director of end user computing. "The benefit of what this provides to customers when you combine all of these things together is that they get the ability to pool their infrastructure on the back end to drive down costs. They can destroy and create sessions on demand for improved security. And users get a better, more personalized experience every time they log in."
Robert Young, research director for IDC's IT service and client virtualization software practice, said the addition of the Just-in-Time management tools should be welcomed by customers and prospects who want common controls across DaaS, on-premises and hybrid virtual client offerings. "If you're going to take advantage of Horizon and want to use Horizon Cloud on prem or in the public cloud, or a mixture of both, you can utilize that common set of management tools to do it," Young said. "That's important because a lot of large companies don't want to go all in on DaaS. They want to do some DaaS, still some on-prem. The ability to have those management tools function across those different environments is an interesting innovation from VMware in that respect."
JMP will be offered with the new Horizon App Volumes advanced edition, which offers RDSH apps and session-based desktops, and includes User Environment Manager, VMware vSphere and VMware vCenter. It's priced at $200 per named user and $325 per concurrent user. A standard edition that doesn't include the JMP technology is priced at $125 per named user and $200 per concurrent user.
The new Horizon Cloud Service will let organizations deploy virtual Windows desktops and apps using a VMware-managed service hosted on the IBM SoftLayer public cloud. VMware teamed up with IBM last summer, announcing at its annual VMworld conference that it would offer services managed on Big Blue's SoftLayer cloud. Horizon Cloud supports 3D graphics on devices that have Nvidia GRID virtual GPUs. In addition to using the public IBM SoftLayer cloud, the new Horizon Cloud service allows organizations to provision their own hyperconverged appliances; the company is supporting systems from its parent company, Dell EMC, as well as Quanta and Hitachi Data Systems.
Posted by Jeffrey Schwartz on 02/08/2017 at 9:00 AM0 comments
Slack last week showed it's not about to let Microsoft walk away with the market for chat-based workgroup collaboration without a strong fight. Addressing some of the weaknesses in Slack, which has emerged as a widely popular service for creating ad-hoc workgroups with its chat interface, the company announced its new Slack Enterprise Grid.
The move comes amid the pending release of Microsoft Teams this quarter, which will pose a formidable challenge to Slack's namesake chat service. Enterprise Grid aims to counter some of the objections by IT and compliance managers to the company's current offering by letting administrators control permissions and configure integrations on a per-workspace basis, the company highlighted in a blog post announcing the new release.
Enterprise Grid will also let administrators build shared channels between workspaces. Among improved functions, it will offer data loss prevention (DLP) controls and a console that'll let administrators manage security, policy and compliance across an enterprise, as well as customize policies and data retention settings for each workspace. Hoping to give the service more appeal to enterprises, Slack will also offer unlimited workspaces among multiple groups and departments.
Relatively new to the scene, Slack was founded in 2013 and counts numerous emerging companies as its customers but also has become increasingly popular in large organizations that include Accenture, Capital One Bank and IBM, among others. Slack is in the early stages of benefiting from the whole bring-your-own-device and so-called "shadow-IT" trend that has enabled services like Salesforce.com, Box, Dropbox and even Amazon Web Services to gain critical mass.
Courting Millennials
Beyond Slack's viral success in the workplace, Microsoft saw another alarming trend: millennials -- those born between 1980 and 1997 -- are expected to account for half the workforce by 2020 and have embraced Slack as a way of setting up ad-hoc collaboration teams. Slack's workspace resembles the familiar, consistent forms of communication many millennials use in their personal interactions, which is why it has become so popular in the workplace.
"As a younger generation are used to in their personal lives dealing with things like Snapchat and Google Hangouts and these other services with persistent chat capabilities, Slack is definitely a major attraction," said Shyam Oza, a senior product marketing manager at AvePoint, which has partnered with Microsoft to integrate with Teams. "There's not much of a learning curve in moving into the Slack Workspace."
Microsoft Teams Momentum
Yet Microsoft has one key advantage with Microsoft Teams: it'll be included with Office 365 subscriptions. This should give it significant gravitational pull with IT from a management and cost perspective. Microsoft Teams, announced in November, is slated for release within the next seven weeks, if not sooner.
In a blog post announcing the status of Microsoft Teams, Kirk Koenigsbauer, corporate VP of the company's Office group, revealed that more than 30,000 organizations in 145 markets and 19 languages have "actively" used the preview release during January.
Koenigsbauer tacitly suggested Microsoft Teams will address the shortcomings of the current Slack offering. In the coming weeks, he said, the company will release compliance and reporting features for Microsoft Teams aimed at addressing DLP and other security concerns.
"Great collaboration tools don't need to come at the cost of poor security or a lack of information compliance," Koenigsbauer said. Microsoft Teams will also come with WhoBot, based on the Language Understanding Intelligent Service (LUIS) developer model for creating chatbots, when the new service is released, Koenigsbauer added.
Google Aligns with Slack
Microsoft's "update" last week came conveniently timed on the day of the Slack Grid announcement, and likewise on the same day Google announced new enterprise controls for G Suite (formerly Google Apps). Google's announcement is relevant to Slack since the two in December extended their partnership to help keep G Suite in the battle against Microsoft's Office 365 juggernaut.
The G Suite update now makes it competitive with Microsoft's Office 365 E5 SKU, said Tony Safoian, CEO of SADA Systems, one of the largest systems integrators working with both companies' offerings. "It's one part of the story for them to offer a more comprehensive, secure, holistic and higher value enterprise offering," Safoian said in an interview last week.
While Google's partnership with Slack also helps broaden that focus, Safoian noted SADA in October announced a partnership with Facebook to offer its Workplace offering. "We plan to help customers implement this new service without organizational headaches and maximum productivity through user adoption programs, bringing about transformational value for their teams," Safoian said in a statement at the time.
It was very clear from the time of the Microsoft Teams launch event in early November that the company is putting a lot of weight behind it, claiming it already has 150 partners lined up to support it upon its release. AvePoint, a longtime SharePoint ISV, which now also supports Office 365, is among those with plans to support Teams.
Oza said that while the Slack Grid announcement promises to make that offering more compelling than its current version, he wants to see how it works, what tooling is offered, how well it integrates with existing infrastructure (including Microsoft's Active Directory) and how much it will cost.
"There's a lot in the [Slack] announcement that is powerful, and it shows they are moving in a positive direction and kind of addressing a lot of the criticism and gaps they face," Oza said. "It will be very interesting to see the product out in the wild for sure."
Posted by Jeffrey Schwartz on 02/06/2017 at 1:04 PM0 comments
Microsoft President and Chief Legal Officer Brad Smith today said that the company has formally requested that the U.S. government grant an exception from the travel ban enacted by President Donald Trump to its employees with nonimmigrant visas who live and work in this country. The president's executive order, enacted late last week, temporarily bans entry into the U.S. from seven countries: Syria, Iraq, Iran, Libya, Somalia, Yemen and Sudan.
In the formal letter to the newly sworn-in Secretary of State Rex Tillerson and Homeland Security Secretary John Kelly, Smith requested the exception for its employees, citing the secretaries' authority under Section 3(g) of the executive order. That clause allows petitioners with "pressing needs" to enter the U.S. As reported Monday, Microsoft has 76 employees with nonimmigrant visas who live and work in the U.S. and are affected by the travel ban.
In today's filing, Smith added that those 76 employees have 41 dependents, including one with a terminally ill parent. "These situations almost certainly are not unique to our employees and their families," Smith stated. "Therefore, we request that you create an exception process to address these and other responsible applications for entry into the country."
Citing the Section 3(g) clause of the executive order, Smith said "the Secretaries of State and Homeland Security may, on a case-by-case basis, and when in the national interest, issue visas or other immigration benefits to nationals of countries for which visas and benefits are otherwise blocked." Smith stated that this "is not only consistent with the Executive Order, but was contemplated by it."
The Microsoft employees and family members the exception would apply to have already gone through extensive vetting by the U.S. government when approved for employment with their nonimmigrant visas, he noted. That vetting includes background checks by the Interagency Border Inspection System (IBIS), which required fingerprint and name checks, along with submitting US VISIT's Automated Biometric Identification System (IDENT) fingerprint information, DHS's Traveler Enforcement Compliance Program (TECS) name check and National Crime Information Center (NCIC) information, Smith underscored.
Smith pointed to the disruption the travel restrictions are having on Microsoft and all companies with employees who need to travel internationally. "The aggregate economic consequence of that disruption is high, whether in administrative costs of changing travel plans or the opportunity cost of cancelled business meetings and deals," he noted. But he closed his request underscoring the human toll the executive order has taken since it was enacted last week.
"Even among just our own employees, we have one individual who is unable to start her new job in the U.S.; others who have been separated from their spouses; and yet another employee who is confronted with the gut-wrenching decision of whether to visit her dying parent overseas," Smith stated. "These are not situations that law-abiding individuals should be forced to confront when there is no evidence that they pose a security or safety threat to the United States."
Posted by Jeffrey Schwartz on 02/02/2017 at 9:27 AM0 comments
Looking to chip away at Microsoft's growth in Office 365 subscriptions, Google this week added new enterprise, management and security features to its rival G Suite. Google will roll out the upgraded features this month to organizations with G Suite Enterprise edition licenses.
Google's G Suite remains the most direct alternative to Office 365, given that the search giant offers a complete and comparable alternative platform comprising productivity, collaboration, messaging and voice communication services.
By most accounts, Office 365 has made strong inroads among enterprises of all sizes, as Microsoft moves to shift many of its Exchange and SharePoint customers to the cloud. But the market is still ripe and the upgrades Google is adding to G Suite Enterprise promise to make it more appealing to businesses by addressing critical data retention, corporate policy and security requirements.
Google claims it has 3 million paying customers, including PwC, Whirlpool and Woolworths, while Microsoft last week reported nearly 24.9 million consumer-based subscriptions of Office 365 -- up from 20.6 million a year ago. Microsoft hasn't revealed enterprise subscriptions, though it said its overall cloud business, which includes Azure, grew 93% year-over-year last quarter.
The upgrade to Google G Suite announced this week aims to close the gap between the two services by improving management controls and security. "Having greater control and visibility when protecting sensitive assets, however, should also be a top concern in today's world," wrote Reena Nadkarni, Google's product manager for G Suite, in a blog post announcing the upgrades. "That's why starting today, we're giving customers the critical control and visibility they expect (and their CTOs and regulators often require) in G Suite."
The new features include:
- Extended access control for administrators with Security Key enforcement, allowing IT management to require the use of these keys. Admins will also be able to manage the deployment of Security Keys and view usage reports, according to Nadkarni.
- Data Loss Prevention (DLP) extended to Google Drive, complementing the existing DLP for Gmail. Building on the DLP features launched in 2015, Nadkarni said administrators can now configure rules and scan Google Drive files and Gmail messages to enforce policies (a hypothetical sketch of such a rule follows this list).
- S/MIME support for Gmail. Customers will have the option to bring their own certificates to encrypt messages using S/MIME, while administrators will be able to enforce usage and establish DLP policies based on the requirements of specific lines of business.
- BigQuery integration with Gmail, aimed at offering extended analytics. The BigQuery integration aims to let IT run custom reports and dashboards.
- Support for third-party e-mail archiving. In addition to Google Vault, customers will now have the option to use other archiving services, including HP Autonomy and Veritas.
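To make the DLP concept concrete, here's a minimal Python sketch of how such a rule might be modeled. It's purely illustrative -- this is not Google's actual G Suite API, and the rule names and patterns are assumptions:

import re
from dataclasses import dataclass

@dataclass
class DlpRule:
    name: str
    pattern: re.Pattern
    action: str          # e.g. "quarantine" or "warn" -- hypothetical actions

RULES = [
    DlpRule("credit_card", re.compile(r"\b(?:\d[ -]?){13,16}\b"), "quarantine"),
    DlpRule("ssn", re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "quarantine"),
]

def scan(text):
    # Return every rule the text violates.
    return [rule for rule in RULES if rule.pattern.search(text)]

for rule in scan("Card number 4111 1111 1111 1111 attached."):
    print(f"{rule.name}: {rule.action}")   # prints "credit_card: quarantine"

A real service would apply rules like these to every Drive file and Gmail message in flight; the point here is only the shape of the rule-plus-action model.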
Whether these additions will move the needle on market share remains to be seen. But these features certainly up the ante by offering key data protection services many IT decision makers are demanding, and they should be welcomed by existing customers. Many of these features are slated to roll out this month, according to Google's G Suite Release Calendar.
Posted by Jeffrey Schwartz on 02/01/2017 at 10:32 AM0 comments
President Donald Trump's stunning executive order late Friday temporarily banning visitors with visas from seven countries entry into the United States has hit home for several prominent tech leaders, including Microsoft CEO Satya Nadella, an immigrant from Hyderabad, India. The global turmoil created by the immigration ban is predictably taking a huge toll on the technology community, which employs many immigrants.
While the administration underscored the ban is temporary -- 120 days for all refugees and 90 days for those from earmarked countries -- it quickly wreaked havoc among thousands of travelers. A federal judge on Saturday blocked part of the order, but the move caused chaos at U.S. airports where many travelers were detained and protestors held demonstrations. It notably created uncertainty among those working for tech companies as well as IT professionals and developers working in the United States. Tech CEOs were among the most vocal to raise alarms about the move.
Nadella joined a chorus of tech CEOs condemning the president's unprecedented ban. "As an immigrant and as a CEO, I've both experienced and seen the positive impact that immigration has on our company, for the country and for the world," Nadella said on his LinkedIn page. "We will continue to advocate on this important topic."
Nadella also pointed to an e-mail to employees by Microsoft President Brad Smith, who noted 76 employees are citizens of the countries banned by the administration: Syria, Iraq, Iran, Libya, Somalia, Yemen and Sudan. Microsoft has reached out to those employees individually, Smith noted, and encouraged any employees the company isn't aware of who hold citizenship in those countries, as well as those unsure whether they're affected, to reach out.
"As we have in other instances and in other countries, we're committed as a company to working with all of our employees and their families," Smith stated in his e-mail, which Nadella shared in his post. "We'll make sure that we do everything we can to provide fast and effective legal advice and assistance."
Smith also underscored Microsoft's commitment to all its employees who are immigrants. "We appreciate that immigration issues are important to a great many people across Microsoft at a principled and even personal level, regardless of whether they personally are immigrants," Smith stated. "Satya has spoken of this importance on many occasions, not just to Microsoft but to himself personally. He has done so publicly as well as in the private meetings that he and I have attended with government leaders."
Both Smith and Nadella were among a parade of tech leaders who met with President Trump in New York on Dec. 15, along with Amazon's Jeff Bezos, Apple's Tim Cook, Alphabet CEO Larry Page, Facebook's Sheryl Sandberg, IBM's Ginni Rometty and Oracle's Safra Catz. Immigration and global trade were among the issues discussed during that meeting. Nadella reportedly raised the issue of immigration with Trump during the gathering, according to a Recode report, in which the Microsoft CEO emphasized that much of the company's R&D spending takes place in the U.S. and benefits from the contributions of immigrants. The report indicated that Trump responded positively by saying "let's fix that," though he gave no specifics or promises.
Nadella's peers also condemned Trump's order. Among them was Apple's Cook, who reportedly told employees in an e-mail that he heard from numerous people concerned about the implications of the move. "I share your concerns. It is not a policy we support," Cook wrote, as reported by BuzzFeed. "In my conversations with officials here in Washington this week, I've made it clear that Apple believes deeply in the importance of immigration -- both to our company and to our nation's future. Apple would not exist without immigration, let alone thrive and innovate the way we do."
Google Cofounder Sergey Brin, who's also president of its parent company Alphabet, was spotted at a demonstration protesting the move in San Francisco. "I'm here because I'm a refugee," Brin reportedly told Forbes. More than 100 Google employees were immediately impacted by the ban, according to an e-mail to employees by CEO Sundar Pichai, obtained by Bloomberg. "It's painful to see the personal cost of this executive order on our colleagues," Pichai stated in the e-mail. "We've always made our view on immigration issues known publicly and will continue to do so."
The order was among a barrage of swift and controversial moves made by Trump since he was sworn in as the 45th U.S. president 10 days ago. Although the moves are in keeping with his campaign promises, which surely are pleasing to his supporters, it appears Trump is doing so with approaches that are testing the limits of presidential power. As rulings are issued, it remains to be seen what steps, if any, Congress will take. The economic consequences are uncertain, though the markets are down sharply today. Surely, that will hardly offer consolation to those who don't know what their futures hold.
Posted by Jeffrey Schwartz on 01/30/2017 at 10:26 AM0 comments
Cisco Systems this week made an offer to AppDynamics it couldn't refuse. Cisco's $3.7 billion deal to acquire AppDynamics, the leading supplier of application performance monitoring (APM) software, came just days before AppDynamics planned to go public.
Cisco reportedly paid more than twice what AppDynamics was aiming to fetch in its IPO, which was scheduled for Wednesday, leading some to shake their heads at the premium the networking giant was willing to pay. The deal also values AppDynamics at nearly 20 times its annualized revenue: the company recorded revenues of $158 million during the first nine months of its fiscal year 2016, according to its prospectus. While that pointed to more than a $200 million year for AppDynamics, the company's year-over-year growth during the nine-month period was 54%. Despite the premium paid, Cisco executives reminded Wall Street that it has a good track record for capitalizing on its big deals.
Cisco CEO Chuck Robbins responded to a question by CNBC about the premium paid, noting that AppDynamics is the leading supplier of cloud-based APM and argued it's growing nearly twice as fast as its next competitor, New Relic. Moreover, Robbins argued AppDynamics is growing faster than any publicly traded software company today.
Indeed, AppDynamics is regarded as the leading provider of cloud-based APM software. The eight-year-old company can take telemetry from the application stack down to the code layer, both to predict pending performance issues and to track the cause of those that occur, while providing business impact analysis. AppDynamics can monitor performance of applications in datacenters as well as public clouds including Amazon Web Services, Microsoft Azure and the Google Cloud Platform. As organizations move to hybrid clouds, software-defined infrastructures and use more Internet-of-Things devices, Cisco is looking toward the ability to holistically measure telemetry of apps at all of those tiers.
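For readers unfamiliar with APM, here's a toy Python illustration of the kind of code-level timing instrumentation such agents apply automatically. It is not AppDynamics' actual agent API; the threshold and function names are assumptions:

import functools
import time

SLOW_THRESHOLD_MS = 200.0   # hypothetical alerting threshold

def traced(func):
    # Record wall-clock timing for each call and flag slow ones,
    # roughly the way an APM agent instruments methods automatically.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            if elapsed_ms > SLOW_THRESHOLD_MS:
                print(f"SLOW: {func.__qualname__} took {elapsed_ms:.1f} ms")
    return wrapper

@traced
def lookup_order(order_id):
    time.sleep(0.3)          # stand-in for a slow database call
    return {"id": order_id}

lookup_order(42)             # prints a SLOW warning (~300 ms)

Commercial tools add distributed tracing, baselining and business-impact correlation on top of this basic idea.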
"The synergies between the application analytics that they can drive and the infrastructure analytics that we can drive across both private and public clouds creates business insights for our customers that no one else can deliver," Robbins told CNBC. "What we're doing with AppDynamics is to really help our customers understand what's going on in their environments."
During a conference call with media and analysts this week, AppDynamics President and CEO David Wadhwani explained why he decided to ditch the company's IPO at the 11th hour and accept Cisco's bid. "It's inevitable in my mind that we are moving to a world that's going to be focused on systems of intelligence," he said. "We are in that rarefied position that can redefine not just how IT departments operate but how enterprises as a whole operate."
AppDynamics is used by many large enterprises including Nasdaq, Kraft, Expedia and the Container Store. In all, AppDynamics claims that 270 of the 2000 largest global companies use its platform. Wadhwani said Cisco, which is used by almost all large organizations, has an opportunity to extend AppDynamics' reach through its extensive channel partner network. The deal to acquire AppDynamics also comes as Cisco is on the cusp of rolling out its Tetration analytics platform, which the company claims will provide monitoring and telemetry for all activity in the datacenter.
"We saw an opportunity here together to provide a complete solution," said Rowan Trollope, senior VP and general manager of Cisco's Internet of Things and applications business, during this week's analyst and media briefing. "Infrastructure analytics, paired together with application analytics, to provide not just visibility into the application performance, candidly visibility and insight into the performance of the business itself."
Enterprise Management Associates VP of Research Dennis Drogseth said that if Cisco can successfully integrate AppDynamics with Tetration, the combination could advance business performance management (BPM) technology. Vendors touting BPM solutions have yet to deliver on the promises of such offerings, according to Drogseth.
"One of the questions I'm not sure they know the answer to is are they going to have a central analytic pool? Will all of the data from AppDynamics feed into the machine learning engine of Teration? Or, when they have the two together, could Tetration move to a cloud environment? And how will those two integrate? But again, it could be a very promising combination."
Asked by analysts on a conference call announcing the deal why Cisco was paying such a high premium for AppDynamics, executives pointed to its 2012 acquisition of Meraki, which provided a cloud-based platform for central management of Wi-Fi networks and mobile devices, for $1.2 billion. At the time the deal was announced, Meraki's annual revenues were $100 million. "Today, it's at a $1 billion bookings rate," said Rob Salvagno, Cisco's VP of corporate development. "And we see the same potential for synergies in this opportunity."
Posted by Jeffrey Schwartz on 01/27/2017 at 1:15 PM0 comments
In a sign that Microsoft wants to utilize the technical chops of talent it has brought in with last month's acquisition of LinkedIn, Microsoft CEO Satya Nadella has named Kevin Scott to the newly created position of chief technology officer (CTO). Scott, who was the senior vice president of engineering at LinkedIn, joins Nadella's senior leadership team. He will be in charge of driving company initiatives. It's a dual promotion for Scott, who will continue at LinkedIn in a stepped-up role as senior VP of infrastructure.
The surprising move comes about seven weeks after Microsoft closed its $26.2 billion deal to acquire LinkedIn, by far the company's largest acquisition. With this choice, Nadella is signaling confidence that Scott can play a major role in bringing together the Microsoft and LinkedIn graphs that are key to creating new forms of search, machine learning and artificial intelligence capabilities.
When initially announcing the acquisition deal, Nadella indicated the potential LinkedIn brought to Microsoft to help combine users' professional social networks with the tools provided in the workplace. "You look at Microsoft's footprint across over a billion customers and the opportunity to seamlessly integrate our network within the Microsoft Cloud to create a social fabric, if you will, that can be seamlessly integrated into areas like Outlook, Calendar, Office, Windows, Skype, Dynamics [and] Active Directory. For us, that was an incredibly exciting opportunity," Nadella said at the time. (See Redmond magazine's August 2016 cover story analyzing the implications of the deal.)
Given Scott's deep engineering and management background at LinkedIn (and before that at Google and AdMob), it appears Nadella believes Scott will lead that effort. "We are thrilled that Kevin will bring to Microsoft his unique expertise developing platforms and services that empower people and organizations," Nadella stated in today's announcement. "Kevin's first area of focus is to bring together the world's leading professional network and professional cloud."
The brief announcement emphasized that Scott will remain active at LinkedIn and on its executive team. "I am very optimistic about where Microsoft is headed and how we can continue to use technology to solve some of society's most important challenges," Scott stated.
Scott's 20-year career spans academia as well as roles as a researcher, engineer and leader, according to Microsoft. In addition to LinkedIn, Scott has held engineering and management roles at Google and AdMob, and has advised a number of startups. Microsoft said Scott is also an active angel investor who founded Behind the Tech, an early stage nonprofit organization focused on giving visibility to lesser-known engineers responsible for, and involved in, advances in technology.
Posted by Jeffrey Schwartz on 01/24/2017 at 12:37 PM0 comments
After acquiring four companies over the past several years to extend beyond its core specialty of PC patch management, LANDesk has combined with Heat Software, and effective today the combined company is called Ivanti.
Heat Software is a SaaS-based provider of IT service management (ITSM) and endpoint configuration and control tools and is part of private equity firm Clearlake Capital's portfolio of companies; Clearlake earlier this month agreed to acquire LANDesk from Thoma Bravo. Terms weren't disclosed, though The Wall Street Journal reported the deal is valued at more than $1.1 billion.
While the individual brands will remain, at least for now, Ivanti is the new identity of the combined company. Steve Daly, LANDesk's CEO, will lead Ivanti. By combining with Heat Software, Daly said, LANDesk will accelerate its move from traditional endpoint management software to offering its products as SaaS-based tools.
"At LANDesk, historically we've been slow to the cloud," Daly acknowledged during an interview following the announcement of the deal. "From a Heat perspective, what it brings to us at LANDesk is first and foremost, their very robust cloud platform. First and foremost, that for us was the main strategic reason for this deal."
Daly said Heat has invested extensively over the past several years on bringing its ITSM tools to the cloud and he added that Heat offers a workflow engine that can manage endpoint lifecycle management. "That's really where the power is," he said.
LANDesk has a long history as a provider of patch management software but Daly has looked to extend its portfolio, most recently with last year's acquisition of AppSense, a popular provider of endpoint virtualization software. The other two companies under LANDesk's umbrella are Shavlik, which provides a broad range of security, reporting and management tools that include a System Center Configuration Manager (SCCM) add-in module, and Wavelink, a provider of mobile modernization and mobile enablement tools.
The combined company's broad portfolio includes privilege and patch management, security, IT asset management, ITSM, password control, desktop management and what Daly described as a complete suite of device management and reporting tools. Bringing together the two companies comes as Windows gains more security and self-updating features, and the task of managing endpoints now falls on both IT and security teams, Daly said.
"Because a lot of the management techniques are getting easier, the OS is building more and more management into the platform. It's really about how you secure that end user environment," Daly said. "This is particularly acute at the endpoints because the endpoint is such a dynamic environment, whereas the datacenter is pretty static and well controlled. Our endpoints change every day, as we download stuff or as we add content."
Going forward, Daly said he believes that the profile technology it uses for Windows 10 migrations and building support for mobile devices will become a key factor in delivering a so-called "digital workplace" because of the end user activity the platform gathers. "If you lose your laptop and you need a new one, bang, you don't lose anything -- we just grab your personality that we've watched and stored."
Posted on 01/23/2017 at 12:57 PM0 comments
As many retailers report declining in-store sales while consumers conduct more of their shopping online, Microsoft last week emphasized how a growing number of its retail customers are deploying IoT-based sensors to capture data that can help improve operations and sales in stores.
The company was among almost every major IT player showcasing technology with that very same focus at last week's annual National Retail Federation (NRF) show, held in New York. At this year's gathering, there was a greater focus on using beacons to deliver data that can help lift sales and control inventory, giving retailers the ability to replenish inventory more rapidly without overstocking, while also delivering promotional messages to customers on their phones based on their proximity to certain products.
"Every retailer knows they have products that go out of stock. But if you can quantify how long an item has been out of stock and convert that into dollar figure and then spread that across your whole store, they can react to it," said Marty Ramos, CTO of Microsoft's retail industry segment, during an interview at the company's booth at NRF. "So, we're seeing sensors that tell how much product we have on the shelf. We are doing that with shelf mats and a robot that roams by the shelf and just checks whether or not that shelf needs to be restocked. Analytics has this power where you can just pop back in these numbers. I love the fact that you can quantify these hidden metrics."
During a Microsoft-sponsored session at NRF, executives at Nordstrom Rack, Mars Drinks, Hickory Farms and Coca-Cola discussed how they're using some of the artificial intelligence (AI) and predictive analytics features offered with Azure Machine Learning and Azure IoT services.
Nordstrom Rack is in the midst of a pilot at 15 stores where it has deployed in-store beacons provided by Microsoft partner Footmarks, which connects the beacons to Azure IoT. The beacons track customers who are running an app on their phones to provide promotional offers based on their proximity to certain products. Among the 15 stores that have deployed the beacons, a better-than-expected number of customers have opted to use the app, said Amy Sommerseth, Nordstrom Rack's senior director of service and experience. Sommerseth also discussed the retailer's ability to reduce waiting times at the checkout counter and better help customers find merchandise. She described the pilot as successful, and the retailer plans to roll it out to all 260 of its stores this year.
Hickory Farms, whose business ranges from a pure online channel to seasonal pop-up locations in shopping malls, is deploying Microsoft's new Dynamics 365 to replace its legacy inventory management systems. The company plans to start rolling it out in April, initially to upgrade and automate back-office and inventory management, with the retail component slated to be operational by October in time for the peak holiday shopping season, said Gordon Jaquay, Hickory Farms' director of IT. Microsoft further described the two companies' implementations, among others showcased at NRF, in a blog post.
Consumer packaged goods companies, which rely on retail to distribute their goods, are also using some of these new services. Saurabh Parihk, a VP at Coca-Cola, discussed at the NRF session how the company has started using Azure services to gather data for those in the field serving a broad constituency, which includes grocery markets, restaurants and vending machine operators. "Our current 2017 big focus is on predictive capabilities," Parihk said. "That's where we are understanding how we can better predict demand using our internal transaction data as well as external data, and, at some time in the future, we want to do more optimization because of some of the capabilities are not there yet."
Posted by Jeffrey Schwartz on 01/20/2017 at 12:17 PM0 comments
Microsoft has set its sights on retail as an industry that could benefit from Blockchain technology. The company hosted a demo at its booth during this week's annual National Retail Federation (NRF) show in New York that aims to help retailers streamline their supply chain operations by creating smart contracts based on Blockchain.
A solution developed by Microsoft partner Mojix, best known for its RFID hardware and data analytics software, lets retailers automate their supply chains to enable smart contracts, making the delivery of goods more reliable with less overhead, according to officials at both companies. While RFID, which uses RF signals to track the whereabouts of high-value inventory and pallets, has gained major inroads among certain segments of the retail industry, notably apparel, retailers that have adopted it still lack holistic visibility and control over their entire supply chains.
The solution developed by Mojix allows for Blockchain-based smart contracts between retailers, suppliers and logistics providers. During a discussion in Microsoft's booth at NRF, Mojix VP of Products Scot Stelter explained how a grocery chain implementing a smart contract could stipulate that an order of blueberries had to be picked on a certain day, arrive within five days and stored within a specific temperature range throughout the logistics and shipping processes.
"At each step of the way, that's a smart contract, where effectively a box gets checked, cryptographically locked and published to the Blockchain," he said. "When I am at the end of the chain, I see it so I can track the prominence of those berries so when they arrive I know if they are fresh. All parties to a contract have to agree that all the boxes that are checkable. Once they are checkable, the contract gets locked and it fulfills itself."
The smart contracts are based on Microsoft's Azure Blockchain-as-a-service, code-named Project Bletchley, consisting of a distributed ledger that serves as an immutable record of every transaction, where specific values can be shared as desired, ensuring that even competitors in the chain can't access or compromise data not applicable to them.
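The following is a minimal Python model of the "checkbox" flow Stelter describes: each condition is verified, hash-chained and appended to a shared record, and the contract fulfills itself once every box is checked. It is not actual Project Bletchley code, which runs on a real distributed ledger, and the conditions and values are hypothetical:

import hashlib
import json

class SmartContract:
    def __init__(self, conditions):
        self.conditions = conditions    # condition name -> predicate
        self.checked = {}
        self.ledger = []                # append-only, hash-chained record

    def record(self, name, value):
        # Check one condition, then lock and "publish" the result.
        ok = self.conditions[name](value)
        self.checked[name] = ok
        entry = json.dumps({"condition": name, "value": value, "ok": ok})
        prev = self.ledger[-1]["hash"] if self.ledger else ""
        digest = hashlib.sha256((prev + entry).encode()).hexdigest()
        self.ledger.append({"entry": entry, "hash": digest})

    def fulfilled(self):
        # The contract "fulfills itself" once every box is checked.
        return all(self.checked.get(n) for n in self.conditions)

contract = SmartContract({
    "transit_days": lambda d: d <= 5,      # must arrive within five days
    "max_temp_c": lambda t: t <= 4.0,      # cold chain never exceeded 4 C
})
contract.record("transit_days", 4)
contract.record("max_temp_c", 3.2)
print(contract.fulfilled())                # True

The hash chaining is what makes the record tamper-evident: altering any earlier entry would invalidate every hash that follows it.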
For Mojix, offering smart contracts using Blockchain is a natural extension of its OmniSenseRF Inventory Service and ViZix IoT Software Platform, which provides location-based near real-time inventory management information and performance data. Microsoft has made an aggressive push to offer Blockchain services for the past year, as reported last summer. Almost all major banks and financial service companies are conducting extensive pilots using Blockchain, which is the technology that bitcoin currency is based on.
Microsoft has said it believes Blockchain has applications in many other industries as well, and Microsoft blockchain strategist Yorke Rhodes said the company is working with Mojix and other partners to help automate supply chains. "In a typical supply chain, you have 10 or more legal entities that are disparate from each other," Rhodes said during a session at NRF. "[The supply chain] is a prime example of where Blockchain technology comes into play. The nature of the shared distributed ledger actually allows all parties to be contributing to the same ledger without one party owning the ledger. And all parties agree on what is actually the one state of truth. So, there's huge benefits here, across industries."
Merrill Lynch and mining operator BHP Billiton are among those that are using Microsoft's Blockchain service, and Rhodes believes automating retail makes sense as well. "What we are trying to do is pick use cases across sectors to be leading-light use cases," he said, in a brief interview following yesterday's session.
Traditional retailers will need all the help they can get, as several reported slower in-store sales than their online counterparts during last quarter's peak holiday season. Since the beginning of the month, Macy's and Sears announced that they will close hundreds of stores, and Target today said in-store sales fell 3%, though its online sales surged 30%.
Posted by Jeffrey Schwartz on 01/18/2017 at 11:26 AM0 comments
Microsoft is taking another key step to advance artificial intelligence (AI) and machine learning with its agreement to acquire Maluuba, a Montreal startup that promises to overcome the limitations of today's search and AI systems. Maluuba, founded in 2011, has some of the most advanced deep learning and natural language research and datasets in the world, according to Microsoft.
The Maluuba team will become part of Microsoft's Artificial Intelligence and Research group. Announced last fall, the new group brings together 5,000 engineers and computer scientists from the company's Bing, Cortana, Ambient Computing and Robotics, and Information Platform groups. Microsoft has made advancing AI, machine learning and conversational computing a key priority. Cortana, Microsoft's digital assistant in Windows, is a key pillar of that effort.
But as Maluuba CTO and Cofounder Kaheer Suleman said last year at a demonstration in New York, today's digital assistants are confined to a limited number of buckets or domains. That's why Siri, Cortana and Google Now work well when asked the questions they are designed to answer. But ask anything outside those domains and they fall flat on their faces, Suleman said (see the 17-minute video and YouTube demo here).
"Maluuba's vision is to advance toward a more general artificial intelligence by creating literate machines that can think, reason and communicate like humans -- a vision exactly in line with ours," said Harry Shum, executive VP of Microsoft's AI and Research Group, in a blog post announcing the deal. "Maluuba's impressive team is addressing some of the fundamental problems in language understanding by modeling some of the innate capabilities of the human brain -- from memory and common sense reasoning to curiosity and decision making. I've been in the AI research and development field for more than 20 years now and I'm incredibly excited about the scenarios that this acquisition could make possible in conversational AI."
The company's founders, CEO Sam Pasupalak and Suleman, met in an AI class at the University of Waterloo and share a common goal of making AI more intuitive by developing machine learning techniques optimized for natural interactions. Professor Yoshua Bengio, described as "one of Deep Learning's founding fathers," was key in supporting their efforts and will advise Microsoft's Shum while maintaining his role as head of the Montreal Institute for Learning Algorithms.
"Understanding human language is an extremely complex task and, ultimately, the holy grail in the field of AI," the two said in a blog post announcing the Microsoft deal. "Microsoft provides us the opportunity to deliver our work to the billions of consumer and enterprise users that can benefit from the advent of truly intelligent machines."
The two also pointed to Microsoft's Azure public cloud as the optimal choice for applying the datasets the company has developed because of the GPUs and field-programmable gate arrays (FPGAs) used to bolster its global infrastructure, which the company revealed back at last fall's Ignite conference in Atlanta. Maluuba's most recent advance came last month when the company announced the release of two natural language datasets.
The first dataset is called NewsQA, which the company said can train algorithms to answer complex questions that typically require human comprehension and reasoning skills. It uses CNN articles from the DeepMind Q&A Dataset, described as a collection methodology based on "incomplete information and fostered curiosity." The questions in the dataset "require reasoning to answer, such as synthesis, inference and handling ambiguity, unlike other datasets that have focused on larger volumes yet simpler questions. The result is a robust dataset that will further drive natural language research," according to the company.
The other dataset, called Frames, addresses motivation and is based on 19,986 turns designed to train deep-learning algorithms to engage in natural conversations. The model simulates text-based conversations between a customer interacting with a travel agent.
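To illustrate what a NewsQA-style question-answering record involves, here's a hypothetical Python sketch. The field names and example are illustrative only, not Maluuba's exact schema, though answers in such datasets are commonly expressed as character spans into a source story:

# A story, a question and an answer given as a character span into the story.
record = {
    "story": "The storm closed three airports before moving offshore.",
    "question": "How many airports were closed?",
    "answer_span": (17, 22),    # character offsets into the story text
}

start, end = record["answer_span"]
print(record["story"][start:end])   # "three"

A model trained on thousands of such records must learn to locate the span, which for harder questions requires the synthesis and inference the company describes rather than simple keyword matching.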
Terms of the deal were not disclosed but the company was backed by Samsung Ventures.
Posted by Jeffrey Schwartz on 01/13/2017 at 12:40 PM0 comments
In its latest effort to extend its AirWatch platform beyond core mobile device management (MDM), VMware today said it has tapped Adaptiva to integrate its OneSite peer-to-peer software distribution tool with its new AirWatch Unified Endpoint Management (UEM) offering.
Adaptiva's OneSite is a popular option among large enterprises using Microsoft's System Center Configuration Manager (SCCM) for distributing OS and software images to thousands of client endpoints. OneSite's appeal is its efficient form of software distribution: rather than a server-based approach, it uses peer-to-peer distribution across end user endpoints, akin to a content delivery network (CDN). OneSite was designed specifically to bring this software distribution capability to SCCM; the pact with VMware, Adaptiva's first with the company, will extend OneSite's use while also offering an important option to organizations adopting the new AirWatch UEM.
OneSite will be built into AirWatch UEM and offered as an option sold by VMware. AirWatch UEM, launched in October, looks to broaden the focus of MDM to include configuration and lifecycle management. It's regarded as the most formidable alternative to Microsoft's Enterprise Mobility + Security (EMS), which includes its Intune device management tool and Azure Active Directory, Azure Information Protection and the ability to deliver Remote Desktop Services (RDS).
The new AirWatch UEM aims to include MDM but also is a platform for deployments, security and managing all endpoints. Aditya Kunduri, VMware's product marketing manager for AirWatch UEM, said UEM was designed to align with Windows 10's modern application framework, while allowing organizations to continue to run Win32 applications in their native modes.
"IT can now deploy their traditional Win32 applications, be it EXEs, MSIs or ZIP files from the same platform, and also their existing mobile applications," Kunduri explained. "And it's not just about pushing down an application -- they can also tie these applications with additional lifecycle management features such as attaching dependencies to those apps."
For example, if you are deploying WebEx Outlook plugins to your end users who don't have Outlook, it would be a waste of network resources, he said. "We provide this intelligent platform that can manage those features alongside just pushing down the app. And on top of that, you can manage the application patches when they're available and attach these to those files too," he said.
Kunduri argued that deploying to PCs that might have gigabytes of data is no longer practical with traditional PC Lifecycle Management (PCLM) tools (especially for organizations with thousands or tens of thousands of devices) because they must rely on distribution management nodes and servers, which add significant overhead and points of failure. "You don't need to manage those distribution servers and distribution points anymore because it's all delivered from the cloud and directly to the peers. Now you're reducing your network and bandwidth infrastructure so it's a huge cost savings," he said.
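Here's a toy Python model of the peer-to-peer sourcing Kunduri describes: a client asks nearby peers for each chunk of content first and falls back to the cloud only for chunks no peer holds. It is purely illustrative, not Adaptiva's actual protocol:

# The full image lives in the cloud; peers cache chunks they already pulled.
CLOUD = {f"chunk{i}": f"data{i}" for i in range(6)}

peers = [
    {"chunk0", "chunk1", "chunk2"},    # chunks cached on LAN peer 0
    {"chunk2", "chunk3"},              # chunks cached on LAN peer 1
]

def fetch(chunk_id):
    # Prefer a local-network peer; fall back to the WAN as a last resort.
    for n, peer in enumerate(peers):
        if chunk_id in peer:
            return CLOUD[chunk_id], f"peer{n}"
    return CLOUD[chunk_id], "cloud"

sources = [fetch(c)[1] for c in sorted(CLOUD)]
print(sources)   # only chunk4 and chunk5 ever touch the cloud

The design point is that WAN traffic scales with the number of unique chunks, not the number of endpoints, which is why distribution servers become unnecessary.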
Adaptiva COO Jim Souders said MDM platforms will continue to evolve into a common platform for managing all endpoints over the next few years. "Ultimately you will get to a common means of which to deploy software across disparate types of device types," Souders said. "Obviously, Microsoft and VMware are two of the bigger players going down that path. I think that's where there will be potential redistribution or affirmation of players."
Posted by Jeffrey Schwartz on 01/11/2017 at 1:10 PM0 comments
Following on last year's plan to deliver new virtual desktop- and application-as-a-service offerings on Microsoft's Azure public cloud, Citrix today released its Windows 10 desktop-as-a-service VDI offering, which runs on Microsoft Azure. Citrix also said its apps-as-a-service offering, poised to replace Microsoft's Azure RemoteApp, will arrive this quarter.
The new services, called Citrix XenDesktop Essentials and XenApp Essentials, were launched today at the annual Citrix Summit, taking place in Anaheim, Calif. Both are key new offerings developed with Microsoft as part of the broad extension of the companies' longstanding partnership announced back in May at Citrix's Synergy conference. Citrix XenDesktop Essentials, the first offering announced, lets organizations provision virtual Windows 10 instances as a service using their existing software licenses and deploy Windows 10 Enterprise images on Azure. The forthcoming XenApp Essentials will deliver apps as a service, replacing Azure RemoteApp.
This is aimed at "... those organizations seeking a simplified way to deploy Windows 10 virtual desktops in the Microsoft Azure cloud," said Calvin Hsu, VP of product marketing at Citrix, discussing several key announcements at its partner conference. "Microsoft customers who have licensed Windows 10 Enterprise on a per-user basis will have the option to manage their Windows 10 images on Azure through our XenDesktop VDI solution. Once XenDesktop Essentials is set up and running, the service can be managed by the Citrix Cloud."
The two companies, which have a longstanding partnership, described last year's extension of their pact as their broadest to date. In addition to offering VDI and app services on the Azure public cloud (managed by Citrix), the extended pact aims to offer a new delivery channel for Windows desktops and apps, including Skype for Business, Office 365 and Microsoft's Enterprise Mobility Suite (including Intune), via the Citrix Workspace Cloud. Citrix is running its digital workspace platform on Microsoft Azure and, through its service provider partners, will offer these new services via XenApp Essentials.
Citrix also said it will kick off a pilot for its network of service providers looking to offer its workspace platform using its cloud. Based on the licensing model found in other Citrix offerings, the company is looking for existing and new service providers to deliver hosted, managed desktop-as-service offerings and app workspaces.
Another pillar of last year's pact between the two companies included plans to integrate Microsoft's Enterprise Mobility Suite with Citrix NetScaler, the company's application delivery controller (ADC) and load balancing appliance. Citrix said the resulting integration of the two, the Citrix NetScaler Unified Gateway with Microsoft Intune, is now available. Citrix said the new offering lets administrators apply policies tied to Microsoft's EMS to NetScaler, allowing for conditional single sign-on access based on specific endpoint and mobile devices.
"Together, our solution allows IT administrators to define access control policies based on the state of the end user mobile device," explained Akhilesh Dhawan, principal product marketing manager at Citrix, in a blog post. "These policies will check each end-user mobile device before a user session is established to determine whether the device is enrolled with Microsoft Intune and is compliant with the security policies set by an organization and -- only then -- grant or deny access accordingly."
For customers looking for hybrid solutions, the company launched a new program that will provide hyper-converged infrastructure running on integrated appliances. Providers of hyper-converged infrastructure hardware that are part of the new Citrix Ready HCI Workspace Appliance Program will offer appliances designed to automate the provisioning and management of XenDesktop and XenApp.
Hewlett Packard Enterprise and Atlantis Computing are the first partners inked to offer a new solution based on the new program, according to Citrix's Hsu. The appliance will include XenApp or XenDesktop running on HPE's new Edgeline Converged EL4000 Edge System, a 1U-based system available in configurations from one to four nodes and four to 16 cores based on Intel's Xeon processors with GPU compute and integrated SSD storage. Included on the HPE system is Atlantis' USX software-defined storage offering, which creates the hyper-converged infrastructure delivering the Citrix Workspace.
Citrix also announced today that it has acquired Unidesk, a well-regarded supplier of virtual application packaging management software that, among other things, can manage both Citrix XenDesktop and Microsoft's Remote Desktop Service (RDS). Citrix said Unidesk's application layering technology "offers full-stack layering technology, which enhances compatibility by layering the entire Windows workspace as modular virtual disks, including the Windows operating system itself (OS layer), apps (app layers), and a writable persistent layer that captures all user settings, apps and data."
The company also described the latest Unidesk 4.0 architecture as a scalable solution that offers the company's app-layer technology and aims to ease customers' transition to the cloud by providing a single app image that covers both on-premises and cloud-based deployments. Citrix said it will continue to offer Unidesk as a standalone product for VMware Horizon and Microsoft virtual desktop customers.
Posted by Jeffrey Schwartz on 01/09/2017 at 1:42 PM0 comments
Looking to create a new way for digital content creators to interact with Microsoft's forthcoming Windows 10 Creators Update, Dell took the wraps off a novel desktop device with a next-generation display that promises to create more immersive user experiences. The new Dell Canvas debuted at this week's Consumer Electronics Show (CES) in Las Vegas and effectively merges display and input into a single device.
At first glance, the 27-inch device will compete with Microsoft's recently launched Surface Studio. While both are aimed at bringing forth the new input capabilities coming to Windows 10 Creators Update later this quarter, the Surface Studio is an all-in-one computer while the Dell Canvas is only a display that connects to any PC running the new operating system.
Like the Surface Studio, the Dell Canvas will appeal to digital content creators ranging from engineers and industrial designers to users in finance who create simulations and models. The Dell Canvas even supports a new interface device called the Totem. The Totem, about the size of a hockey puck, is similar to the new Surface Dial, which Microsoft introduced as an option for the Surface Studio.
Despite the similarities, Dell believes the new offering, which the company's engineers developed in collaboration with Microsoft's Windows team for more than two years, will offer a more immersive and interactive user experience. "It's transitioning from physical analog interactions that are hardware based to more dynamic digital-based interactions," explained Sarah Burkhart, a product manager with Dell's precision computing group, who gave me a demonstration of the Dell Canvas during a briefing last week in New York.
Burkhart believes that the Canvas and the use of the Totem, which, unlike the Surface Dial, delivers digital signals and can stick on the surface of the display, will open new ways for vertical software vendors such as Avid, Cakewalk, Solidworks and Sketchable to interact with Windows. "We have been pleasantly surprised at the interest in horizontal plus vertical from lots of areas outside of digital content creation," she said.
The Surface Dial will also work with the Dell Canvas, Burkhart said. The device comes with the Dell Canvas Pen, based on the electro-magnetic resonance method, and supports 20 points of touch. It can also be positioned at any angle or lie flat.
Posted by Jeffrey Schwartz on 01/06/2017 at 1:19 PM0 comments
Drivers of Volvo's newest 90 Series of cars will be able to initiate and participate in calls and conferences using Microsoft's Skype for Business. Volvo is displaying the new in-dash Skype for Business feature at the Consumer Electronics Show (CES), taking place this week in Las Vegas.
Skype for Business will be part of the dashboard infotainment experience in Volvo's high-end S90 sedan, as well as the company's XC90 SUV and V90 Cross Country wagon. Drivers with Skype for Business subscriptions, offered with Office 365 licenses, use an in-dash application that lets them bring up their schedules and contacts to initiate or join calls and conferences.
"With the addition of Microsoft Skype for Business app, Volvo will eliminate the need for fumbling with your phone while driving or remembering long conference call codes," according to the narrative of a one-minute video advertising the feature. Once calls are completed, drivers can send themselves a note with the recording of the call.
"We see a future where flexible in-car productivity tools will enable people to reduce time spent in the office," said Anders Tylman-Mikiewicz, Volvo's VP of consumer connectivity services, in a prepared statement. "This is just the beginning of a completely new way of looking at how we spend time in the car."
Volvo is among a handful of automakers that have recently begun working with Microsoft to improve manufacturing and in-car experiences. Just over a year ago, Volvo revealed it was experimenting with Microsoft's HoloLens enhanced reality headsets. Other noteworthy automakers working with Microsoft include Nissan, Toyota and BMW.
Microsoft recently showcased its work with BMW at last fall's Ignite conference in Atlanta, where Executive VP for Cloud and Enterprise Scott Guthrie demonstrated software that he said provides better customer engagement and an "immersive end-user experience" within the vehicle that spans both dashboard display and native mobile apps that the car owner can use to manage the car remotely. BMW built the capabilities using Azure IoT and the company's Azure Data Analytics services.
Asked if Volvo plans to make Skype for Business available in all of its models, a spokesman said for now the company is only offering it in the 90 Series 2017 models. Volvo said the company is also exploring the possibility of embedding Cortana, Microsoft's digital assistant technology built into Windows 10, into its vehicles to enable voice recognition.
Volvo signaled strong interest in extending the use of voice recognition into its cars at last year's CES, when the company showcased the ability to control navigation, the heating system, door locks, lights and the horn using its Volvo on Call app via the Microsoft Band 2. A few months ago, Microsoft quietly discontinued the Microsoft Band, which could explain why Volvo is now looking at Cortana.
Posted by Jeffrey Schwartz on 01/04/2017 at 2:06 PM0 comments
After reportedly flirting with acquiring Slack -- the chat-based collaboration service popular with millennial workers -- for $8.5 billion, Microsoft decided to build rather than buy. The result was Microsoft Teams, which the company unveiled last month and will add to Office 365 Business and Enterprise subscriptions early next year. Apparently not taking the competition lightly, Slack ran a full-page ad in The New York Times on Nov. 2, the day of the launch, with a lengthy letter to Microsoft warning the company that chat-based collaboration is more complicated than it looks. The move was borrowed from the late Apple CEO Steve Jobs, who famously ran the "Welcome, IBM. Seriously." ad in 1981 when Big Blue unveiled its first PC.
Apparently realizing talk is cheap, Slack has found a potentially more potent way to defend its turf by extending its ties with Google. Slack already works with Google Docs, representing one of its early integration deals. Google Drive files are imported into Slack roughly 60,000 times each weekday, according to Google, which calculates that works out to a file shared every 1.4 seconds. Now the two companies have embarked on a joint development pact, which Google announced last week.
"We're increasing our joint product and engineering efforts to strengthen the link between the content in Google Drive and the communication in Slack," said Nan Boden, Google's head of technology partners, in a blog post. The two companies are going to develop the following capabilities:
- Instant permissions: Customized permissions in Google Drive will ensure files shared in Slack are immediately available to the proper teammates.
- Previews and notifications: Google Drive and Docs notifications will be delivered in Slack, which will enable previews (see the sketch after this list).
- Slack channels tied to Team Drives: Team Drives, the centralized repository for shared content Google announced in September, will stay synchronized with Slack channels. Slated for release early next year, the pairing will let customers keep content and conversations in sync.
- Provisioning Slack from G Suite: Administrators will be able to provision Slack from the G Suite (the new name for Google Apps) admin console.
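As a rough idea of how an integration like this can surface a Drive file in a channel, here's a short Python sketch using Slack's public chat.postMessage Web API. The token, channel and file URL are placeholders, and this is a generic illustration, not the actual joint Google-Slack implementation:

import requests

SLACK_TOKEN = "xoxb-your-bot-token"    # placeholder credential
CHANNEL = "#design-team"               # placeholder channel

def share_drive_file(file_url, title):
    # Post a link-formatted message; Slack can unfurl the file link as a preview.
    resp = requests.post(
        "https://slack.com/api/chat.postMessage",
        headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
        json={"channel": CHANNEL,
              "text": f"New Drive file shared: <{file_url}|{title}>"},
    )
    resp.raise_for_status()
    return resp.json()

share_drive_file("https://drive.google.com/file/d/abc123", "Q1 roadmap")

The joint work described above goes further, wiring Drive's own notifications and permission checks into Slack rather than relying on a bot posting links.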
Google also said it aims to make it easier for users to find and access files, simplify group scheduling, add more capabilities to its Google Sheets spreadsheet app and add support for external conferencing, among other new features. With Microsoft and the Google-Slack tandem both targeting next quarter to release their respective new offerings, the battle for the mindshare of millennials in the workplace is set to intensify in the new year.
Posted by Jeffrey Schwartz on 12/15/2016 at 1:17 PM
Deciding which vendors' security tools to implement is a complex process, especially as threats and attack vectors change frequently, along with the environment itself (new infrastructure, apps and devices) that IT pros need to protect. But an even bigger challenge for IT professionals is finding skilled security experts.
A survey conducted by Intel Security's McAfee and the Center for Strategic and International Studies, released last summer, pointed to a global shortage of skilled cybersecurity professionals. The report gave a grim assessment of the current availability of those with security expertise across all disciplines. "The cybersecurity workforce shortfall remains a critical vulnerability for companies and nations," the report's introduction warned. "Conventional education and policies can't meet demand. New solutions are needed to build the cybersecurity workforce necessary in a networked world."
The survey of 775 IT decision makers in eight countries (including the U.S.) found that 82 percent are coping with a shortage of cybersecurity skills in their IT department, 71 percent report that the lack of talent is causing direct and measurable damage and 76 percent believe their respective governments aren't investing adequately in educating or training cybersecurity talent.
The three skill sets respondents said are in highest demand are intrusion detection, secure software development and attack mitigation. The report also estimated that up to 2 million cybersecurity jobs will remain unfilled in 2019. In the U.S., 209,000 open cybersecurity positions went unfilled and job postings have risen 74 percent over the past five years, according to the Bureau of Labor Statistics.
IT compensation expert David Foote, founder of Foote Partners and author of this month's Redmond feature "IT Jobs Poised To Pay," emphasized the shortage of cybersecurity experts during the opening keynote address at last week's annual Live! 360 conference, held in Orlando, Fla. "Companies are struggling to make this leap from information security to cybersecurity," Foote said. "Information security in so many companies and in so many regulated industries [is addressed] with very skeletal staffs."
Foote's report for the third quarter of 2016 showed that skills-based pay premiums for cybersecurity experts increased 10.7 percent for the year. Based on data compiled across 65 U.S. cities, the average salary for an information security analyst is $99,398, while senior information security analysts earn $127,946. The average salary for security architects was $133,211. At the management tier, VPs of information security earned an average of $206,331, while security managers averaged $137,466.
Posted by Jeffrey Schwartz on 12/14/2016 at 10:48 AM
Intel Security's True Key password manager can now work with Windows Hello, enabling multifactor authentication using biometrics.
The integration of both authentication capabilities comes nearly two years after Intel Security introduced True Key at the January 2015 Consumer Electronics Show. At the time, Intel Security saw it as a better alternative to Windows Hello, a feature delivered in Windows 10 six months later.
Intel Security Chief Technology Officer Steve Grobman hadn't ruled out supporting Windows Hello at some point, but during an early 2015 interview he pointed out that True Key targeted all devices and operating systems, not just Windows 10.
"I think Microsoft has done a lot of good things in Windows 10, but I don't know if Hello and Passport will completely change the use of passwords right away because users have many devices they work on," Grobman said at the time. "But Microsoft is definitely taking steps in the right direction." Intel described True Key as a password manager that supports multifactor authentication including fingerprints and any 2D camera, and works with Intel's RealSense cameras for more extensive security.
Now Intel Security -- which will become McAfee once TPG completes its $4.2 billion buyout of a 51 percent majority stake in the business unit, with Intel Corp. retaining a 49 percent minority investment -- is offering True Key as an app extension that integrates with Windows Hello via the Edge browser. The extension lets users add Windows Hello as an authentication factor to a True Key profile, which remembers existing and new passwords. True Key will then automatically enter a username and password when logging in to apps and Web sites following Windows Hello authentication.
"The password problem won't disappear overnight, which is why working with Windows Hello is a big step in the shared vision between Intel Security and Microsoft of a password-free future," said Intel Security CTO for True Key Richard Reiner in a statement. "By providing the True Key app with its enhanced multi-factor protection and support for dynamic Web form-filling, we continue to build an application that will encourage better password management and online security."
Users can download the free app from the Windows Store and then enable Windows Hello in their security settings.
In addition to Edge, Intel Security last week said True Key also supports Internet Explorer, Chrome and Firefox. Intel Security has also extended to three the number of authentication factors Android and iOS users can combine. Supported factors include facial recognition, fingerprint, trusted device and a master password. Android users can also now authenticate with a fingerprint in the Android browser or Opera.
Posted by Jeffrey Schwartz on 12/12/2016 at 12:18 PM
Microsoft for the first time will require certificate authorities to institute minimum requirements for how digital certificates tied to Windows-based executables and scripts are verified, a move made in hopes of making it much more difficult to distribute malware. The new requirements apply to code signing, the process of applying digital signatures to executables and scripts to verify an author's identity and validate that the code hasn't changed and isn't malicious. Following two years of discussions, the Certificate Authority Security Council (CASC), a group that includes the top seven CAs, this week said its members have agreed on code-signing requirements for Windows-based systems.
The new requirements will apply to Windows 7 and above, versions that introduced the dynamic root store update program enabling roots to be removed easily from the root store. CASC officials said standardizing code signing has become essential to verifying that software installed on an OS is authentic.
Unsigned code and executables won't run in Windows and most browsers without explicit user approval. However, more sophisticated rogue actors have obtained seemingly legitimate certificates, and because CAs had no common code-signing standards, a rogue signature only needed to get by once for malicious code to spread. The new CASC requirements, which the CAs and Microsoft will implement Feb. 1, aim to block such attempts to distribute malicious code under fraudulently obtained signatures. CASC officials said Microsoft will be the first to institute the new guidelines, with other key players expected to follow in the near future.
"The main aim was to encourage better [digital] key protection, make sure there was a standard for validating identity information within digital certificates and to make sure there is a very prompt and streamlined process for revoking certificates if they are found to be used with malware. And then implement brand new standards for time-stamping services so that you can time-stamp your code and it will work on a longer period," Jeremy Rowley, VP of business development at DigiCert and a member of the CASC, said. "What we came up with is something everyone is happy with. It looks like it will accomplish those advantages." Rowley said all of the members of CASC are supportive of the new code-signing requirements, including the top CAs, which in addition to his company include Comodo, Entrust, GlobalSign, GoDaddy, Symantec and Trustwave. "The entire CA community and industry have bought into this," he said.
"Since it's being added to Microsoft's policy and part of their root distribution policy, it ends up being a mandatory item for any CAs working with that policy to follow the guidelines," added Bruce Morton, another CASC member and a director at Entrust Certificate Services. "In other words you didn't have a choice."
Morton added that this will extend beyond just Windows and Microsoft's browsers. "We did write the policy so it could work with non-Microsoft root policies with the expectation that other browser providers or other software vendors who rely on code-signing certificates would eventually want to use it," he said.
Having spent the entire week at the Live! 360 conference in Orlando, I asked some security experts attending TechMentor sessions about the new rules. MVP Sami Laiho, CEO of Adminize, who last week disclosed a Windows in-place upgrade security flaw, said the move is important.
"It's very big, because before this the whole certificate issuing industry has been the biggest cause of lacking trust," Laiho said. "We've had these issuers but we've had no restrictions on who the issuers can be or how they operate. This will increase security on the technical side. The whole issue of this is the whole concept of finally having some sort of a certification for those partners."
Dale Meredith, an ethical hacking author for Pluralsight, was among a few who wondered if the move will make it harder for legitimate users such as students, researchers and startups. Nevertheless, Meredith agreed with Laiho that it should improve security. "It will definitely make it harder for attackers," he said. "If it's done right it will protect users and companies from malicious attacks."
CASC spelled out three of the key guidelines:
- Stronger protection for private keys: The best practice will be to use a FIPS 140-2 Level 2 HSM or equivalent. Studies show that code signing attacks are split evenly between certificates issued to bad publishers and certificates issued to good publishers that unknowingly allow their keys to be compromised, which lets an attacker sign malware stating it was published by a legitimate company. Therefore, companies must either store keys in hardware they keep on premises or in a new secure cloud-based code-signing service.
- Certificate revocation: Most likely, a revocation will be requested by a malware researcher or an application software supplier like Microsoft if they discover users of their software may be installing suspect code or malware. After a CA receives a request, it must either revoke the certificate within two days, or alert the requestor that it has launched an investigation.
- Improved time-stamping of code signatures: CAs must now provide a time-stamping authority (TSA), and the guidelines specify requirements for the TSA and its time-stamping certificates. Application software suppliers are encouraged to let code signatures remain valid for the life of the time-stamp certificate. The standard allows time-stamping certificates of up to 135 months.
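To make the time-stamping guideline concrete, here is a toy sketch of the validity logic it implies: a time-stamped signature keeps verifying after the signing certificate expires, so long as the code was signed while the certificate was good and the time-stamp certificate itself is still current. This illustrates the policy only; it is not a real verifier.

```python
from datetime import datetime

def signature_still_valid(signed_at, signing_cert_expiry,
                          signing_cert_revoked_at, timestamp_cert_expiry, now):
    """Toy policy check for a time-stamped code signature."""
    signed_while_valid = signed_at <= signing_cert_expiry
    signed_before_revocation = (signing_cert_revoked_at is None
                                or signed_at < signing_cert_revoked_at)
    timestamp_current = now <= timestamp_cert_expiry
    return signed_while_valid and signed_before_revocation and timestamp_current

# Code signed in 2016 still verifies in 2018, even though the signing
# certificate expired in 2017, because the time-stamp vouches for the
# signing date and the time-stamp certificate runs to 2027.
print(signature_still_valid(
    signed_at=datetime(2016, 12, 1),
    signing_cert_expiry=datetime(2017, 12, 1),
    signing_cert_revoked_at=None,
    timestamp_cert_expiry=datetime(2027, 12, 1),
    now=datetime(2018, 6, 1)))  # True
```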
The CASC published a technical white paper that describes the new best practices, which is available for download here.
Posted by Jeffrey Schwartz on 12/09/2016 at 12:43 PM
Rapid changes in technology, and businesses finding they must become more agile to survive, mean IT pros and developers need to recalibrate their skills accordingly. That was the call to action from David Foote, founder and chief analyst at Foote Partners, who gave the opening keynote at the annual Live! 360 conference, taking place this week in Orlando, Fla.
Demand remains strong for IT pros and developers who have honed their skills on the right technologies. But strong "soft skills" and knowledge of the business are also more important than ever, and those who can't keep pace with those requirements are likely to find a more challenging environment in the coming years, Foote warned.
Foote would know. His company has spent the past two decades tracking IT salaries to create quarterly and annual benchmark reports based on data from 250,000 employees at companies in the U.S. and Canada in 40 different industries. The scope covers 880 IT job titles and skills, both certified and non-certified.
Foote's keynote coincided with his most recent findings and analysis, published in this month's issue of Redmond magazine. "It's a great time to be working in technology. There's so many jobs, there is so much need coming down the pike," Foote said. Many IT pros and developers, nevertheless, may find those jobs coming in areas they have never considered, he said. One example: as the growth of the Internet of Things continues to accelerate, more software experts will need to learn specialized hardware skills.
Likewise, cross-skilling will be critical, he said. Hype aside, digital transformation initiatives are placing skills premiums on experience in key areas that include architecture, Big Data, governance, machine learning, Internet of Things, cloud, microservices, containers and security. These areas are among those on this quarter's "hot skills" list, and they command skills premiums averaging 12 percent.
Skills-based pay is the premium a specific skill commands on top of base salary; the percentage represents the share of base pay employers will add for that skill. That pay is guaranteed without performance targets, but the percentages can fluctuate from year to year as demand for skills changes. Currently, large organizations are looking to put these digital transformation efforts on a fast track for their business survival, Foote explained.
"More and more companies are creating innovation groups," Foote said. "They are usually not part of IT, they are on their own. They may take people from IT, but they figured out that this has to be a department that sits in the business and they need to be left alone to do their stuff because they are creating the future of the company. These companies are betting that their money on these digital groups as something they need now but they are really going to need five to ten years from now."
Salaries for those at the director level who can run these efforts start at $225,000, with bonuses often 50 percent of base. With stock options, these experts can earn more than $400,000 per year, he said. At the same time, many IT people find themselves working in organizations for many years earning below market rates.
"Don't feel bad if you're basically happy but could do better. Because, as part of my advice here is that you probably need to look for a job somewhere else," Foote said. "There's a thing called salary compression, which if you are in the compensation world, it means that job is growing faster in the market in terms of pay value than the annual increase that you are getting at the company."
The need for companies to become more agile is often leading to more contingent hiring, where instead of employing full-timers, organizations hire consultants or those willing to work on a contract basis. "One of the reasons we have such an under-employed population out there in the world is that companies are just not hiring, because it's expensive, and they need an agile workforce. They need somebody to come in and get this work done," he said.
Another reason for turning to the so-called "gig economy" is that hiring the right person on a full-time basis is time-consuming and complex these days, he noted. Many organizations will later hire those consultants and contract workers full time, when it makes sense, he added.
To view his entire presentation and see what's in store, click here.
Posted by Jeffrey Schwartz on 12/07/2016 at 1:31 PM
If there was any doubt Microsoft would hit its target of closing the $26.2 billion acquisition of LinkedIn by year's end, the European Union yesterday put it to rest. The EU's approval of the deal, announced in June, clears the final regulatory holdup and allows it to close in the coming days.
The approval dashes the last hope of those objecting to the deal -- notably Salesforce.com CEO Marc Benioff, who filed objections with U.S. regulators and subsequently with the EU. Benioff, who was outbid by Microsoft in an attempt to acquire LinkedIn, raised concerns that Microsoft would use its large Office and Windows market share to lock out rival social networking services.
Before clearing the deal, the EU last month sought some concessions from Microsoft, which had reportedly offered compromises. Microsoft President Brad Smith outlined the key commitments the company will maintain for at least five years. Among them, Microsoft will:
- Continue to make its Office add-in program available to third-party professional social networking providers, which will let developers integrate those services into Outlook, Word, PowerPoint and Excel.
- Maintain programs that allow third-party social networking providers to promote their services in the Office Store.
- Allow IT pros, admins and users to customize their Office configurations by choosing whether to display LinkedIn profile and activity information in the user interface when Microsoft delivers those anticipated future integrations.
- Ensure PC manufacturers aren't required to install new LinkedIn apps or tiles in the European Economic Area (EEA). Likewise, Microsoft is promising not to hinder users from uninstalling the apps and tiles. Microsoft also said it won't use Windows itself to prompt users to install a LinkedIn app, though the app will remain in the Windows Store and customers may be prompted in other ways to use it.
In the EEA, Smith also said that Microsoft has agreed not to form agreements with PC makers to pre-install LinkedIn exclusively, thereby blocking competitors. "We appreciated the opportunity to talk through these and other details in a creative and constructive way with the European Commission," Smith noted in yesterday's post. Having cleared approval in Europe, Smith said Microsoft is ready to move forward.
"Microsoft and LinkedIn together have a bigger opportunity to help people online to develop and earn credentials for new skills, identify and pursue new jobs and become more creative and productive as they work with their colleagues," Smith noted. "Working together we can do more to serve not only those with college degrees, but the many people pursuing new experiences, skills and credentials related to vocational training and so-called middle skills. Our ambition is to do our part to create more opportunity for people who haven't shared in recent economic growth."
Posted by Jeffrey Schwartz on 12/07/2016 at 10:45 AM
Box last week continued to show it can grow its cloud-based file sharing and collaboration enterprise business despite competition from Microsoft and Google, among other players, by posting more than $100 million in revenues in a quarter. Despite competing with Microsoft's OneDrive, SharePoint and Office 365, Box CEO and Cofounder Aaron Levie believes his company's partnership with Redmond is beneficial.
Upon announcing third-quarter fiscal 2017 revenues of $102 million, the company projected improved demand in the current quarter, with sales expected to break $108 million. "The need for Box is clear," Levie said during last week's earnings call (according to a transcript). "Today, business content is spread across separate legacy systems, on-premises storage, disparate collaboration and workflow tools, and sync and share solutions."
Box last week released several new ties to Office 365, including integration with the Office 365 Android client, the ability to create files in Office Online and save them to Box, and improved search, sorting and previewing of Excel files in Box. The company described the added integration capabilities in a blog post.
In an interview last week, Levie and I discussed the partnership between Box and Microsoft and the benefits and risks of teaming up with a company that has a major stake with its own collaboration platform.
"I have to say, Microsoft has really transformed itself over the past couple of years to being a much more partner-friendly and customer-friendly organization, and I can say we are getting that feedback from customers right now because of that openness has been phenomenal," Levie said. "I think unequivocally what they [Microsoft] are realizing is with billions of people on the Internet and on mobile devices, the market is so much bigger. It is so much broader in terms of opportunity and they are taking advantage of that by partnering and building for multiple platforms."
Asked if Box will provide integration with Microsoft Teams, the new chat-based service the company will add to Office 365 business and enterprise subscriptions, Levie said that presuming there's demand for it, Box will provide the integration.
"I haven't played with it yet but I would say strategically this is an important space," Levie said. "The future of communications is no longer going to be just dominated by e-mail. I don't believe that e-mail necessarily goes away but I think we are going to use different tools to communicate in different contexts. It's not exactly just instant messaging like they have with Skype and Lync, and it's not as asynchronous as e-mail. It's really a space in the middle. Microsoft is recognizing that this is a very real opportunity, so I think they have to go after this space. That being said, Slack is rapidly growing and I think what we are going to all benefit on as users and partners in this ecosystem is more innovation."
Levie didn't always see Microsoft as an innovator or a good partner. Years ago, Box was founded to offer an alternative to SharePoint and Office. While Levie still believes that Box offers a superior cloud-based content management solution for large enterprises, he also said there's room for both, while arguing that the two offer very different types of capabilities.
"Our focus is that we're trying to build an incredibly simple, yet robust platform to help enterprises manage, collaborate and secure all of their data. And when you look at what we built up, it builds a very different kind of experience than SharePoint or OneDrive," Levie said. "In many respects, we've been building out a very different kind of product over the past decade where it's much more of a platform. it's a real end-to-end content platform that can solve every use case around working with documents, working with files and working with your information. But then importantly, it connects into every application in your organization and that's what's fundamentally different about Box and any other product in this space."
The new security and governance features Box has rolled out this year are also taking hold, Levie said on last week's earnings call, noting a $500,000 deal with a multinational pharmaceutical company centered on the new Box Governance offering. Box Governance lets organizations meet data retention requirements and, most recently, security classification requirements.
"Security is massive," Levie said. "It's one of the key reasons why customers will go to Box. It's one of the bigger catalysts that drive our growth and more and more we have chief information security officers that are driving the adoption of Box within the enterprise."
Posted by Jeffrey Schwartz on 12/05/2016 at 12:07 PM
Amazon executives this week left no stone unturned, unleashing an extensive barrage of new deliverables covering a wide gamut of services -- many of which will define the company's next generation of cloud offerings. This week's annual springboard of new offerings ranged from added instance types to services aimed at the next wave of IT, which, for many business and technology planners, rests on digital transformation efforts.
During five hours of keynotes spanning two days at the annual AWS re:Invent conference in Las Vegas, Amazon executives set out to prove that their company's cloud dominance won't be contested anytime soon. More than 30,000 customers and partners attended this year's event as AWS enters its second decade with a consistent focus on driving organizations to tap its broad portfolio of services to replace more and more of the functions now running in datacenters.
AWS has become so dominant that 451 Research this week put out a report predicting that "AWS +1" will become a strategic choice, or "operating principle," for enterprises in 2017. At this year's gathering, which has become AWS' largest staging event for new offerings, the new services ranged from a simpler starter kit that lets developers spin up virtual private servers via the company's new Lightsail offering, to a code-free visual workflow tool called AWS Step Functions. Also on display were a wider range of compute options, including GPUs for all instance types and new field-programmable gate arrays for gaming and high-performance computing, a new batch processing service, management and automation capabilities, extended open source contributions, and tools to advance AWS' push into AI, machine learning, Internet of Things and edge locations.
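Lightsail's pitch is how little it asks of developers. As a rough sketch of that simplicity, here's how spinning up one of its virtual private servers looks with the boto3 SDK (the instance name and blueprint/bundle IDs are illustrative):

```python
# Create a Lightsail virtual private server with boto3. Assumes AWS
# credentials are already configured; IDs below are example values.
import boto3

lightsail = boto3.client("lightsail", region_name="us-east-1")

response = lightsail.create_instances(
    instanceNames=["demo-vps"],
    availabilityZone="us-east-1a",
    blueprintId="wordpress",   # preconfigured application image
    bundleId="nano_1_0",       # smallest CPU/RAM/storage tier
)
print(response["operations"][0]["status"])
```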
Customers' trust in the public cloud to transform the way they deliver IT was equally a key theme, with well-known customers including Capital One, McDonald's, Netflix and FINRA all explaining how they are broadening their use of AWS. Netflix, which streams 150 million hours of programming each day and has 86 million customers, remains the poster child for companies that have transitioned away from on-premises datacenters, an effort that dates back to 2008. Chief Product Officer Neil Hunt told attendees that this year marked the final phase of that transition. "We unplugged our last datacenter," he said.
Still, the customers touted by AWS are the exception rather than the rule, said Chris Wegmann, managing director at Accenture, which this week extended the partnership with AWS it first kicked off a year ago. Accenture's AWS Business Group now has 2,000 IT pros holding 500 AWS certifications, working with several large enterprises such as Avalon, Hess, Mediaset and Talen Energy. Wegmann said Accenture believes cloud migrations, especially to Amazon's cloud, will become more prevalent in the coming years, though concerns still linger for some.
"We are seeing customers that are still hesitant," he said. "They're still trying to figure out whether or not it's the right thing for them, or whether or not they are going to try to have cloud independence. We are seeing them try to go slow and hit some roadblocks and they lose momentum. When you lose momentum, it doesn't go very well." Often those organizations "can't get out of their own way," Wegmann added.
In contrast, organizations that are successful in making the transition take disciplined approaches but still stick with their plans.
"The companies that are being successful are maintaining that momentum," he added. "They are not wavering on their decisions and they make realistic decisions, while not trying to end world hunger."
Posted by Jeffrey Schwartz on 12/02/2016 at 1:42 PM
One of the most important new products from Microsoft this year was the release of SharePoint Server 2016. In addition to coming closer into sync with Office 365, it's the version of SharePoint best suited to run as an instance in an infrastructure-as-a-service (IaaS) cloud. Dan Usher, lead associate with Booz Allen Hamilton, believes Microsoft Azure is the most logical IaaS for SharePoint, though he said customers can certainly run it in any cloud infrastructure environment that meets their needs and budgets.
Usher, who has helped his clients deploy SharePoint Server in Microsoft Azure, will be joined by Scott Hoag, an Office 365 solutions architect at Crowley Maritime, at next week's SharePoint Live! conference in Orlando, where he'll demonstrate how to deploy SharePoint in Azure. Both presenters spoke with my colleague Lafe Low last month but, unbeknownst to him, I also recently met up with Usher, where we discussed specifically why he recommends running SharePoint in Azure, either entirely or in a hybrid architecture.
Are you seeing customers looking to provision or move SharePoint Server farms into Azure?
It goes back to what your use case is. If you've got a mission-critical system and you don't have a datacenter already, the question is, why not? Cloud services, at least from a procurement perspective, will be a lot easier because you don't have to find space in a co-lo [colocation facility] and pay for electricity.
Is there a case to use SharePoint in Azure these days rather than using SharePoint Online in Office 365?
I'd say there is. A lot of organizations still want to be able to deploy applications onto the server and interact directly with the server.
And of course, the newest release, SharePoint 2016, lends itself to do that more than any previous version.
Tremendously. You have things like the cloud search service application to act as that conduit to let Office 365 go in and actually do your crawl across the enterprise and work through your index effectively. That helps out tremendously to find that information. But if you have that specific thing that needs to sit in SharePoint Server and that you don't want in Office 365 for whatever reason -- whether you're afraid, have security compliance requirements or have some specific application code you need to put directly on the server -- that's one of the main core reasons to stay on-premises.
If they're going to take that on-premises version, does it make sense to put in Azure versus AWS, or some other third-party cloud provider?
If they don't want to do it in their own datacenter, or if they want to have it out there in a more available, and in some cases more secure, infrastructure and need multiple 9s of availability -- which can be pretty difficult to deliver as an IT pro -- I don't think there's a reason not to use Azure. I know for some systems, depending on how they want to architect it out, they might run into some limitations. Some can't deploy something that requires joining it to Azure Active Directory, and while the Azure team has put out Azure AD Domain Services so you can connect servers into Azure Active Directory, you're still hitting some areas where things need to be integrated into your home network. Even then, using Azure still works pretty well.
Where do you hit those limitations?
A capability that was put out in preview a while back, Azure Active Directory Domain Services, doesn't let you extend the schema. If you have a customization back in your own Active Directory that extends the schema, you might run into a limitation there.
Does that impact SharePoint itself?
I believe there is only one spot that actually touches this sort of schema for a connection point to just identity itself.
So it's not a showstopper?
No, not by any means.
What about the cost of running your SharePoint Server farms in the public cloud versus your own datacenter? Are you going to save money? Will it cost more but perhaps have other benefits?
I hate giving the "it depends" answer, but the problem you run into is everyone says cloud is going to be cheaper for you. If you're running 24x7x365, you may actually be better off looking at software as a service [like Office 365] instead of going with infrastructure as a service, because you're paying for a usage license instead of compute, storage, network and everything.
How many clients that you're working with are looking at or running, SharePoint in Azure?
It's mixed. Some are staying on-premises; some are taking advantage of Azure just because they don't want to run something internally. I'd say one of the more interesting things is the folks that are looking at keeping things on-premises but also setting up hybrid. Now that they're seeing things like the cloud search service application, they're basically buying in and saying let's move those workloads that aren't critical up to Office 365, because it just makes more sense -- you don't have as much to back up and keep operational, and you can make it more accessible. One of the cooler things that pushes that story even further forward is OneDrive for Business, with the commitments they have made around the synchronization engine supportability.
How would you characterize the improvements to synchronization?
It's working better and better.
Given the latest developments, what will you be talking about during your SharePoint Live! session?
We will be running demos about how to set SharePoint up in Azure, and how to configure it. A lot of folks will step in and say, 'Oh, I see this template that's already here for me -- I'll just use that.' You can definitely go down that path, and Bill Baer [senior technical product manager and Microsoft Certified Master for SharePoint] and his team have put a lot of effort into having an environment that you can automatically spin up. But there are use cases where you step in and say, 'Hey, it's great, but I'd still like to be able to go in and customize this or customize the way SharePoint interacts with those other components.' So we're going to be walking through some of the ARM [Azure Resource Manager] templates -- how to use them to build out a SharePoint environment. If you don't want to use that, if you just want to use the AutoSPInstaller tool to build out your installation for you, I'll be showing off some of that as well, along with some of the more complex use cases you might have with getting things connected back to your own network.
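As a rough illustration of what Usher is describing, an ARM template deployment can be kicked off with a single REST call once you have an Azure AD token. Everything below -- the resource group, deployment name, template URL and token -- is a placeholder:

```python
# Sketch: start an ARM template deployment via the Azure Resource
# Manager REST API. Token acquisition (Azure AD) is omitted, and the
# names and template URL are placeholders.
import json
import urllib.request

SUBSCRIPTION = "SUBSCRIPTION_ID"
URL = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
       "/resourcegroups/sharepoint-rg/providers/Microsoft.Resources"
       "/deployments/sp-farm?api-version=2016-09-01")

body = {
    "properties": {
        "mode": "Incremental",
        "templateLink": {"uri": "https://example.com/sharepoint-farm.json"},
        "parameters": {"adminUsername": {"value": "spadmin"}},
    }
}
req = urllib.request.Request(
    URL, data=json.dumps(body).encode("utf-8"), method="PUT",
    headers={"Authorization": "Bearer TOKEN",
             "Content-Type": "application/json"})
urllib.request.urlopen(req)
```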
Overall how much interest are you seeing in this?
There's a lot of interest from customers who want to get out of their own datacenter. I would say a lot of organizations have a SharePoint guy, and that SharePoint guy may have five other roles and probably started off doing something completely different, such as Windows Server administration or Exchange administration. And when he steps into SharePoint, he freaks out. So we're hoping this can at least give them the information and knowledge to go back and get their jobs done more effectively.
Find out more about Live! 360 2016 in Orlando here.
Posted by Jeffrey Schwartz on 11/30/2016 at 12:33 PM
Love is a two-way street. Microsoft has shown considerable affection for Linux over the past two years and by some measure consummated its marriage to open source by joining the Linux Foundation. However, this news overshadowed the fact that Google has joined the .NET Foundation's Technical Steering Committee.
Microsoft announced both milestones earlier this month at its Connect() conference in New York, where the company outlined several key deliverables that extend its portfolio of tools, platforms and computing capabilities -- including the new Bot Service preview and Azure cloud platform services that bring intelligence to apps. While many of Microsoft's key moves show more than just lip service to the open source community, the company is equally committed to extending its own .NET platform.
Having released the first iteration of the multiplatform .NET Core this summer, the company revealed the preview of .NET Core 1.1, which adds 1,380 APIs, support for additional Linux distributions, performance improvements, thousands of new features, hundreds of bug fixes and improved documentation. Bringing Google into the fold builds on Microsoft's ".NET Everywhere" strategy, which aims to bring C# code to phones, TVs, and open source infrastructure and cloud services.
But Google's decision to join the .NET Foundation's Technical Steering Committee drew more muted applause than the news earlier that same morning that Microsoft was joining the Linux Foundation. When Scott Hanselman, Microsoft's principal program manager for Azure, ASP.NET and Web tools, made the announcement during his keynote presentation at Connect(), there was a brief silence, followed by polite, though hardly exuberant, applause. "I don't want your pity applause," Hanselman quipped.
Allowing that the reaction was more surprise than pity, since the move could give a boost to the Google Cloud Platform, Hanselman explained the significance of having Google join Red Hat, JetBrains, Unity and Samsung in helping steer the direction of .NET Core.
"Googlers have quietly been contributing to .NET Foundation projects and they are also helping drive the ECMA standard for C#," Hanselman told attendees at Connect(). "The Google Cloud Platform has recently announced full support for .NET developers and they have integrations into Visual Studio and support for PowerShell. All of this is made possible by the open source nature of .NET and of .NET Core, and I could not be more thrilled to be working with our friends at Google to move C# forward and to move .NET forward and to make it great everywhere."
Google's support for .NET goes back several years, when the company added .NET support throughout its infrastructure, including libraries for over 200 of its cloud services. Google has also recently added native support in Google Cloud Platform for Visual Studio and PowerShell. Back in August, Google announced support for the .NET Core release, including ASP.NET and Microsoft's Entity Framework.
Google isn't the only major player to give .NET a lift. Apps built with .NET Core will also find their way to TVs, tablets, phones, wearables and other devices with help from Samsung, which used the Connect() event to release the first preview of Visual Studio Tools for Tizen, the embedded OS that powers Samsung devices. The tooling will enable mobile application development using device emulators and an extension of Visual Studio's IntelliSense and debugging features.
Martin Woodward, executive director of the .NET Foundation, noted that there are 50 million Tizen-powered devices in the wild. Said Woodward: "This will allow .NET developers to build applications to deploy on Tizen across the globe and continues in our mission to bring the productive .NET development platform to everyone."
Posted by Jeffrey Schwartz on 11/28/2016 at 1:34 PM
Microsoft is hoping to finalize its acquisition of professional social networking giant LinkedIn by year's end. But it still has a key hurdle to clear -- gaining approval from European Union regulators. Salesforce CEO Marc Benioff, who was unable to outbid Microsoft's successful $26.2 billion offer for LinkedIn, is trying to convince the EU it shouldn't approve the deal, arguing it will stifle competition.
While the deal has already cleared regulatory approval in the United States and other countries, the EU is expected to decide by Dec. 6 whether to clear it or delay it pending further investigation. A Reuters report on Nov. 16 revealed that Microsoft officials met with EU regulators last week and offered concessions. The report, citing unnamed EU officials, said details weren't disclosed but that regulators will share the concessions with competitors and customers.
Benioff has complained to regulators since the deal was first announced by both companies in mid-June. In late September, Salesforce Chief Legal Officer Burke Norton told CNN in a statement that "Microsoft's proposed acquisition of LinkedIn threatens the future of innovation and competition. By gaining ownership of LinkedIn's unique dataset of over 450 million professionals in more than 200 countries, Microsoft will be able to deny competitors access to that data, and in doing so obtain an unfair competitive advantage."
Amid the latest hurdles, I was among four participants in a panel discussion Tuesday night in New York City, and again Wednesday morning in Woodbridge, N.J., at meetings of Microsoft's partners. The meetings were planned a while back to discuss if, and how, Microsoft partners might benefit from the deal.
In advance of the meetings, I shared results of the Redmond magazine survey we conducted for our August cover story. More than 20 attendees were at each meeting, held by the local chapters of the International Association of Microsoft Channel Partners (IAMCP), and some were hopeful the integration of LinkedIn's large pool of data and its social graph will open new opportunities for them, such as new capabilities in the Dynamics CRM suite that make it easier to sell and market to customers.
Among those bullish about the potential of the deal is Eric Rabinowitz, CEO of Nurture Marketing, who organized the panel discussion and recruited me to participate along with Jeffrey Goldstein, founder and managing director of Queue Associates, a Dynamics integrator, and Karl Joseph Ufert, president of Mitre Creative, a New York-based digital agency that works with various Microsoft partners.
Rabinowitz uses LinkedIn today to mine his clients' "circles of influence," he explained. "We look at our clients, go into LinkedIn and look at what surrounds them -- who their peers are, who they report to and who reports to them," Rabinowitz explained. "Then we harvest that information and get e-mail addresses for those people. And instead of marketing to one person in an organization, we reach their whole circle of influence. What's beautiful about the Microsoft acquisition is, I think, what I just described will all be there at the push of a button."
Asked later if he was concerned that this capability would marginalize his business, Rabinowitz argued to the contrary. "Right now, it's a labor-intensive service that does not make us much money," he said. "If it's a service then what Microsoft will do for us is improve our service, gaining improved results. Also, I can envision us packaging the service into something else we do."
Others, though, are concerned that once Microsoft absorbs LinkedIn, the resulting services could put the squeeze on their own offerings, depending on how they're priced. That's especially the case for those who offer virtual software training services. Lisa Eyekuss, president of Corporate Training Group, based in Iselin, N.J., shared her concerns, especially if Microsoft bundles LinkedIn's Lynda.com training service with Office 365 subscriptions. "If they include it, then Microsoft just slapped the face of its partners who do all the end-user training, because it's the first item in the budget to be cut."
While she takes some comfort in having diversified her business in recent years, Eyekuss is wondering to what degree Microsoft will slash the cost of Lynda.com to Office 365 subscribers. "I know it will affect our business," she said. "We have to figure out how to stay away from it."
It also remains to be seen to what degree Microsoft will integrate LinkedIn data, including contacts and the newsfeed, with Office 365 and the new Microsoft Teams (announced earlier this month), as well as with the new Dynamics 365 business suite, particularly CRM. Goldstein believes it will add much richer capabilities to the CRM stack, and he is hoping Microsoft shares more specifics as soon as the deal closes -- presuming the EU doesn't delay it. "I'm excited and I don't know why," he said. "The only thing I do know is if Marc Benioff is upset about this acquisition and is trying to block it, it's got to be good. There's got to be something there."
Posted by Jeffrey Schwartz on 11/17/2016 at 10:17 AM
If you have considered treating yourself or someone else to a new PC, next weekend's Black Friday and Cyber Monday might be the best opportunity yet to do it. Microsoft today revealed on its Windows blog that it will offer attractive deals on some of its Surface Pro 4 and Surface Book models, and it pointed to notable markdowns on PCs from third-party OEMs at a number of retailers. As far as I can recall, this year's lineup of holiday discounts is the best Microsoft has ever offered during the traditional holiday buying season.
Arguably the most tempting offer is a Surface Pro 4 with an Intel m3 processor, 4GB of RAM, a 128GB SSD and its Signature Type Cover keyboard -- a bundle that regularly costs $1,029 -- for $599. For those wanting a step up, the company is also offering the Surface Pro 4 with an i5 processor, a 256GB SSD, 8GB of RAM and the keyboard for $999 (that bundle typically costs $1,429).
Those $400-plus discounts will only be available at Best Buy and the Microsoft Store from Nov. 24 to Nov. 28. While there are rumors that Microsoft might roll out a Surface Pro 5 sometime next spring, which would mark down the predecessors anyway, the discounts offered on the Surface Pro 3 after the Surface Pro 4 launch weren't as attractive as these current holiday deals.
Also during Black Friday next weekend, Microsoft is taking $400 off its Surface Book, offering a model with an i5 processor, a discrete GPU, 8GB of RAM and a 256GB SSD for $1,499. The entry-level Surface Book with a 128GB SSD and 8GB of RAM, which normally sells for $1,499, will be on sale for $1,249.
In addition to its own hardware, Microsoft pointed to a variety of other Black Friday and Cyber Monday deals available at other retailers from its OEM partners Dell, Lenovo, HP, Acer, Asus and Samsung. One deal Microsoft considered most appealing to business users is $150 off the new Lenovo Yoga 910 at Best Buy -- with Intel's 7th-generation Core i7 processor, a 14-inch display, 8GB of RAM and a 256GB SSD -- for $1,049 (not mentioned but also planned is a similar model with 16GB of RAM and a 4K Ultra HD display for $1,399). Over the weekend, Costco members can take $300 off the costly and compact Dell XPS 13 Ultrabook or $100 off the Acer Spin 5.
Those looking for lower-end PCs that are lighter on the wallet will find some good deals worth considering as well including:
- Dell Inspiron PCs starting at $399 and HP Notebook 15 for $299 at the Microsoft Store.
- HP X360 for $229 or Lenovo Ideapad for $400 at Best Buy.
- HP laptops NT Ci3 for $269 and NT Ci5 for $329 at Office Depot.
- ASUS Transformer Mini T102 for $299 from Amazon.
- Dell's Inspiron 11 3000 for $99 at the company's Web site.
For gamers, Microsoft said it is discounting its 1TB Xbox One and Xbox One S by $50, bringing the latter to its lowest price ever, at $249. The game console deals will be available from next Sunday, Nov. 20, through Wednesday, Nov. 23.
Posted by Jeffrey Schwartz on 11/14/2016 at 1:29 PM
Whether you're pleased or shocked by the stunning upset Donald Trump notched last night, his election as the 45th president raises questions about how his administration will try to change Internet policy and address the wide range of cybersecurity issues facing businesses and end users.
If some of his remarks about cybersecurity, encryption and Internet regulation, including net neutrality, during his 17-month campaign are any indication, there are reasons to believe big changes are in store. One big question is whether he will press for tighter restrictions on encryption and change the government's overall approach to it.
When California Federal District Court Magistrate Judge Sheri Pym earlier this year ordered Apple to help the FBI decrypt the iPhone used by suspected terrorist Syed Rizwan Farook -- who, with his wife Tashfeen Malik, killed 14 people in the December 2015 San Bernardino, Calif., shootings -- Trump called for a boycott of Apple.
"Apple ought to give the security for that phone, OK. What I think you ought to do is boycott Apple until such a time as they give that security number. How do you like that? I just thought of it. Boycott Apple," Trump said at the time. "The phone's not even owned by this young thug that killed all these people. The phone's owned by the government, OK, it's not even his phone," Trump said. "But [Apple CEO] Tim Cook is looking to do a big number, probably to show how liberal he is. But Apple should give up, they should get the security or find other people."
According to a post today by the Information Technology and Innovation Foundation, Trump supports weakening encryption in favor of stronger homeland security. The post also addresses his positions on a number of other issues that will impact the IT industry, including his opposition to H-1B visas, and notes he has articulated little about whether he would advocate for increased R&D investments in technology.
The president-elect has also advocated restricting Internet access to stop terrorist organizations like ISIS from recruiting. "I would certainly be open to closing areas where we are at war with somebody," Trump said, according to the On the Issues site. "I don't want to let people that want to kill us use our Internet."
The site also questioned Trump's understanding of net neutrality when he compared it to the Fairness Doctrine.
Judging by those remarks, the president-elect is not up to speed on many of these issues. Now that the campaign is over, though, the advisors he surrounds himself with and the appointments he makes once he assumes office could have a major impact.
What's your prediction on how he will address cybersecurity, Internet policy and other IT-related issues?
Posted by Jeffrey Schwartz on 11/09/2016 at 1:15 PM
Researchers at Microsoft achieved what they say is a breakthrough in speech recognition, claiming they've developed a system that's as effective as -- or better than -- people with professional transcription skills. The software's word error rate (WER) is down to 5.9 percent, an improvement from the 6.9 percent WER the team reported in September. The milestone was enabled by the new Microsoft Cognitive Toolkit, the software behind those speech recognition advances (as well as image recognition and search relevance). Microsoft announced both developments two weeks ago, though the timing wasn't the best, as IBM was holding its huge World of Watson event in Las Vegas.
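For context, WER is the standard accuracy metric for speech recognition: the minimum number of word substitutions, deletions and insertions needed to turn the system's transcript into the reference, divided by the number of reference words. A quick sketch of the computation:

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + deletions + insertions) / reference words,
    computed with a standard edit-distance dynamic program."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution against a six-word reference: about 16.7 percent WER.
print(word_error_rate("the cat sat on the mat", "the cat sat on a mat"))
```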
Watson, of course, is Big Blue's AI system made famous several years ago when it appeared on Jeopardy! and, in advance of its latest rollout, made the talk-show circuit, including CNN and CBS' 60 Minutes, where IBM Chairman, President and CEO Ginni Rometty talked up Watson's achievements, including the ability to discover potential cancer cures deemed not possible by humans, among other milestones.
Microsoft believes it has the most powerful AI and cognitive computing capabilities available. The Microsoft Cognitive Toolkit is the new name for what it previously called the Computational Network Toolkit, or CNTK. In addition to helping the researchers hit the 5.9 percent WER, the new Microsoft Cognitive Toolkit 2.0 helped them enable what the company is calling "reinforcement learning."
The new open source release, like its predecessors available on GitHub, now supports Python in addition to C++. "We've removed the barrier for adoption substantially by introducing Python support," said Xuedong Huang, a Microsoft distinguished engineer, in a recent interview. "There are so many people in the machine learning community who love using Python."
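To give a flavor of that Python support, here is a minimal feed-forward classifier in the CNTK 2.0-era API. Names and signatures shifted across the beta releases, so treat this as illustrative rather than canonical:

```python
# Minimal sketch of a two-layer classifier with the Microsoft
# Cognitive Toolkit's Python API (CNTK 2.0 era); illustrative only.
import numpy as np
import cntk as C
from cntk.layers import Dense, Sequential

features = C.input_variable(2)   # two input features per sample
labels = C.input_variable(2)     # one-hot labels for two classes

model = Sequential([Dense(50, activation=C.sigmoid), Dense(2)])(features)

loss = C.cross_entropy_with_softmax(model, labels)
metric = C.classification_error(model, labels)

lr = C.learning_rate_schedule(0.1, C.UnitType.minibatch)
trainer = C.Trainer(model, (loss, metric), [C.sgd(model.parameters, lr)])

# One toy minibatch of random data, just to show the training call.
x = np.random.rand(32, 2).astype(np.float32)
y = np.eye(2, dtype=np.float32)[np.random.randint(0, 2, 32)]
trainer.train_minibatch({features: x, labels: y})
```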
Most noteworthy, Huang said, is that the new software has a significant boost in performance, enabling it to scale across multiple Nvidia GPUs, including those paired with the field-programmable gate arrays in the Azure cloud. Huang acknowledged the toolkit isn't as popular as other open source frameworks such as Google's TensorFlow, Caffe or Torch, but he argued it's more powerful, extensible and able to scale across multiple machines and environments.
"It' the fastest, most efficient distributed deep learning framework out there available," Huang said. "The performance is so much better. It's at least two to three times faster than the second alternative." The new Microsoft Cognitive Toolkit includes algorithms that can't degrade computational performance, he added.
For its part, IBM made some pretty big news of its own. The company released the new Watson Data Platform (WDP), a cloud-based analytics development platform that allows programming teams including data scientists and engineers to build, iterate and deploy machine-learning applications.
WDP runs on IBM's Bluemix cloud platform, integrates with Apache Spark, works with the IBM Watson Analytics service and will underpin the new IBM Data Science Experience (DSX), which is a "cloud-based, self-service social workspace that enables data scientists to consolidate their use of and collaborate across multiple open source tools such as Python, R and Spark," said IBM Big Data Evangelist James Kobielus in a blog post outlining last month's announcements at the company's World of Watson conference in Las Vegas. "It provides productivity tools to accelerate data scientists' creation of cognitive, predictive machine learning and other advanced analytics for cloud-based deployment. It also includes a rich catalog of learning resources for teams of data science professionals to deepen their understanding of tools, techniques, languages, methodologies and other key success enablers."
There are also free enterprise plans that include 10 DSX user licenses and a Spark Enterprise 30 Executor Plan, he noted. IBM claims more than 3,000 developers are working on the WDP and upwards of 500,000 users are now trained on its capabilities.
Has IBM already won the war against Microsoft, Google, Amazon and Facebook when it comes to intelligent cognitive computing, AI and machine learning? Karl Freund, a senior analyst at Moor Insights & Strategy, said Microsoft without question has a long way to go to compete with IBM Watson for mindshare. "IBM is a brilliant marketing machine, and they are spending a lot of money to establish the Watson brand. Microsoft has nothing comparable," Freund said. "From a technology perspective, however, IBM has not convinced the technologists that they have anything special. In fact, most people I speak with would say that their marketing is ahead of their reality."
Freund said what IBM is offering is a collaborative development platform. "Microsoft is releasing their software as open source," he added. "IBM is all about the services, while Microsoft is seeking to gain broad support for their software stack, regardless of where you run it."
Microsoft's new toolkit and its multi-GPU support are significant, while Watson is likely to appeal to existing Big Blue shops including those with mainframes and organizations using the IBM SoftLayer cloud.
Posted by Jeffrey Schwartz on 11/07/2016 at 1:57 PM
The launch of Microsoft Teams this week is an important building block in Microsoft's focus on digital transformation and ensuring a new generation of workers gravitate to Office 365. It's critical because many millennials hate e-mail, are more accustomed to using chat and likely use other productivity suites such as Google Apps.
Microsoft Teams is poised to be omnipresent in the workplace early next year, when the company releases it to all business and enterprise subscriptions of Office 365, though it apparently will let users opt in rather than opt out. But the goal is clear: the company wants Teams to evolve into users' core digital workspace, with the Web-based interface serving as a hub for all collaboration.
"Think of Microsoft Teams as a digital transformation of an open office space environment. One that fosters easy connection and conversation to help people build relationships. One that makes work visible, integrated and accessible across the team so that everyone can stay in the know," said Kirk Koenigsbauer, corporate VP overseeing Microsoft's Office client team, at this week's New York launch of the new offering.
Workers under the age of 30 are more accustomed to communicating in environments such as Snapchat, Facebook Messenger and a slew of other chat-focused tools. As they join the workforce, Microsoft risks many of them being turned off by Office 365, and Outlook e-mail in particular -- which is what Teams aims to overcome.
"Our workforce is already two-thirds millennial," said Andrew Wilson, CIO of the large IT consulting firm Accenture, speaking at Wednesday's event. "So they have been behaving like this in the consumer space. But what this does is provide enterprise security, enterprise foundation and that nice integration with the things we've already invested in."
Alaska Air is another customer that has spent several months testing Microsoft Teams. Systems engineer Randy Cochran said the airline's customer service group is testing the new tool. "Teams provides persistent chat for keeping track of conversations," said Cochran. "So if a customer calls and speaks to a different rep, they can retrieve a history of the prior discussion. They can just discover documents [and] it gives them the ability to share knowledge and go back and retrieve knowledge that's already been maybe in the silo and get that back. It also gives us a single pane of glass for documents and manuals and everything that's already been put into SharePoint."
Time will tell whether millennials embrace Microsoft Teams. But the early testers believe it has a good shot. "This will be a no-brainer," said Wictor Wilén, a digital workplace architect at Avanade, who is currently testing Microsoft Teams. "They will understand from day one exactly how to use it." When asked about the older generation, he said that remains to be seen, given they've used e-mail for two decades.
Posted by Jeffrey Schwartz on 11/04/2016 at 2:16 PM
After mulling an acquisition of popular enterprise chat platform Slack for $8.5 billion earlier this year, Microsoft decided to build its own. The company today revealed Microsoft Teams, hoping it will emerge as the hub for user experience across Office 365 and third-party apps and services.
Microsoft Teams, introduced at an event in New York, is a new chat-based workspace that combines the components of Office 365 and Skype for Business into an integrated environment tied to the Microsoft Graph.
Microsoft Teams brings together Word, Excel, PowerPoint, SharePoint, OneNote, Planner and Power BI into a common workspace. Microsoft Graph is used to share intelligence and context among the apps, and Active Directory and Office 365 Groups are used to associate people with information. Microsoft also released an SDK that allows enterprise and commercial developers to build connectors -- based on the same model as Exchange connectors -- that pipe outside services into team chats. For example, Microsoft Teams can integrate Twitter or GitHub notifications into the workspace.
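Office 365 connectors of this sort are typically driven by an incoming-webhook URL that accepts a small JSON card. As a rough, hedged illustration -- the webhook URL below is a hypothetical placeholder a channel owner would generate, not a real endpoint -- a build script could push a notification like this:

```python
# Hedged sketch: post a simple text card to a channel's connector
# webhook. The URL is a hypothetical placeholder.
import json
import urllib.request

WEBHOOK_URL = "https://outlook.office.com/webhook/<your-connector-id>"

card = {"text": "Build #142 succeeded on branch main."}

req = urllib.request.Request(
    WEBHOOK_URL,
    data=json.dumps(card).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 means the card was accepted
```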
Among other things, Microsoft Teams will appeal largely to the millennial workforce that has become accustomed to using online chat for communication while reducing its reliance on e-mail. Slack is among a number of popular tools many workers have started using in recent years, and Microsoft believes it has a solution that provides security, governance and context enabled by the underlying Microsoft Graph, Office 365 and much of its machine learning efforts, including its new Bot Framework and Azure Machine Learning service.
"Microsoft Teams is a chat-based workspace where people can come together in a digital forum to have casual conversations, work on content, create work plans -- integrated all within one unified experience," CEO Satya Nadella said at the New York launch event. "It's designed to facilitate real-time conversations and collaboration while maintaining and building up that institutional knowledge of a team."
Kirk Koenigsbauer, corporate VP overseeing Microsoft's Office client team, gave an overview of the new chat platform, saying it will be added to Office 365 Business and Enterprise subscriptions in the first quarter of next year. "It will be automatically provisioned in Office 365 and managed as any other Office 365 service," he said.
A preview is now available along with an SDK that will allow developers to tie their own apps to Microsoft Teams. Currently 70 Office 365 connectors are available in the toolkit, Koenigsbauer said. Support for Microsoft and third-party services will come via planned enhancements to the company's Bot Framework.
The toolkit lets developers build Tabs that present users with individual Web experiences within Teams, providing instant access to a specific service and allowing for collaboration around that application's content. Using the Bot Framework, developers can create Bots that let individuals and teams chat with a service by making queries, while also allowing users to enable Connector notifications. Microsoft officials also previewed T-Bot, a built-in help feature for Microsoft Teams, and Who-Bot, a feature under development that will help individuals discover colleagues with specific knowledge based on their identities and the context of their communications.
The APIs will also allow for customizable workspaces and the company said it expects to have 150 integration partners at launch next year. Among them are Hootsuite, Intercom, ZenDesk and Asana. The company emphasized that Microsoft Teams will share the security and compliance support offered in its other services including EU Model Clauses, ISO 27001, SOC 2 and HIPAA, among others.
Time will tell if it's a Slack-killer but the richly-valued startup certainly showed it was paying attention. In addition to running a full-page ad in The New York Times, Slack noted the arrival of Microsoft Teams prominently on its own site.
"Congratulations on today's announcements," the company said in a company blog post. "We're genuinely excited to have some competition. It's validating to see you've come around to the same way of thinking. It's not the features that matter. You're not going to create something people really love by making a big list of Slack's features and simply checking those boxes."
"We're here to stay," the company said. "We are where work happens for millions of people around the world. And we are just getting started."
Slack claims it has one million paid subscribers (2.5 million overall) with $100 million in annual recurring revenue and six million apps now installed among its members' teams.
Posted by Jeffrey Schwartz on 11/02/2016 at 9:53 AM
Microsoft officials appeared to be fuming this week over Google's disclosure Monday of a zero-day vulnerability just days after alerting the company. Microsoft said yesterday a patch will be available next week and argued Google should have waited. Google defended its decision to disclose the vulnerability, saying it's a serious flaw that has been actively exploited.
The search giant acknowledged it was disclosing the vulnerability despite the fact that Microsoft still hasn't issued a fix, urging users to use the auto-updater for Adobe Flash and to apply the patches to Windows when Microsoft releases them.
Terry Myerson, executive vice president of Microsoft's Windows and devices group, made known his displeasure with Google's decision to issue its alert before Microsoft had a patch ready. "We believe responsible technology industry participation puts the customer first, and requires coordinated vulnerability disclosure," Myerson stated. "Google's decision to disclose these vulnerabilities before patches are broadly available and tested is disappointing, and puts customers at increased risk."
Myerson noted it wasn't the first time Google has done so, pointing to another occasion nearly two years ago and reiterating the company's call for better-coordinated disclosure to keep vulnerabilities from being exploited before patches can be readied.
The disclosure fueled continued debate over how vulnerabilities should be revealed in the best interest of users. Udi Yavo, co-founder and CTO of threat detection vendor enSilo, said in an e-mail sent to media that Google was wrong. In addition to advocating for a 90-day window for disclosure, Yavo called for legislation to hold companies legally accountable.
"In the case of Google's disclosure, justification for only allowing a week for Microsoft to develop a patch is because Google researchers were seeing the vulnerability actively exploited in the wild," Yavo noted. "To me, this doesn't ultimately help achieve everyone's goal, which should be keeping consumers and their data safe. By disclosing a vulnerability early, without allowing time for a patch, Google opened-up the small pool of people who found the vulnerability and knew how to exploit it, to all."
Not everyone shares that view. Ilia Kolochenko, CEO of Web security firm High-Tech Bridge, said in an e-mail that Google did the right thing. "I think it's not a question of days, but rather of efficient cooperation to fix the vulnerability," he said. "Google has great cybersecurity experts and engineers who can definitely help other companies to understand the problem faster and help fixing it. Instead of endless discussions about the ethics of full disclosure, we should rather concentrate on inter-corporate coordination, cooperation and support to make the Internet safer."
What's your take? Should Google have waited or do you think it did the right thing by making the vulnerability known?
Posted by Jeffrey Schwartz on 11/02/2016 at 11:38 AM
Having spent the past year rationalizing its mammoth $67 billion acquisition of EMC, the newly combined company this month hit the ground running. During the Dell EMC World Conference two weeks ago in Austin, Texas, Michael Dell and the senior executive team of the new Dell EMC outlined a laundry list of deliverables that will bring together the two organizations' respective technologies.
From the outset of this month's conference, company officials emphasized their portfolio approach to bringing together EMC and the companies it controlled (including VMware, RSA, Pivotal and Virtustream) with Dell. All now fall under the umbrella of Dell Technologies, with Dell EMC consisting of the server, storage, network and enterprise infrastructure assets of the two companies.
Experts believe Dell EMC is now the dominant provider of datacenter infrastructure. But it has formidable competition from Cisco, Hewlett Packard Enterprise (HPE), Lenovo, IBM and Hitachi Data Systems, and smaller players as well.
At Dell EMC World, company officials showcased the benefits of coming together with this month's launch of a software-defined version of EMC's Data Domain backup and recovery software for its PowerEdge servers. The company claims a sixfold increase in scalability when the software is paired with the new Dell EMC PowerEdge servers.
But Michael Dell showed he clearly wants to provide greater linkage among the assets beyond the core EMC storage business, notably VMware, Pivotal, RSA and Virtustream. Evidence of that came with the announcement of plans to deliver an integrated security solution bringing together assets from RSA, VMware AirWatch and Dell SecureWorks. That's just one instance outlined at Dell EMC World.
Other examples include the launch of Dell EMC's VCE-based hyper-converged appliances built on Cisco's Unified Computing System (UCS), as well as its VxRack hyper-converged infrastructure, both now available with Dell PowerEdge servers, and the new Dell EMC Elastic Cloud Storage (ECS) 3.0 object-storage platform for cloud-native environments. VxRack is based on the object store that runs the Virtustream public cloud for mission-critical applications, which EMC acquired last year, while the new Dell EMC Analytic Insights Module (AIM) is based on the Pivotal Cloud Foundry platform. In addition to bringing Dell PowerEdge as an option to the Cisco-powered VxRail, Dell EMC is offering an option with VMware's Horizon client virtualization platform in December. While Dell Technologies has a controlling interest in the independently run companies, Michael Dell talked up the benefits of this structure.
"This unique structure allows us to be nimble and innovative like a startup, but the scale of a global powerhouse," he said in his Dell EMC World keynote address. "For you that means a technology partner that can be number one in everything, all in one place."
Commitment to Cisco and Microsoft
At the same time, Dell made sure to emphasize that the new company will continue to embrace its partners that, in context of some of this next-generation infrastructure, are also competitors, notably Cisco and Microsoft.
Backing away from either company would not only alienate a formidable customer base committed to Cisco's UCS, the core component of Dell EMC's VxRack line, and to the Microsoft Windows Server, Hyper-V and Azure platforms, but would also undermine a substantial source of revenue that Dell needs to make this merger work, said Gina Longoria, an analyst at Moor Insights & Strategy. Like others, Longoria observed that Michael Dell and company officials talked up the company's commitment to both Cisco and Microsoft, but she agreed with my observation that the scale did appear to tip toward the Dell Technologies portfolio in terms of emphasis.
"I wasn't surprised how much focus they had on VMware but I was disappointed they didn't focus more on their Microsoft capabilities," she said. "I'm sure that's coming. Azure Stack is obviously not coming until next year, but hopefully next year they'll round that out a bit more. It's a little bit from a wait and see but I'd like to see a more balanced message.
Focus on Digital Transformation
David Goulden, who was CEO of EMC before the merger and now is president of the Dell EMC business unit, outlined how digital transformation will fuel growing demand for the hyper-converged products and new object storage platforms that will support containers and the building of next-generation cloud-native applications. While EMC had built its integrated systems with server and compute systems from Quanta before the merger, offering Dell's PowerEdge today has significant implications. For example, until now the Quanta-based VxRail was a four-node 2U system with an entry-level list price of $60,000. The new VxRail launched this month is available in a 1U configuration at a starting price of $45,000, or 25 percent lower, said Bob Wambach, vice president of marketing for the Dell EMC VCE business. The company claims the new appliances, which feature the most current Intel Broadwell-based compute platforms, are available in 250 more configurations, offer 40 percent higher CPU performance and are available with double the storage in all-flash nodes. Similarly, the new Dell EMC VxRack System with the PowerEdge servers offers two and a half times more capacity and 40 percent greater performance, the company claims.
"There's a much wider range from entry level going down in starting configurations to significantly more processing power and performance in the larger boxes," Wambach said. "It represents a very dramatic change in scope of use cases we can address."
Bringing the PowerEdge server to its converged and hyper-converged platforms will play a critical role in Dell EMC's hybrid cloud ambitions, according to Krista Macomber, a senior analyst covering datacenter infrastructure at Technology Business Research. "Dell's legacy manufacturing prowess is another major benefit in positioning the VCE team to more quickly and cost-effectively deliver more customized hyper-converged appliances and ensure spare parts availability," she noted in a research note published last week.
While converged and hyper-converged system growth is outpacing that of traditional servers, it still is a small percentage of the market. At Dell EMC World, Goulden said the company recognizes even as the future points to hyper-converged infrastructure, the vast majority of its customers still prefer, or at least rely upon, the building block approach to engineering. "The data shows that over 80 percent of you today still want the servers and the storage building blocks to build your own," he said in his Dell EMC World keynote. "So don't worry, we are still fully committed to these building blocks."
Posted by Jeffrey Schwartz on 10/31/2016 at 3:13 PM
The damage from last week's distributed denial-of-service attack suggests it was the most powerful to date and it could be a precursor to an even more sustained attack. A bipartisan committee of senators formed over the summer wants answers, but some critics want the government to act more swiftly. The incident also puts a spotlight on the vulnerability of Internet of Things-based components, ranging from sensors on gas meters to IP-connected webcams and thermostats. There are currently 6.4 billion IoT-connected devices in use and that figure is expected to grow to 20 billion by the year 2020, according to Gartner's most recent forecast.
DNS provider Dyn, attacked by the Mirai botnet, was overwhelmed last week by the massive DDoS attack and was one of a handful of DNS providers targeted. The operation brought down or interrupted hundreds of sites last Friday, including Amazon, Netflix, Reddit, Twitter and Spotify. It also brought down services that enterprises rely on, including Okta's single sign-on cloud service, Box, GitHub and Heroku.
The source is still not known, but according to an analysis by Flashpoint, the attack didn't have the characteristics of a nation-state operation. It took advantage of security flaws in IoT-based components provided by China-based XiongMai, which responded this week by recalling millions of its devices. Dyn's EVP of Products Scott Hilton on Thursday described the intensity of the attack in a company blog post, noting that while his team is still analyzing the data, he believes the botnet involved as many as 100,000 endpoints. He also cited reports of traffic rates as high as 1.2Tbps, though that figure remains unverified.
"Early observations of the TCP attack volume from a few of our datacenters indicate packet-flow bursts 40 to 50 times higher than normal," he stated. "This magnitude does not take into account a significant portion of traffic that never reached Dyn due to our own mitigation efforts as well as the mitigation of upstream providers."
Hilton described it as a complex and sophisticated attack that used targeted and masked TCP and UDP traffic over port 53. It generated compounding recursive DNS retry traffic, he noted, adding that this further intensified its impact. Hilton confirmed that the Mirai botnet was the primary source of the attack and that Dyn is working with authorities conducting criminal investigations.
In addition to law enforcement, Democratic U.S. Sen. Mark R. Warner, a member of the Senate Select Committee on Intelligence, who joined Republican Cory Gardner of Colorado over the summer in forming the bipartisan Senate Cybersecurity Caucus, wants answers and issued a statement calling for better protections. Warner called on three federal agencies -- the FCC, FTC and Department of Homeland Security's National Cybersecurity & Communications Integration Center (NCCIC) -- to provide information on the tools available and needed to prevent attacks exploiting flaws in consumer devices and IoT components, including IP-based cameras, connected thermostats and other products that have connectivity. An FCC spokesman said the agency is still reviewing Warner's letter.
In his letter to FCC Chairman Tom Wheeler, Warner questioned what can be done about the fact that consumers aren't likely to change passwords on their IoT devices (if changing them is even an option). One implication was perhaps mandating improved software that enables automatic firmware updates. Warner also questioned the feasibility of enabling ISPs "to designate insecure network devices as 'insecure' and thereby deny them connections to their networks, including by refraining from assigning devices IP addresses? Would such practices require refactoring of router software, and if so, does this complicate the feasibility of such an approach?"
Morey Haber, VP of technology at BeyondTrust, in a blog post earlier this week called on Congress to come up with legislation that would put security requirements on all IoT devices. Haber believes the legislation should put the following requirements and restrictions on all IoT and Internet-connected devices (a sketch of what randomized per-device credentials might look like follows the list):
- Internet-connected devices should not ship with common default passwords
- Default administrative passwords for each device should be randomized and unique per device
- Changing of the default password is required before the device can be activated
- The default password can only be restored by physically accessing the device
- The devices cannot have any administrative backdoors or hidden accounts and passwords
- The firmware (or the operating system) of the device must allow for updates
- Critical security vulnerabilities identified on the device, for at least three years after the last date of manufacture, must be patched within 90 days of public disclosure
- Devices that represent a security risk that are not mitigated or fail to meet the requirements above can be subject to a recall
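The second and fourth items are the most concrete, and they are cheap to implement at manufacturing time. Here is a minimal, hypothetical sketch of randomized per-device defaults -- an illustration, not any vendor's actual provisioning scheme:

```python
# Hypothetical sketch: generate a unique, cryptographically random
# default password for each unit at manufacturing time, instead of
# shipping a shared default such as "xc3511."
import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def provision_default_password(length: int = 16) -> str:
    """Return a random per-device default password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Each unit gets its own credential, printed on the device label so a
# physical factory reset can restore it (per the fourth requirement).
device_serial = "SN-000123"  # hypothetical serial number
print(device_serial, provision_default_password())
```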
Gartner analyst Tim Zimmerman last month called on IT organizations to address such issues throughout their infrastructure and software. Haber also believes the legislation is critical. "I think last Friday was just a test. I think it was just a huge warning," Haber said. "It was minuscule compared to what could have happened, and that could result in huge financial losses and other implications." While praising XiongMai's recall, Haber also warned that "unfortunately, this is just one device of many, from doorbells to baby monitors, that have the same type of problem."
Posted by Jeffrey Schwartz on 10/28/2016 at 11:50 AM
Microsoft today laid out a new vision for the future of Windows, taking the OS into the realm of content and information creation and sharing.
Coming next spring, Microsoft said it will release the new Windows 10 Creators Update with support for 3D and virtual reality. Microsoft also took the wraps off new hardware designed to showcase the best of the new Windows capabilities, entering the high-end all-in-one desktop market with its 28-inch Surface Studio and updating the Surface Book line with twice the graphics processing power of the current top-end system and 30 percent more battery life.
While the forthcoming OS update and new hardware unveiled at an event for media, analysts and Windows Insiders in New York were interesting in their own right, the message was clearly about where Microsoft is taking Windows -- as a platform for not just consuming information and content but creating it as well. Microsoft's top executives emphasized that the free Windows 10 Creators Update will introduce new ways to create content and communicate better with people.
The Windows 10 Creators Update aims to bring 3D and "mixed reality" to mainstream use. The updated OS will ease the path for anyone to create holograms, while providing ties to Office and the ability to bring different communications channels together, including e-mail, Skype and SMS.
Microsoft CEO Satya Nadella, during brief remarks at today's event, described these new features as key to bringing Windows into its new realm. The first wave of computing gave users the ability to become more productive, he noted. Over the past 10 years, advances in software, end-user computing devices and cloud services have introduced new ways for people to discover and consume information. Now Microsoft's goal with Windows is to enable new forms of content and information creation.
"I believe the next 10 years will be defined by technology that empowers profound creation," Nadella said. "At Microsoft, our mission is to empower every person and every organization on the planet to achieve more. We are the company that stands for the builders, the makers, the creators. That's who we are. Every choice we make is about finding that balance between consumption and creative expression. I am inspired by what I have seen in the Minecraft generation who see themselves as not players of a game but creators of new worlds they dream up -- the new forms of creativity and the expression we can unleash. This is what motivates us about Windows 10."
Windows 10 Creators Update
The ability to create and transform images and content in 3D will come in a tool most Windows users are quite familiar with: Paint. The new Windows Paint 3D will let users create or capture any image or photo, convert it into 3D and share those 3D images anywhere, including on social media. To enhance that capability, Microsoft announced it is teaming with Trimble, whose 3D modeling app SketchUp claims millions of creators and a deep catalog of content in its 3D Warehouse site.
While 3D will be a core focus in the next Windows release, Microsoft also intends to extend it to Office apps. Megan Saunders, a general manager in Microsoft's emerging technologies incubation group, demonstrated the creation of a 3D image in PowerPoint. Microsoft also said new inking capabilities coming to Windows will extend to Word, as described in today's Office Blog.
A new MyPeople feature will let users pin the most important people to the taskbar. Saunders also demonstrated sending an e-mail message to her husband, who could receive it via Skype or SMS (on Android or Windows phones). "Over the next year, you will see us integrate 3D across our most popular Microsoft applications," she said.
Terry Myerson, executive VP of Microsoft's Windows and devices group, said the new Windows 10 Creators Update will bring new capabilities and devices for a wide audience ranging from gamers to software developers, artists and engineers, as well as for everyday collaboration. "It is our mission to see that everyone can achieve their potential at work and play," he said.
In addition to new Xbox hardware, Microsoft is taking a key step toward bringing its HoloLens technology to the mainstream through its OEM partners. Myerson announced that Acer, Asus, Dell, HP and Lenovo will offer headsets enabled with the Windows 10 Creators Update that can display holograms, at prices starting at $299. The headsets from those companies, he said, will have sensors offering "six degrees of freedom," requiring no setup to integrate physical and virtual worlds.
The New Surface Studio All-in-One Desktop Canvas
Just as Microsoft pushed into the laptop market last year with the launch of the Surface Book, the company is now entering the all-in-one desktop market. The company took the wraps off the Surface Studio, a sleek system with a 28-inch collapsible 4.5K ultra-HD screen, claiming it produces 13.5 million pixels (63 percent more than a high-end 4K TV). Microsoft Corporate VP for Devices Panos Panay, introducing the system, said it transforms the function of a workstation into a "powerful digital canvas."
The display produces 192 pixels per inch, offers TrueColor with DCI-P3 color and initially comes in three configurations. The entry-level unit, equipped with an Intel Core i5 processor, 1TB of storage, 8GB of RAM and a 2GB GPU, costs $3,000. The mid-range system has a Core i7 processor, 16GB of RAM, a 2GB GPU and 1TB of storage, and will cost $3,500. And the most powerful system, equipped with a Core i7, 32GB of RAM, a 4GB GPU and 2TB of storage, will set you back $4,200.
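Those specs are internally consistent: 13.5 million pixels on a 28-inch diagonal implies a 4,500 x 3,000 panel, and sqrt(4,500^2 + 3,000^2) ≈ 5,408 diagonal pixels divided by 28 inches works out to roughly 193 pixels per inch, in line with the 192 PPI figure.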
A New Peripheral for Creators: The Dial
"We want to transform the way you create and think about creating," Panay said. "It's built to pull you in, it is all fundamentally designed to immerse you into the content or the creation you want to work with."
One way Microsoft hopes to do that is with its new Dial, a peripheral shaped somewhat like a hockey puck, which Panay believes will provide a new way to navigate and interact with content. The Dial is priced at $99 and will be available early next year. It will work with the Surface Pro 3 and Surface Pro 4, the Surface Book line and the new Surface Studio.
New Surface Books
Microsoft is also rolling out three new Surface Books, also based on Intel 6th Generation Core CPUs, that Panay said will sport double the graphics processing power and 30 percent more battery life than the original, bringing the total to 16 hours. The three new systems will be available next month, ranging in price from $1,899 to $2,799.
Patrick Moorhead, president and principal analyst with Moor Insights & Strategy, noted Microsoft decided not to roll out new systems with Intel's new 7th Generation processor, code-named Kaby Lake, or USB-C Thunderbolt interfaces.
"They didn't take as may risks as they did last year," Moorhead said. "They were conservative in my opinion." That said, he expects the new hardware will be a hit. "They are going to sell a ton of them."
Posted by Jeffrey Schwartz on 10/26/2016 at 2:12 PM
It was only a matter of time before hackers would find a way to unleash a massive distributed denial-of-service (DDoS) attack by taking advantage of millions of unprotected endpoints on Internet-connected sensors and components in consumer devices such as webcams, according to security experts. Friday's botnet attack on Dyn, a major DNS provider based in Manchester, N.H., was what Chief Strategy Officer Kyle York said will likely be remembered "as an historic attack." It intermittently took down sites such as PayPal, Twitter, Netflix and Amazon, and also impacted business-critical service providers including cloud-based authentication provider Okta and various providers of electronic medical record systems.
The attacker and motive are not immediately clear, but threat-assessment firm Flashpoint confirmed that the attackers unleashed botnets based on Mirai, the malware used last month to bring down the popular Krebs on Security site run by cybersecurity expert Brian Krebs, as well as French hosting provider OVH. Flashpoint said it wasn't immediately clear whether any of the attacks were linked to each other. The attackers unleashed a 620Gbps attack on Krebs' site, which he noted is many orders of magnitude more traffic than is necessary to knock a site offline.
The Mirai malware targets Internet of Things (IoT) devices ranging from routers to digital video recorders (DVRs) and the webcams in security cameras, according to a description of the attack published by Flashpoint, which also noted that a hacker going by the name of "Anna Senpai" released Mirai's source code online. Flashpoint has also confirmed that the botnet was specifically compromising flaws in DVRs and webcams manufactured by XiongMai Technologies, based in China. Flashpoint researchers told Krebs that all of the electronics boards infected with Mirai share the default username "root" and password "xc3511." Most concerning, Krebs noted, is that "while users could change the default credentials in the devices' Web-based administration panel, the password is hardcoded into the device firmware and the tools needed to disable it aren't present." XiongMai today said it is recalling millions of its devices.
Security experts have long warned that such devices and other IoT-based sensors and components are vulnerable because they are not protected. Following the attack last month on the Krebs site, security expert Bruce Schneier warned in a blog post that it validated such fears. "What was new about the Krebs attack was both the massive scale and the particular devices the attackers recruited," Schneier wrote two weeks ago. "What this attack demonstrates is that the economics of the IoT mean that it will remain insecure unless government steps in to fix the problem. This is a market failure that can't get fixed on its own." Schneier last month suggested he had strong reason to believe these are nation-state attacks.
Morey Haber, vice president of technology at BeyondTrust, a provider of privileged identity management software, agrees. In an interview this morning, Haber said the government should require all Internet-connected hardware, including IoT sensors, to ship with firmware that allows passwords to be changed and updated.
This attack could be just the tip of the iceberg, considering that only 10 percent of the Mirai nodes were actually involved in these attacks, said Dale Drew, CSO of Internet-backbone provider Level 3 Communications, in a brief video. "But we are seeing other ones involved as well," Drew said.
If that's the case, Haber said, it appears someone is trying to send a message. "What would 50 or 90 percent look like if all of the bots were turned on and used?" Haber asked. "That begs the question: was this a test, or was it paid for hire? If it really is only 10 percent, as recorded by L3, we could be in store for something a lot larger because we haven't torn down that network yet."
Level 3's Drew advises companies that believe the attacks are impacting their sites to stay in contact with their ISPs and to use multiple DNS providers.
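That last piece of advice is easy to automate. The sketch below is generic dnspython code, not anyone's production monitoring; the resolver addresses (including Level 3's public 4.2.2.1) and the domain are just examples. It checks that a name still resolves through several independent providers:

```python
# Hedged sketch: confirm a hostname resolves through multiple
# independent DNS providers. Resolver IPs and the domain are examples.
# Requires dnspython 2.x (pip install dnspython).
import dns.resolver

PROVIDERS = {
    "Google": "8.8.8.8",
    "OpenDNS": "208.67.222.222",
    "Level 3": "4.2.2.1",
}

def check(domain: str) -> None:
    for name, ip in PROVIDERS.items():
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [ip]
        try:
            answers = resolver.resolve(domain, "A", lifetime=5)
            print(name, [a.address for a in answers])
        except Exception as exc:  # timeouts, SERVFAIL and the like
            print(name, "FAILED:", exc)

check("example.com")
```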
Posted by Jeffrey Schwartz on 10/24/2016 at 1:27 PM
Microsoft tried to acquire Facebook in its early years for $24 billion, former CEO Steve Ballmer told CNBC today. That he tried to buy Facebook nearly a decade ago is hardly a shocking revelation, given his appetite at the time for making Microsoft more relevant to consumers. Ballmer's comment, though, appears to be the first public acknowledgment of the company's interest in sealing such a deal.
"He said no, and I respect that," Ballmer told the hosts of CNBC's Squawk Box during a guest segment on the early morning program. Microsoft of course did settle for an early $240 million investment in Facebook in early 2007, a fraction of one percent of what Microsoft was willing to pay for the whole thing. It was a noteworthy move back then when Facebook was valued at $15 billion. Certainly Microsoft wasn't the only company circling its wagons around Facebook back then.
Putting aside whether it was a strategic fit for Microsoft, clearly founder and CEO Mark Zuckerberg and the rest of Facebook made the right bet for themselves, as the now publicly held company is valued at $375 billion. One of the CNBC hosts pointed out a bit of irony in the fact that Snapchat rebuffed Zuckerberg when he tried to buy the popular mobile messaging company for $3 billion. "I think trading that for some short-term gain isn't very interesting," Snapchat founder Evan Spiegel told Forbes back in 2014, months after turning down the offer. In effect, Zuckerberg was rebuffed just as he had rebuffed Ballmer.
Asked about reports that he was looking into acquiring Twitter after buying a 4 percent stake in the company last year, Ballmer said, "I have never, ever, ever wanted to buy Twitter myself. I got a good life right now. I don't need to do that." Asked which company he believed would be the best fit to acquire Twitter, Ballmer said Google's search engine would be an ideal platform to surface tweets.
As CEO of Microsoft, Ballmer said, he never tried to acquire Salesforce.com. "I believe in revenue and profit," Ballmer said, in an apparent dig at his onetime nemesis Marc Benioff. "I could never make the math work on Salesforce." Apparently Ballmer's successor, Satya Nadella, saw things differently last year when he reportedly tried to acquire Salesforce.com for $60 billion.
Posted by Jeffrey Schwartz on 10/21/2016 at 12:20 PM
Dell will deliver a new platform that combines its existing endpoint security offerings with those from its newly acquired EMC and its RSA and VMware AirWatch businesses. The new endpoint security and management portfolio is the first major new offering resulting from Dell's $67 billion acquisition of EMC, which closed six weeks ago in the largest-ever merger of two IT infrastructure providers.
The new security offering is one of numerous announcements made at Dell EMC World, a two-day gathering in Austin, Texas, that kicked off this morning.
As with any large deal, many observers fail to see its potential benefits, but Michael Dell, the company's founder and chairman, showed no doubt during the opening keynote that the newly combined company will not only succeed but thrive.
"Today Dell is the largest enterprise systems company in the entire world," he said. Among the many new offerings released and in the pipelines revealed today, Dell pointed to the benefits of coming together to address the mounting cyberattacks and threats.
"Every time I sit down with our colleagues with RSA or SecureWorks and they take us through not what happened in the last quarter but just this week, it's very scary," Dell said during a press conference following the keynote session. "The nature of the attacks and the sophistication of the attacks is increasing."
The new suite will come out of Dell's client solutions group, which includes its PC offerings. "You have a package from us that meets the changing needs of the workforce," said Jeff Clarke, vice chairman of operations and president, client solutions group, during the keynote session announcing the new suite. "We now have the ability for endpoint security that's unmatched in the market."
Initially Dell will offer the new portfolio as a bundled suite, Clarke said, and over time the various products will interoperate with each other. Ultimately Dell will offer a single management console for the entire offering, Clarke said.
The company frames the platform around three key components: identity and authentication, data protection and unified endpoint management. The bundle will include:
- Dell Data Protection-Endpoint Security Suite, which provides authentication, file-based encryption and advanced threat protection
- MozyEnterprise and MozyPro, the company's cloud-based backup and recovery, file sync and encryption offering.
- RSA SecurID Access, the multifactor, single sign-on authentication solution.
- RSA NetWitness Endpoint, a tool designed to utilize behavioral analytics and machine learning to provide more rapid remediation to advanced threats.
- VMware AirWatch, the company's mobile device management offering. With this release Dell said organizations can use Dell Data Protection with AirWatch to report on activity for compliance.
Dell believes companies are looking to streamline the number of security solutions in their organizations. Technology Business Research's new benchmark survey showed that large companies typically have 50 security products and smaller organizations have 10; TBR's latest research report shows companies would like to get those numbers down to 45 and 8, respectively. For the second year in a row, customers are spending more of their new budgets on endpoint security than any other segment, said TBR analyst Jane Wright during a panel session she moderated at Dell EMC World yesterday.
"Customers are telling us that they're spending more money than ever on endpoint security," Wright said. "And it's not just that they want to protect those nice pretty endpoints, they're looking to protect the data that's on those endpoints or passing through those endpoints at any given time."
The increased spend isn't just on products, Wright added. "They are dedicating more people and creating more policies and procedures and testing than ever before."
Posted by Jeffrey Schwartz on 10/19/2016 at 12:48 PM
When you think of Lenovo, ThinkPads, Yogas and servers may come to mind -- but not digital workspace technology. The company is hoping to change that view over the next few years as Lenovo aims to extend its software and cloud solutions portfolio.
Lenovo last week announced its new Unified Workspace Cloud, a managed service based on its on-premises Unified Workspace technology. Like the on-premises platform, which Lenovo has offered since its acquisition of Stoneware four years ago, Unified Workspace Cloud is HTML5-based and uses RDP to deliver applications in a consistent manner on any PC or mobile device. Sal Patalano, Lenovo Software's chief revenue officer, said in an interview that unlike the digital workspace offerings Citrix and VMware have recently rolled out, Unified Workspace doesn't require an agent or plugins. "The big thing is being able to do it via browser. We don't deal with any desktop agents," Patalano said, adding it also doesn't require VPN connections. "The ability to negotiate and get into my corporate environment without having to deal with a VPN logon is huge."
The on-premises Unified Workspace front-ends a secure proxy, and users log in with their Active Directory credentials to access applications in a datacenter -- internally hosted, Web and SaaS apps. With the current on-premises version, the customer runs two servers: one to interact with Active Directory and connect into any of the internal private applications users need to access, said Dan Ver Wolf, a Lenovo senior sales engineer, and a second deployed in the DMZ functioning as a relay. "Users that are remote, using personal devices, whatever it might be, access everything through that external relay, so they get secure access, remain physically separated from the datacenter, but still get access to internal resources," Ver Wolf said.
The new hosted offering takes a similar approach, though it's a managed service hosted on Amazon Web Services and administered by Lenovo's professional services team. "When a customer wants a new application added to the service, they just call to have it deployed," he said.
In addition to no longer requiring the infrastructure associated with the current version, it's half the price. The MSRP for the on-premises offering is $50 per user per month for single-user access and $100 per user per month for concurrent access, versus $25 and $50, respectively, for the new cloud offering. Granted, no one pays MSRP, and pricing will vary based on the number of employees.
Lenovo also announced it has inked a partnership with Nimble Storage, a rapidly growing provider of flash storage systems. The two companies will look to deliver a "self-healing" converged solution with Nimble's InfoSight. Lenovo said the first product based on that solution, the ThinkAgile CX Series, is set for release at the end of the month.
Posted by Jeffrey Schwartz on 10/17/2016 at 11:44 AM
The release of Windows Server 2016 this week is a major upgrade to Microsoft's venerable server OS, thanks to a number of significant new features. But it could be argued that the most distinctive new capability is its support for containers. That's important because with containers, Windows Server 2016 will be able to run applications and workloads not built to run on Windows, notably Linux, but also those designed to run in cloud environments.
In addition to supporting Windows and Hyper-V containers -- initially via the runtime environment of the Docker open source container platform -- Windows Server 2016 will include a commercial version of the Docker Engine.
At last month's Ignite conference in Atlanta, Docker and Microsoft said they have extended their partnership, inked more than two years ago, in which a commercially supported version of the Docker engine will be included with Windows Server 2016 at no extra cost.
"This makes it incredibly easy for developers and IT administrators to leverage container-based deployments using Windows Server 2016," Microsoft Executive VP for Cloud and Enterprise Scott Guthrie said in the Ignite keynote.
I had a chance to speak with Docker COO Scott Johnson at Ignite, where he described the next phase of the two companies' relationship, the details of the new arrangement and how the company hopes to widen the reach of Docker containers to the Windows world.
With regard to this new arrangement, does that mean Docker Engine is built into Windows Server 2016?
If you buy Windows Server 2016, you have access to the Docker Engine, which, behind the scenes, will be downloaded from Docker. The user will have the option to just activate Docker and it will appear in front of them.
Are you both providing joint support?
The deal has three legs. First is the commercially supported Docker Engine, the second is commercial support and that's provided by Microsoft, backed by Docker. And the third leg is with our Docker Datacenter product, which helps IT organizations manage these containerized workloads. So that will be jointly promoted by Microsoft and Docker to the Windows Server user base.
Where do you see customers using Docker Datacenter?
What we see is they'll start with the Docker Engine. They will play with a couple of containers, get them fired up. But once IT operations gets a sense that this is a real application architecture, IT operations will say, "How do I manage all of these containers? How do I move them from lab to production? How do I move from datacenter to cloud?" Docker Datacenter is the management tooling that helps them do that. So it's the management tools on top of the runtime.
Will it work with Microsoft's System Center?
It pairs well with System Center and OMS [Operations Management Suite] in that you can think of them as managing the infrastructure layer. So they're managing the hardware and the hypervisors, and Docker Datacenter is managing the applications in the containers on top of the infrastructure. Microsoft actually produced an OMS monitoring agent for Docker already. So there's good integration happening already.
How do the Windows Containers fit into Docker containers? Meaning, what is the relationship between them?
The Windows kernel has the container primitives and the Docker Engine takes advantage of those primitives. So when Microsoft says Windows Containers or Hyper-V containers, that's synonymous with Docker Engine containers. The way you take advantage of Windows containers is using the Docker Engine interface. They're part and parcel of the same thing.
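In practice, that equivalence means the standard Docker client API drives Windows containers, too. As a rough, hedged illustration using Docker's Python SDK -- the image tag reflects the Windows Server Core base image Microsoft published at the time, and the snippet assumes a Windows Server 2016 host with the engine activated:

```python
# Hedged sketch: the same Docker client API drives Windows containers
# once the engine is activated. Assumes a Windows Server 2016 container
# host and the Docker SDK for Python (pip install docker).
import docker

client = docker.from_env()

# Run a throwaway command in the Windows Server Core base image of the
# era; the engine schedules it as a Windows (or Hyper-V) container.
output = client.containers.run(
    "microsoft/windowsservercore",
    'cmd /S /C "echo Hello from a Windows container"',
    remove=True,
)
print(output.decode())
```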
Are you anticipating a lot of Windows Server shops will go this route?
What we've seen with the tech previews is that there's actually quite a bit of pickup, even at a raw technology preview stage, of Windows shops doing a lift-and-shift-type motion with their .NET apps. So they will take an existing app, pick it up off the host, off the VMs, put it over into a Docker container, and right away they're able to iterate faster in their CI [continuous integration] efforts. They have a build artifact that they can move from developer to developer. So with Tyco's use case, they're using Docker containers on Windows Server to do a lift and shift and bring a DevOps process to their Windows development, which a couple of years ago would have made people's heads explode. But you're seeing those two worlds come together, largely facilitated by Docker containers.
What apps lend themselves best for this lift and shift?
Web apps work very well. Mobile apps work very well.
There's this question of whether containers will replace the VM. Does this lend credibility to that argument?
I think that's a long-term discussion. VMs have a hardware isolation level with 10 years of development built in. The automation and tooling and security signoffs have all been built around VMs; the entire army of VMware and Hyper-V admins have built their careers on VMs. So they're not going away anytime soon. And VMs and containers are actually very complementary, because containers are OS virtualization and VMs are hardware virtualization. So they're different layers of the stack. Today they're not one to one, where one replaces the other. In the future, is that how it rolls forward? We'll have to wait and see, but that's not where we're positioning the main benefit today.
But it is an option when you talk about trying to reduce the footprint?
What we see happening is we see them using a single VM with Docker and multiple containers in that VM so they get the isolation benefits and the automation benefits that they've already invested in. They also get the density benefits of multiple containers with multiple apps inside a single VM. You can get the best of both worlds by doing that and still take your footprint down, but still have the security and automation tooling.
Posted by Jeffrey Schwartz on 10/14/2016 at 11:44 AM
I'm thrilled to share that David Foote, chief analyst with Foote Partners and a prominent expert in IT skills, certification and salary benchmarks, will be the keynote speaker at this year's Live! 360 conference, where he'll reveal some important trends and pay data and discuss the importance of choosing a set of IT skills amid digital transformation efforts that are taking priority in a growing number of organizations.
Live! 360, which is produced by this publication's conference group, is an annual gathering of IT professionals and developers, scheduled to kick off Dec. 5 in Orlando, Fla. It brings together a number of conferences we produce, including TechMentor, SharePoint Live!, SQL Live!, Visual Studio Live!, Modern Apps Live! and App Dev Trends.
Foote is a renowned expert on IT skills and compensation, and his firm for two decades has produced deep research and analysis on technology, economic and, most notably, compensation data covering hundreds of technology skills and disciplines. I've known Foote for many years; he plans to discuss how IT skills are valued, taking into account shifts in technology and business requirements, and will present key data and some forecasts. Amazon, IBM, Google, Oracle, Red Hat and Microsoft, among others, often talk up how businesses and the public sector are looking to become more agile. And with the availability of new technologies and methodologies ranging from cloud, mobile, DevOps and automation, along with the rapid acceleration of real-time analytics, machine learning and the Internet of Things, demand for IT professionals and developers with new skills is growing.
Digital transformation is a key theme the CEOs of all the major tech providers are talking up these days. According to a Gartner survey earlier this year, 50 percent of CEOs and senior business executives say digital business transformation is on their agenda. Likewise, organizations are struggling to find people with modern cybersecurity skills to address current threats and those introduced by these new technologies. Amid these changes, IT professionals will need to carefully consider their approach to maintaining their skills to ensure they can maximize their earning potential, he says.
"If you are looking for job opportunities and greater pay, it's important to maintain cross-skilling," Foote said. "The whole ocean is rising." These digital transformation initiatives are changing and creating demand for in a variety of new areas such as UX designers, digital product designers, digital modelers and digital analytics managers, with expertise in a variety of platforms, form factors, development and infrastructure environments.
During the keynote, Foote will field questions from audience members about how they should consider managing their careers and how organizations are placing premiums on various skills and certifications. It should be an interesting opportunity if you're looking to make sure you're aligning your skills with where business and technology are headed. I hope to see you there.
Posted by Jeffrey Schwartz on 10/11/2016 at 1:49 PM
Amazon Web Services and Microsoft have each brought native IPv6 connectivity to their respective cloud services, as the need for new addresses continues to rise amid software that requires them and a dwindling pool of addresses based on the original IPv4 standard.
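Application code mostly picks up IPv6 through the resolver, so checking whether a given service publishes IPv6 endpoints takes only the standard library. A minimal sketch follows; the hostname uses the dual-stack endpoint format AWS documented for S3, purely as an example:

```python
# Minimal sketch: list the IPv6 addresses a hostname resolves to.
# The hostname is an example of an S3 dual-stack endpoint.
import socket

def ipv6_addresses(host: str, port: int = 443) -> list:
    try:
        infos = socket.getaddrinfo(host, port, socket.AF_INET6)
    except socket.gaierror:
        return []  # no IPv6 (AAAA) records published
    return sorted({info[4][0] for info in infos})

print(ipv6_addresses("s3.dualstack.us-east-1.amazonaws.com"))
```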
AWS first added IPv6 to its S3 storage offering back in August and last week extended it to other services. Microsoft announced at last month's Ignite conference in Atlanta that Azure now supports native IPv6 addresses, though the move was overshadowed by the news that the company had quietly upgraded all of its network nodes with field-programmable gate arrays (FPGAs), providing 25Gbps throughput, a substantial boost. Internet service and cloud provide