Docker today released several new tools aimed at letting IT pros and developers build and manage distributed applications that are compatible across environments including Amazon Web Services, Google, IBM, Joyent, Mesosphere, Microsoft and VMware.
Over the past year, these players have pledged support for Docker's open source container environment, which has quickly emerged as the next-generation architecture for developing and provisioning distributed apps. Today's beta releases are key deliverables by Docker and its ecosystem partners to advance the building, orchestration and management of the container-based platform.
For its part, Microsoft said it is supporting the newly released betas of Docker Machine, the tool that lets IT pros and developers automate the provisioning and management of Docker hosts on Linux or Windows servers, and Docker Swarm, a native clustering tool that pools those hosts so developers can deploy apps across infrastructures including Azure virtual machines. Microsoft also said it will natively support the new Docker Compose developer tool as a Docker Azure extension.
The new Docker Machine beta allows administrators to select an infrastructure to deploy an application built in the new environment. Microsoft has contributed drivers in Azure that allow for rapid and agile development of Docker hosts on Azure Virtual Machines. "There are several advantages to using Docker Machine, including the ability to automate the creation of your Docker VM hosts on any compatible OS and across many infrastructure options," said Corey Sanders, Microsoft's director of Azure program management, in a post on the Microsoft Azure blog.
"With today's announcement, you can automate Docker host creation on Azure using the Docker Machine client on Linux or Windows," he added. "Additionally, Docker Machine provides you the ability to manage and configure your hosts from a single remote client. You no longer have to connect to each host separately to perform basic monitoring and management tasks, giving you the flexibility and efficiencies of centralized devops management."
With Docker Swarm, which spins a pool of Docker hosts into one virtual host, IT pros can deploy their container-based apps and workloads using the native Docker clustering and scheduling functions, Sanders added. It also lets customers select cloud infrastructure such as Azure, enabling them to scale as needs necessitate for dev and test. Sanders noted that using the Docker CLI, customers can deploy Swarm to enable scheduling across multiple hosts. The Docker Swarm beta is available for download on Github.
The Docker Compose tool enables and simplifies modeling of multi-container Docker solutions using the declarative YAML file format. "This single file will be able to take a developer-modeled application across any environment and generate a consistent deployment, offering even more agility to applications across infrastructure," Sanders noted. "In Azure, we are working to expand our current Docker extension to support passing of the YAML configuration directly through our REST APIs, CLI or portal. This will make the simple even simpler, so you can just drop your YAML file details into the Azure portal and we take care of the rest."
Microsoft said it will be releasing documentation to build Docker hosts on Azure virtual machines using a Docker Machine.
Posted by Jeffrey Schwartz on 02/26/2015 at 12:53 PM
The National Security Agency (NSA) continues to hold its stance that the only way to thwart terrorist attacks and other crimes is to continue the surveillance programs exposed by Edward Snowden nearly two years ago. The latest report alleges that the NSA, along with its British counterpart, the Government Communications Headquarters (GCHQ), has hacked encryption keys from SIM cards on smartphones.
Documents provided by Snowden and reported last week by The Intercept allege that the U.S. and British governments specifically were hacking into SIM cards from Gemalto, the largest provider of SIM cards, used in smartphones to store encrypted identity information. According to the report, the breach was outlined in a secret 2010 GCHQ document.
If indeed the encryption keys were stolen, it gave the agencies the ability to eavesdrop on and wiretap voice and data communications without approval from governments or wireless providers. The bulk key theft also gave the agencies the ability to decrypt communications that they had already intercepted, according to the report. The ability to do so was the result of mining communications of engineers and other Gemalto employees, the report added, noting that the company was "oblivious to the penetration of its systems."
Now Gemalto is casting doubt on the severity of the breach. The company released a statement acknowledging that it detected intrusions that took place in 2010 and 2011. The findings of the investigation "give us reasonable grounds to believe that an operation by NSA and GCHQ probably happened," according to the Gemalto statement. However, in questioning the extent of the breach, the statement said that "the attacks against Gemalto only breached its office networks and could not have resulted in a massive theft of SIM encryption keys."
By 2010, the company said, it had already implemented a secure transfer system with its customers, and theft could have occurred in only rare instances. Moreover, many of the countries targeted at the time had only 2G mobile communications networks, which are inherently insecure; the modern 3G and 4G networks weren't vulnerable to such interceptions, according to the company. Gemalto said none of its other cards were affected by the attack. While the statement also pointed to some inconsistencies in the leaked document, including some of the customers it claimed the company worked with, Gemalto noted that its SIM cards use customized encryption algorithms for each telecom provider.
For its part, the NSA is making no apologies on its surveillance policies. NSA Director Mike Rogers spoke last week at the New America Foundation's cyber security conference in Washington, D.C., where he said backdoors would not have a negative impact on privacy, weaken encryption or dampen demand for technology from the U.S.
Alex Stamos, Yahoo's chief information security officer, who was in attendance at the conference, took Rogers to task on his contention that the government should have backdoors or master keys, according to The Guardian. When Stamos asked Rogers how Yahoo, which has 1.3 billion users throughout the world, could be expected to address requests for backdoors, Rogers reportedly skipped over the foreign requests, describing the overall process as "drilling a hole in a windshield. I think that this is technically feasible. Now it needs to be done within a framework."
The problem is, it's unlikely that the feds will come up with a framework that will sit well with many people.
Posted by Jeffrey Schwartz on 02/25/2015 at 10:27 AM
Lenovo Chief Technology Officer Peter Hortensius yesterday apologized for the Superfish adware installed on several of its PC models, saying it shouldn't have happened and that the company is putting together a plan to ensure it never happens again.
"All I can say is we made a mistake and we apologize," Hortensius said in an interview with The New York Times. "That's not nearly enough. So our plan is to release, by the end of the week, the beginning of our plan to rebuild that trust. We are not confused as to the depth of that this has caused people not to trust us. We will do our best to make it right. In the process, we will come out stronger. But we have a long way to go to make this right."
Hortensius said so far Lenovo has not seen any evidence that the malicious software embedded deep within the company's systems put any customers or their data at risk. "We are not aware of this actually being used in a malevolent way," he told The Times' Nicole Perlroth. Asked if it's possible that Lenovo engineers installed this on any models other than the two already reported (the Yoga 2 models and Edge 15), Hortensius said he didn't believe so, but the company is investigating and will have an answer by the end of the week.
Nevertheless, some of his responses were troubling. Why did it take more than a month for Lenovo to get to the bottom of this once it was reported to the company? "At that time, we were responding to this issue from a Web compatibility perspective, not a security perspective," he said. "You can argue whether that was right or wrong, but that's how it was looked at." Hortensius also wasn't able to answer Perlroth's question regarding how the opt-in processes work.
He was also unable to explain how the company was unaware that Superfish was hijacking the certificates. "We did not do a thorough enough job understanding how Superfish would find and provide their info," he said. "That's on us. That's a mistake that we made."
Indeed mistakes were made. Some might credit him for saying as much and apologizing. But based on the comments from my report on the issue earlier this week, it may be too little, too late.
"I didn't trust Lenovo even before this issue," said one commenter who goes by the name "gisabun." "Expect to see sales drop a bit [even if the corporate sales are generally unaffected]. Microsoft needs to push all OEMs to remove unnecessary software."
"Bruce79" commented: "Inserting a piece of software that opens unsuspecting users up to security attacks? That is a clear betrayal, regardless of price."
Kevin Parks said, "We need a class-action lawsuit to sue them into oblivion. That would tell vendors that we won't accept this kind of behavior."
Another had a less extreme recommendation: "What Lenovo could and should do is simple. Promise to never put third-party software on their machines for [X number] of years. After X number of years, no software will be preloaded; Lenovo will ask if you want the software downloaded and installed."
Was Lenovo CTO's apology a sincere mea culpa or was he just going into damage-control mode? Do you accept his apology?
Posted by Jeffrey Schwartz on 02/25/2015 at 9:36 AM
BlueStripe Software today said its FactFinder monitoring suite now supports distributed applications residing in Docker containers. The company said an updated release of FactFinder will let IT operations administrators monitor and manage applications deployed in Docker containers.
Granted, full-blown transaction-oriented systems developed and deployed in Docker containers are few and far between today. But BlueStripe Director of Marketing Dave Mountain said a growing number of development teams are prototyping new applications that can use Docker containers, which are portable and require much less overhead than traditional virtual machines.
"It's something where there's a lot of activity on the dev site with Docker containers," Mountain said. "Generally when you hear people talking about it, it's very much at the development level of [those] prototyping new ideas for their applications and they're pulling things together to build them quickly. We're not seeing them being deployed out to the production environments, but it's coming. As such, we want to be ready for it."
FactFinder is a tool used to monitor transactions distributed across server, virtualization and cloud platforms and is designed to troubleshoot the root of a transaction that might be failing or just hanging. The company last year added a Microsoft System Center Operations Manager module and the Microsoft Azure Pack for hybrid cloud infrastructures. The BlueStripe "collectors" scan physical and virtual machines to detect processes taking place. With the new release BlueStripe can view, isolate and remediate those processes housed within the container just as if they were in a physical or virtual machine.
Despite the early days for Docker containers, Mountain believes they will indeed become a key tier in distributed datacenter and cloud architectures. "As it continues to go, we expect this to become more mainstream. So this was a move on our part to make sure we're addressing that need," Mountain said. "I think it's real, I don't think this is just hype."
Posted by Jeffrey Schwartz on 02/24/2015 at 10:25 AM
Lenovo's decision to install the adware program Superfish on some of its PCs, notably the Yoga 2 models and Edge 15, was the latest inexcusable action by a company that we should be able to trust to provide a secure computing environment. It's hard to understand how Lenovo could ship a system able to bypass the antimalware software it bundled from McAfee (as well as others).
While Microsoft swiftly updated its Windows Defender to remove the certificate for Superfish and Lenovo on Friday released its own downloadable removal tools including source code, this wasn't just another typical bug or system flaw.
Unbeknownst to customers, Lenovo apparently installed the Superfish software, designed to track users' online sessions including all SSL traffic, leaving passwords and other sensitive information vulnerable to theft by hackers. Adding insult to injury, Lenovo took the rather unscrupulous step of installing it at the BIOS level, making it impervious to antimalware and AV protection software.
Justifying the move, Lenovo said it had knowingly installed the adware on the grounds that it would "enhance the shopping experience." The only thing it enhanced was users' suspicion that the companies Lenovo does business with are putting their information at risk to further their own objectives.
Just in the past few weeks, we learned that hackers stole user information from Anthem, the nation's second largest health insurer. Some 80 million customers (myself included) had their private information compromised in that attack. Also last week, the latest leak by Edward Snowden to The Intercept accused the National Security Agency (NSA) and the British government of hacking into SIM cards from Gemalto, a company whose chips are used in smartphones to store personal data such as passport and identity information. And the list goes on.
What's galling about the Lenovo incident is that the company only put a stop to it when Peter Horne, the person who discovered it, raised the issue (the company argued it was due to negative user feedback). Horne, a veteran IT professional in the financial services industry, came across the installation of Superfish in the Lenovo Yoga 2 notebook he bought. Horne told The New York Times that not only did the bundled McAfee software fail to discover it, but Superfish also got past the Trend Micro AV software he installed. Looking to see how widespread the problem was, he visited Best Buy stores in New York and Boston and retailers in Sydney and Perth, and the adware was installed on all the PCs he tested.
Yet upon fessing up, Lenovo argued that Superfish was installed only on consumer systems, not on its ThinkPad, ThinkCentre, Lenovo Desktop, ThinkStation, ThinkServer and System x lines. Horne had a rather pointed suspicion about Lenovo's decision to install the adware in the first place. "Lenovo is either extraordinarily stupid or covering up," he told The Times. "Either one is an offense to me."
But he noted an even bigger issue. "The problem is," he said, "what can we trust?"
Posted by Jeffrey Schwartz on 02/23/2015 at 2:50 PM
Many of us look forward to the day when we can get any information we want and have systems intelligently bring us what we're looking for. In a sense that's what Microsoft's new Azure Machine Learning service aims to do. While IBM, with Watson, is among those that have demonstrated the concept and is working to advance the technology, Microsoft is looking to bring the service to the masses more easily and affordably.
"Simply put, we want to bring big data to the mainstream," wrote Joseph Sirosh, corporate vice president for Machine Learning, in a blog post announcing the general availability for the service that was announced last summer. Azure Machine Learning is based on templates and usual workflows that support APIs and Web services, enabling developers to tap into the Azure Marketplace to easily pull together components to build applications that incorporate predictive analytics capabilities.
"It is a first-of-its-kind, managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data," Sirosh added. "In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights or a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware and software to manage big data."
Yet despite the rapid rollout of new Hadoop-based services, which underpin the most sought-after predictive analytics platforms, adoption has somewhat stalled, according to a survey conducted during Gartner's latest quarterly Hadoop webinar. The percentage of the 1,200 participants who said this month that they have deployed Hadoop-based applications has remained flat since last quarter's survey (only 15 percent said they have actually deployed).
However, when the Gartner survey results are broken out by respondent type, those who said they were in "knowledge gathering" mode reported deployment rates lower than 15 percent, while those who said they were developing Hadoop strategies reported deployment rates higher than 15 percent. Gartner Research VP Merv Adrian indicated in a blog post that while it's hard to draw any broad conclusions, the numbers may indicate renewed interest by those who had put their plans on hold. "My personal speculation is that it comes from some who have been evaluating for a while," he said.
And indeed there is plenty to look at. Microsoft has rolled out some noteworthy new offerings and is gaining partner support. That includes the latest entry to the Azure Marketplace, Informatica, which released its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage.
"Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data," wrote Informatica's Ronen Schwartz, in a blog post. "The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others -- for advanced analytics."
Do you think machine learning is ready for prime time?
Posted by Jeffrey Schwartz on 02/20/2015 at 12:54 PM
Microsoft CEO Satya Nadella "loves" Linux. So it should come as little surprise that Microsoft is planning to support its Azure HDInsight big data analytics offering on the open source server platform. The company announced the preview of HDInsight on Linux at the Strata + Hadoop World conference in San Jose, Calif. Microsoft also announced the release of Azure HDInsight running Storm, the popular Apache platform for streaming analytics.
The open source extensions aim to widen Microsoft's footprint in the growing market for big data services, letting users gather more information to parse and analyze for better decision making. They also further Microsoft's push to bring big data into mainstream use, as signaled by its development of Cortana, now available on Windows Phone and in beta on Windows 10.
In addition to the public preview of HDInsight on Linux and general availability of Apache Storm for HDInsight, Microsoft announced Hadoop 2.6 support in HDInsight, new virtual machine sizes, the ability to grow or reduce clusters running in HDInsight and a Hadoop connector for DocumentDB.
"This is particularly compelling for people that already use Hadoop on Linux on-premises like on Hortonworks Data Platform because they can use common Linux tools, documentation and templates and extend their deployment to Azure with hybrid cloud connections," said T. K. "Ranga" Rengarajan, corporate vice president for Microsoft's Data Platform and Joseph Sirosh, corporate vice president for Machine Learning, in a blog post.
Support for Storm is another key advance for Microsoft, as the platform has emerged as a widely adopted open source standard for streaming analytics. "Storm is an open source stream analytics platform that can process millions of data 'events' in real time as they are generated by sensors and devices," according to Ranga. "Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks."
Despite its open source push, Microsoft isn't part of the Open Data Platform initiative that was announced this week to ensure an interoperable Apache Hadoop core. Among those on board are GE, Hortonworks, IBM, Infosys, Pivotal, SAS, Altiscale, Capgemini, CenturyLink, EMC, Splunk, Verizon Enterprise Solutions, Teradata and VMware.
Asked why, a Microsoft spokeswoman stated, "Microsoft is already partnered with Hortonworks to use HDP which will utilize the Hadoop core from the Open Data Platform Initiative moving forward. We also will continue to contribute to the broader Apache Hadoop ecosystem." The statement also offered support for the project. Microsoft sees the Open Data Platform Initiative as a good step forward to having everyone run on the same Hadoop core including HDFS, YARN and Ambari. "We see standardization in the Hadoop space as a good thing as it reduces fragmentation and makes adoption of the technologies easier."
In addition, Microsoft is focused on contributing to Hadoop projects like Hive (Project Stinger, Tez), YARN, REEF and others, as well as partnering with Hortonworks, she said. "We see this Open Data Platform Initiative as complementary to these efforts and will help the overall Hadoop ecosystem."
Posted by Jeffrey Schwartz on 02/20/2015 at 12:20 PM
A number of prominent SharePoint MVP experts say they are confident that the on-premises server edition of SharePoint has a long future despite Microsoft's plans to extend the capabilities of its online counterpart -- Office 365 -- as well as options to host it in a public cloud service such as Azure. At the same time, many realize that customers are increasingly moving some or all of their deployments to an online alternative (or considering doing so), either by hosting in the cloud or moving to Office 365 and SharePoint Online.
In one of many Tweetjams -- online discussions via Twitter -- hosted by prominent SharePoint MVP Christian Buckley (@buckleyplanet), the experts weighed in on the forthcoming SharePoint 2016 release, due out later this year, and what it will mean to the future of the premises-based edition.
"On-prem is very much alive and well. Don't think it's going away anytime soon," said MVP Asif Rehmani (@asifrehmani), founder and CEO of Chicago-based VisualSP. "Alive and well. Oh, and heavily customized," added Daniel Glenn (@DanielGlenn), Technical Consultant at InfoWorks and president of the Nashville SharePoint User Group.
Not everyone sees it that way. Some participants say the move toward hybrid deployments is gaining traction and is a sign that SharePoint in the datacenter has peaked. "SharePoint OnPrem is trending down, but still steady and above 70 percent -- there is room to grow still," tweeted Jeff Shuey (@jshuey), chief evangelist at K2, an ISV that provides workflow apps for SharePoint.
Barry Jinks (@bjinks), CEO of collaboration app provider Colligo, argued that the economies of Office 365 are compelling to many customers. "Eventually enterprises will move there," Jinks tweeted. "Just going to take way longer than hoped."
Buckley, the moderator and principal consultant with GTConsult, noted that while Microsoft may want everyone to move to the cloud, enterprises have too much invested in their on-premises SharePoint deployments. "SP on-prem CAN'T be killed by MSFT or anyone, only supplanted as cloud gets ever better," he tweeted. "Our Enterprise customers are looking at Hybrid. Still loving the on-prem #SharePoint as they have hefty investments there," said Gina Montgomery (@GinaMMontgomery), strategic director for Softmart, where she manages its Microsoft practice.
"IT [and collaboration tools] are evolving much faster than a 3 year DVD release cycle," said SharePoint and Office 365 Architect Maarten Visser (@mvisser), managing director of meetroo. " SharePoint OnPrem gets old quickly."
Asked if hybrid SharePoint deployments are becoming the new norm, the experts argued the hype doesn't match what they're seeing from their customers. "I don't think it will be a norm as much as what will be the best fit to meet requirements," said Stacy Deere-Strole (@sldeere), owner of SharePoint consultancy Focal Point Solutions.
"MSFT want it to be [the new norm]," observed SharePoint MVP Jason Himmelstein (@sharepointlhorn), Sr. Tech Director at Atrion. "Like with much of what we see coming out of Redmond it will be a bit before the desire matches the reality."
Yet many acknowledged that customers are moving to hybrid deployments, or are in the process of planning to do so. "The story for OnPremises #SharePoint only gets better when you can work seamlessly with the cloud #SPO -- Hybrid is a must," said SharePoint MVP Fabian Williams (@FabianWilliams). "Is hybrid the right idea? DAMN RIGHT. Move the right workloads for the right reasons," Himmelstein added.
"Yes, hybrid is becoming the norm for enterprises as well now. It just makes sense," Rehmani added. "Hybrid brings conservative customers the stability they need and allows them to experiment in the cloud," said Visser. "That's why SharePoint 2016 will be all about hybrid to force the transition," said MVP Michael Greth (@mysharepoint), based in Berlin. "Soon -- complex enterprise landscape will require a balance that hybrid can provide," tweeted Michelle Caldwell (@shellecaldwell), a director at integrator Avanade and a founder of the Buckley, Ohio SharePoint User Group. "Many are still planning and dabbling."
Williams added: "Hybrid can be considered normal because you need a 'bridge' to work between SPO & ONPrem since not all features are on both," he tweeted.
Many are also looking forward to hearing about new management features coming in SharePoint 2016. "This will be the super exciting stuff at @MS_Ignite," said Dan Holme, co-founder & CEO of IT Unity, based in Maui. "I believe it will be the differentiator over O365," Glenn said. "But O365 will absorb some (if not all) of it via Azure services over time." Buckley is looking forward to hearing more from Microsoft on this. "There has always been a gap for management across SP farms, much less hybrid," he said. "Will be interesting to see what is coming next."
What is it about SharePoint 2016 you are looking forward to hearing about?
Posted by Jeffrey Schwartz on 02/19/2015 at 7:31 AM
The latest Silicon Valley startup looking to ride the wave of cloud-based software-defined datacenters (SDDCs) and containerization has come out of stealth mode today with key financial backers and founders who engineered the VMware SDDC and the company's widely used file system.
Whether Sunnyvale, Calif.-based Springpath will rise to prominence remains to be seen, but the company's hyper-converged infrastructure aims to displace traditional OSes, virtual machines (VMs) and application programming models. In addition to the VMware veterans, Springpath is debuting with $34 million in backing from key investors with strong track records. Among them is Sequoia Capital's Jim Goetz, whose portfolio has included such names as Palo Alto Networks, Barracuda Networks, Nimble Storage, Jive and WhatsApp. Also contributing to the round are New Enterprise Associates (NEA) and Redpoint Ventures.
Springpath's founders have spent nearly three years building their new platform, which they say will ultimately host, manage and protect multiple VMs, server OSes, apps and application infrastructure including Docker containers. The company's namesake Springpath Data Platform in effect aims to let organizations rapidly provision, host and virtualize multiple VMs (VMware, Hyper-V and KVM), compute and application instances, storage pools and distributed file systems, while managing and protecting them on commodity servers and storage.
Founders Mallik Mahalingam and Krishna Yadappanavar are respectively responsible for the development of VMware VXLAN, the underpinnings of the software-defined network (SDN), and VMFS, the widely deployed file system in VMware environments. The two believe the new distributed subscription-based datacenter platform they're rolling out will reshape the way enterprises develop and host their applications in the future.
Springpath's software runs on any commodity server and storage hardware and is offered in a Software-as-a-Service (SaaS) subscription model. The hosting and systems management platform costs $4,000 per server per year and ties into existing enterprise management systems. Not surprisingly, it initially supports VMware vCenter, but it will also run as a management pack in Microsoft System Center.
The platform is based on what Springpath calls its Hardware Agnostic Log-structured Objects (HALO) architecture, which purports to offer a superior method for provisioning data services, managing storage and offering a high-performance and scalable distributed infrastructure. Rather than requiring customers to buy turnkey appliances, the company is offering just the software. It's currently supported on servers offered by Cisco, Dell, Hewlett-Packard and Supermicro. Customers can deploy the software themselves or have a partner run it on one of the supported systems. Springpath has forged a partnership with mega distributor Tech Data to provide support for the platform.
Ashish Gupta, Springpath's head of marketing, described the HALO architecture in an interview. HALO consists of distribution, caching, persistence optimization and structured object layers, Gupta explained. "Married to all of this are the core data services that all enterprises are going to need from a management perspective like snapshots, clones, compression [and so on]," he said. "The idea here is you can put the Springpath Data Platform on a commodity server and then scale and essentially give core capabilities that can replace your array-based infrastructure. You will no longer need to buy expensive multimillion-dollar arrays and isolate yourself in that environment. You are buying commodity servers, putting the software on top, and you're going to get all the enterprise capabilities and functionality that you expect."
The hardware-agnostic distribution layer, for example, enables enterprises to take advantage of all the underlying hardware to support an application, he added. "The platform can be running on N number of servers. We can take advantage of all the resources underneath the servers, be it the disk, be it the memory or the flash environment and essentially present that to the applications that are supported by the platform."
In that context the applications can run in a virtualized environment from VMware, Hyper-V or KVM, and can be supported in containerized environments. Gupta noted Springpath is also providing its Platform-as-a-Bare-Metal offering. "So it can look like a traditional storage device, except it can scale seamlessly in a bare metal deployment," he said. Gupta added Springpath has its own file system and supports either block objects or Hadoop plug-in infrastructures. "In essence, we can give you a singular platform for what your application needs," he said.
While it's not the only hyper-converged platform available, it is the first potentially major one that is neither tied to a specific hardware platform nor offered as an appliance, said Chuck Bartlett, senior VP for Tech Data's advanced infrastructure solutions division, which is working to line up systems integration partners to offer the platform. "The fact that it is compute-agnostic, meaning the end-user client can use the server platform they have or want to implement, is unique and compelling."
Looking ahead, Springpath's Gupta sees HALO targeting emerging Web-scale applications and distributed programming environments built in programming languages such as Node.js, Scala, Akka or Google Go. The initial release is designed to support VMware environments and OpenStack Horizon-based clouds, though plans call for supporting Microsoft Azure, Hyper-V and System Center this summer.
Posted by Jeffrey Schwartz on 02/18/2015 at 4:02 PM0 comments
President Obama issued an executive order aimed at persuading companies that suffer breaches to share information, in an effort to enable a more coordinated response to cyberattacks. Though the order stops short of mandating that they do so, the president is also introducing legislation that would pave the way for greater information sharing between the private sector and government agencies, including the Department of Homeland Security. The legislation also calls for modernizing law enforcement authorities to fight cybercrime and creating a national breach reporting standard.
The order, signed today by the president at the Cybersecurity Summit at Stanford University in Palo Alto, Calif., sets the stage for the latest round of debate over how to protect the nation's infrastructure and consumer information without compromising privacy and civil liberties. Obama's push to promote information sharing, which could yield better threat intelligence and better methods of responding to attacks, nonetheless won't sit well with organizations that are loath to share for fear of liability and business impact.
Specifically, the president has proposed the formation of information sharing and analysis organizations (ISAOs). These private sector groups would share information and collaborate on cybersecurity issues, modeled on the existing sector-based Information Sharing and Analysis Centers (ISACs). The proposal extends the information-sharing executive order Obama issued two years ago to the day, outlined in that year's State of the Union address, which led to the release of last year's Cybersecurity Framework.
Since then, of course, cyberattacks have become more frequent and severe, from the late-2013 Target breach to last year's major attacks against Apple, Home Depot, the IRS and Sony, and now this year's compromise of customer information at Anthem, the nation's second-largest health insurer.
Obama also met today with key industry executives at the summit, including Apple CEO Tim Cook and Intel president Renee James. Besides Cook, however, top CEOs were conspicuous by their absence, including those of Facebook, Google, IBM, Microsoft and Yahoo.
The order also seeks to let law enforcement agencies prosecute those who sell botnets, while making it a crime to sell stolen U.S. financial information such as credit card and account numbers to anyone overseas. It will also give federal law enforcement agencies authority to go after those who sell spyware and give courts the authority to shut down botnets.
Several key IT providers and large at-risk companies attending the summit announced their support for the framework, including Intel, Apple, Bank of America, U.S. Bank, Pacific Gas & Electric, AIG, QVC, Walgreens and Kaiser Permanente, according to a fact sheet released by the White House.
While some just announced support for the framework, Intel released a paper outlining its use and said it is requiring all of its vendors to use it as well. Apple said it's incorporating the framework as part of its broader security across its networks. Bank of America is also requiring its vendors to use the framework, while insurance giant AIG said it is incorporating the NIST framework into how it underwrites cyber insurance for businesses of all sizes and will use it to help customers identify gaps in their approach to cybersecurity.
The White House also said several members of the Cyber Threat Alliance, which includes Palo Alto Networks, Symantec, Intel and Fortinet, have formed a cyberthreat-sharing partnership that aims to create standards aligned with the information-sharing order. Box also plans to participate in creating standards for ISAOs and to use its platform to extend collaboration among them, according to the White House. Further, FireEye is launching an Information Sharing Network that will let its customers receive threat intelligence, including anonymized indicators, in near real time.
Several companies are also announcing efforts to extend multifactor authentication. Intel is releasing new authentication technology that seeks to make biometrics a more viable alternative to passwords, and credit card providers and banks, including American Express, MasterCard and its partner First Tech Federal Credit Union, are advancing efforts to pilot or roll out new multifactor authentication methods such as biometrics and voice recognition.
Much of the buzz is about the tech CEOs who failed to attend, but today's event at Stanford produced some potentially significant commitments by companies, and proposals by the president that will certainly carry the debate from Silicon Valley to the Beltway.
What's your take on the president's latest executive order?
Posted by Jeffrey Schwartz on 02/13/2015 at 12:48 PM0 comments
The popular Sysinternals site, which Microsoft acquired nearly a decade ago along with its troubleshooting utilities, tools and help files, is now SSL-enabled. Mark Russinovich, the site's co-creator and steward and now Microsoft's Azure CTO, tweeted the news earlier this week.
Microsoft and many others are moving their Web sites to SSL -- the long-established Secure Sockets Layer standard used for encrypted Web sessions. Long enabled on sites where financial transactions and other secure communications are necessary, the move from HTTP to HTTPS sessions is spreading rapidly as the hazards of intercepted communications rise.
If you're asking why a site that just hosts documentation would need an HTTPS connection, consider that there are lots of executables there as well. Though all the binaries are signed, using SSL to access the tools via the online share prevents man-in-the-middle tampering in cases where the user doesn't validate the signature before launching the tool.
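To put a finer point on what HTTPS buys you here: a TLS client that validates certificates will refuse a spoofed server outright. A minimal illustration using only Python's standard library (no connection is made; this just shows the defaults every validating client relies on):

```python
import ssl

# A default-configured TLS context already enforces the two checks that
# defeat man-in-the-middle substitution of a downloaded binary: the
# certificate chain must validate, and the certificate must match the
# hostname being contacted.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: chain validation on
print(ctx.check_hostname)                    # True: hostname check on
```

Downloading a tool over plain HTTP skips both checks, which is exactly the gap the Sysinternals change closes.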
Posted by Jeffrey Schwartz on 02/12/2015 at 1:28 PM0 comments
Microsoft's announcement back in October that it has partnered with Docker to enable Linux containers to run on Windows was an important step toward what promises to be the next wave in computing beyond virtualization. While things can change on a dime, it looks like Microsoft is going all in on a new computing model -- widely endorsed by IBM, Google, VMware and others -- based on application portability and more efficient use of compute, storage and network resources.
It sounds quite grand but so did virtualization -- and the idea of consolidating server resources -- when it hit the scene a decade ago. Of course, the proof will be in the implementation. It's very likely we'll hear about how to enable Linux containers in Windows Server at the upcoming Build and Ignite conferences in late April and early May, as Microsoft Distinguished Engineer Jeffrey Snover hinted last week.
"We're also going to talk about containers, Docker containers for Windows," Snover said. "There will be two flavors of the compute containers. There'll be a compute container focused in on application compatibility, so that will be [a] server running in a container, and then there will be containers optimized for the cloud. And with those containers you'll have the cloud-optimized server."
Those wanting to start running Linux containers in Azure can start now, based on documentation posted by Microsoft yesterday. "Docker is one of the most popular virtualization approaches that uses Linux containers rather than virtual machines as a way of isolating data and computing on shared resources," according to the introduction. "You can use the Docker VM extension to the Azure Linux Agent to create a Docker VM that hosts any number of containers for your applications on Azure."
The documentation explains what the Docker VM extension does and how to use it to create and manage containerized applications on an Azure virtual machine.
In its description of Docker containers, the documentation notes that they are currently one of the most popular virtualization alternatives to virtual machines because they isolate data and computing on shared resources, letting developers build apps once and deploy them across Docker hosts running in different environments.
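As a rough sketch of the workflow the documentation describes, the cross-platform Azure CLI of the time exposed a `vm docker create` command that provisioned a VM with the Docker extension in one step. Everything below is illustrative: the host and image names are placeholders, and the TCP port reflects the extension's early default, so check the current documentation before relying on any of it.

```shell
# Placeholder names -- substitute your own DNS name, image and credentials.
HOST_NAME="my-docker-host"
IMAGE="Ubuntu-14_04-LTS"   # placeholder image alias, not a real image name

# One command creates the VM and installs the Docker VM extension:
CREATE_CMD="azure vm docker create --vm-size Small $HOST_NAME $IMAGE azureuser"
echo "$CREATE_CMD"

# Afterward, a local Docker client manages containers on the new host
# over TLS (4243 was the extension's early default Docker port):
DOCKER_CMD="docker --tls -H tcp://$HOST_NAME.cloudapp.net:4243 info"
echo "$DOCKER_CMD"
```

The point of the extension is the second half: once the host exists, any standard Docker client can drive it remotely rather than requiring a login to the VM itself.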
As I noted earlier in the week, DH2i is now offering a platform that enables containers to run in Windows Server -- the difference being that its containers are Windows-based, not Linux-based, though the company says they will work with Docker containers as well.
But if you're looking to start with Docker in Azure, Microsoft is making the push.
Posted by Jeffrey Schwartz on 02/12/2015 at 1:27 PM0 comments
Microsoft's new ExpressRoute service could emerge as a key piece of its hybrid cloud story for customers wary of using the public Internet to link their private datacenters to Azure. ExpressRoute, introduced at last May's TechEd conference in Houston, effectively provides dedicated links that are faster, more reliable and more secure. To encourage customers and partners to try it out, Microsoft is offering the service free of charge through the end of June.
The offer covers Microsoft's ExpressRoute 10 Mbps Network Service Provider (NSP) tier for new and existing customers in all Azure regions where ExpressRoute is currently offered. Several of Microsoft's telecommunications partners that offer ExpressRoute are joining in the promotion, including British Telecom, Colt and Level 3. Also, AT&T is offering six-month trials of its new NetBond service, and Verizon is providing six months' use of its Secure Cloud Interconnect offering to first-time users of its basic data plan who sign two-year agreements for up to 1TB per month.
"ExpressRoute gives you a fast and reliable connection to Azure, making it suitable for scenarios such as data migration, replication for business continuity, disaster recovery, and other high-availability strategies," wrote Sameer Sankaran, a senior business planner within Microsoft's Azure group. "It also allows you to enable hybrid scenarios by letting you seamlessly extend your existing datacenter to Azure."
The service is especially complementary for services like Azure Site Recovery, which provides disaster recovery services using Azure targets and Hyper-V replication and for applications requiring private or more reliable links than using an Internet connection.
ExpressRoute is designed to connect on-premises resources such as physical and virtual server farms, storage, media services and Web sites, among other services. The service requires you to order circuits via one of the connectivity partners. Customers can choose either a direct layer 3 connection via an exchange provider or a standard layer 3 link from an NSP. Customers can enable one or both types through their Azure subscriptions but must configure both to connect to all supported services.
Posted by Jeffrey Schwartz on 02/11/2015 at 1:28 PM0 comments
A little-known startup that offers data protection and SQL Server migration tools today released what it calls the first native container management platform for Windows Server, claiming it can move workloads between virtual machines and cloud architectures. DH2i's DX Enterprise encapsulates Windows Server application instances into containers, removing the association between the apps, their data and the host operating systems tied to physical servers.
The Fort Collins, Colo.-based company's software is a lightweight 8.5 MB server installation that offers a native alternative to Docker containers, which are Linux-based, though Microsoft and Docker are working to bring containers to Windows, as announced last fall. In addition to its relationship with Microsoft, Docker has forged ties with all major infrastructure and cloud providers, including Google, VMware and IBM. Docker and Microsoft are jointly developing a container technology that will work on the next version of Windows Server.
In his TechDays briefing last week, Microsoft Distinguished Engineer Jeffrey Snover confirmed that the company will include support for Docker containers in the next Windows Server release, known as Windows vNext.
DH2i president and CEO Don Boxley explained why he believes DX Enterprise is a better alternative to Docker, pointing to that fact that it's purely Windows Server-based.
"When you look at a Docker container and what they're talking about with Windows containerization, those are services that they're looking at then putting some isolation kind of activities in the future," Boxley said. "It's a really important point that Docker's containers are [for] containerized applications. Yet there are still going to be a huge amount of traditional applications [running] simultaneously. We'll be able to put any of those application containers inside of our virtual host and have stop-start ordering or any coordination that needs to happen between the old type of applications and the new and/or just be able to manage them in the exact same way. It forces them to be highly available and extends [that] now to a containerized application."
The company's containers, called "Vhosts," each have their own logical host name, associated IP addresses and portable native NTFS volumes. The Vhost's metadata assigns container workload management, while directing the managed app to launch and run locally, according to the company. Each Vhost shares a single Windows Server operating system instance, and Vhosts can be stacked on either virtual or physical servers. This results in a more consolidated way of managing application workloads and enables instance portability, Boxley explained.
Unlike Docker, there are "no companion virtual machines running Linux, or anything like that at all," Boxley said. "It's just a native Windows application, you load it onto your server and you can start containerizing things right away. And again, because of that universality of our container technology, we don't care whether or not the server is physical, virtual or running in the cloud. As long as it's running a Windows Server OS, you're good to go. You can containerize applications in Azure and in Rackspace and Amazon, and if the replication data pipe is right, you can move those workloads around transparently." At the same time, Boxley said it will work with Docker containers in the future.
Boxley said a customer can also transparently move workloads between any virtual machine platform including VMware, Hyper-V and Xen. "It really doesn't matter because we're moving the applications not the machine or the OS," he said. Through its management console, it automates resource issues including contention among containers. The management component also provides alerts and ensures applications are meeting SLAs.
Asked why DH2i chose Windows Server as the platform for DX Enterprise, Boxley said he believes it will remain the dominant environment for virtualized applications. "We don't think -- we know it's going to grow," he said. IDC analyst Al Gillen said that's partly true, though Linux servers will continue to grow in physical environments. Though he hasn't tested DX Enterprise, Gillen said the demo looked promising. "For customers that have an application that they have to move and they don't have the ability to port it, this is actually a viable solution for them," Gillen said.
The software is also a viable option for organizations looking to migrate applications from Windows Server 2003, which Microsoft will no longer support as of July 14, 2015, to a newer environment, Boxley said. DX Enterprise is priced at $1,500 per server core, regardless of the number of CPUs (if running on a virtual machine, it can be licensed via the underlying cores). Support, including patches, costs $360 per core per year.
Boxley said the company is self-funded and started out as a Microsoft BizSpark partner.
Posted by Jeffrey Schwartz on 02/10/2015 at 12:15 PM0 comments
The White House late last week said it has named Tony Scott as its CIO, making him only the third person charged with overseeing the nation's overall IT infrastructure. Scott, most recently VMware's CIO, previously held the same role at Microsoft and Disney and was CTO at GM.
Scott's official title will be U.S. CIO and Administrator of OMB's Office of Electronic Government and Information Technology, succeeding Steve VanRoekel. The first CIO was Vivek Kundra, who launched the government's Cloud First initiative. The Obama Administration will task Scott with implementing its Smarter IT Delivery agenda outlined in the president's 2016 proposed budget.
I actually first met Scott when he was GM's CTO in the late 1990s when he spoke at a Forrester conference about managing large vendors, which included Microsoft, Sun Microsystems, Oracle and numerous others including some startups during the dot-com era. I later caught up with him more than a decade later attending Microsoft's Worldwide Partner Conference in 2010 in Washington, D.C.
Among his key initiatives at the time as Microsoft's CIO was enabling the internal use of new technologies the company had recently brought to market, among them Azure. "I think we've done what Microsoft always has done traditionally, which is we try to dog-food our own stuff and get the bugs out and make sure the functionality is there," he said during an interview at WPC, though he qualified that by adding: "We'll move them or migrate them as the opportunity arises and as the business case makes sense."
Nevertheless he was known as a proponent of cloud-enabling internal applications as quickly as possible. Scott's tenure at Microsoft ran from 2008 until 2013 and he has spent the past two years at VMware.
Posted by Jeffrey Schwartz on 02/09/2015 at 12:49 PM0 comments
The news that the next versions of Windows Server and System Center won't come until next year caught off guard many who were under the impression they would arrive later this year. With that announcement, and the promise of a new version of SharePoint Server later this year, Microsoft brought its enterprise product roadmap into fuller view.
This latest information came at a fortuitous time for my colleague Gladys Rama, who was putting the finishing touches on the updated 2015 Microsoft Product Roadmap for sister publication Redmond Channel Partner. Check it out if you want to know planned release dates for anything noteworthy to an IT pro from Microsoft.
As for the delay of Windows Server v.Next, ordinarily it would seem par for the course. But coming after Microsoft released Windows Server 2012 R2 just a year after Windows Server 2012 and signaled it was moving toward a faster release cadence, it was more surprising. Whether by design or otherwise, the news removes a key decision point for IT pros who were considering waiting for the new Windows Server to come out before migrating their Windows Server 2003-based systems.
As soon as Microsoft got the word out that the new Windows Server is on next year's calendar, it issued another reminder that Windows Server 2003's end of support is less than six months away. Takeshi Numoto, corporate VP for cloud and enterprise marketing, gave the latest nudge this week in a blog post once again warning of the risks of running the unsupported operating system after the July 14 deadline.
"Windows Server 2003 instances will, of course, continue to run after end of support," he noted. "However, running unsupported software carries significant security risks and may result in costly compliance violations. As you evaluate security risks, keep in mind that even a single unpatched server can be a point of vulnerability for your entire infrastructure."
Microsoft has urged customers to migrate to Windows Server 2012 R2 and, where customers feel it makes sense, consider a cloud service such as Office 365 to replace Exchange Server on-premises as well as Azure or other cloud infrastructure or platform services to run database applications, SharePoint and other applications.
Did the news that Windows Server v.Next is slipping to next year have any impact on your Windows Server 2003 migration plans? Or was the prospect of it possibly coming later this year too close for comfort for your planning purposes?
Posted by Jeffrey Schwartz on 02/06/2015 at 4:05 PM0 comments
Goverlan last week said it's giving away its GUI-based Windows Management Instrumentation (WMI) tool for remote desktop management and control. The company's WMI Explorer (WMIX) lets IT pros with limited scripting or programming skills perform agentless remote administration of Windows-based PCs.
The free tool -- an alternative to Microsoft's own command line interface, WMIC -- leverages WMI, a stack of Windows driver component interfaces supporting key standards including WBEM and CIM. WMI is built into all versions of Windows, allowing administrators to deploy scripts that manage PCs and some servers remotely.
According to Goverlan, WMIX includes a WMI Query Wizard that will appeal to administrators with limited scripting or coding skills: it lets them create sophisticated WMI Query Language (WQL) queries, with a filtering mechanism that scopes results to the specified remote systems. The WMIX GUI also lets administrators automatically generate Visual Basic scripts, complete with parameters, to produce scripts and reports. It can also create WMI-based Group Policy Objects (GPOs) and perform agentless system administration.
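For readers unfamiliar with WQL, it is a small SQL-like dialect, and a query wizard essentially assembles statements of the following shape. A quick sketch (the `build_wql` helper is illustrative, not part of WMIX; `Win32_LogicalDisk` is a standard WMI class, and `DriveType = 3` selects fixed local disks):

```python
def build_wql(wmi_class, properties, where=None):
    """Assemble a WQL SELECT statement of the kind a query wizard emits."""
    query = "SELECT {} FROM {}".format(", ".join(properties), wmi_class)
    if where:
        query += " WHERE " + where
    return query

# Fixed local disks with their free space -- a classic admin query:
q = build_wql("Win32_LogicalDisk", ["Name", "FreeSpace"], "DriveType = 3")
print(q)  # SELECT Name, FreeSpace FROM Win32_LogicalDisk WHERE DriveType = 3
```

The wizard's value is that admins pick the class, properties and filter from a GUI instead of memorizing this syntax, then run the result against remote machines agentlessly.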
What's in it for the company to give away this free tool? Ezra Charm, Goverlan's vice president of marketing, noted that the company has never officially launched the tool nor has it significantly promoted it, yet it's popular among those who use it. "We are seeing a ton of interest," Charm said. Though many companies release free tools hoping to upsell customers to premium releases, Charm said the release of WMIX is primarily aimed at overall awareness for those who want to perform advanced WMI-based system administration functions and reports. Nevertheless, the effect would be the same.
WMIX is a component of the company's flagship systems administration tool, Goverlan Remote Administration Suite v8, which, depending on your environment, is an alternative or supplement to Microsoft's System Center Configuration Manager.
"At first, WMIX was implemented as an internal development tool to assist the integration of WMI technology within the Goverlan Client Management Tool, but we quickly realized that this product would be of great service to all Windows system administrators out there, as it would allow anyone without advanced scripting knowledge to consume this powerful technology," said Goverlan CEO Pascal Bergeot in a statement announcing the free tool's release.
Goverlan requires contact information to activate the WMIX download.
Posted by Jeffrey Schwartz on 02/03/2015 at 12:20 PM0 comments
It's been three years since VMware last upgraded its flagship hypervisor platform, but yesterday the company took the wraps off vSphere 6, which it said offers at least double the performance of its predecessor, vSphere 5.5. VMware describes the latest release as the "foundation for the hybrid cloud," thanks to the release of its OpenStack distribution and upgrades to components of the suite that integrate virtualized software-defined storage and networking.
The new wares, set for release by the end of March, will offer a key option for enterprise IT decision makers to consider as they choose their next-generation virtual datacenter and hybrid cloud platforms. With the new wave of releases, VMware is advancing and integrating its new NSX software-defined networking technology. VMware, to some extent, is also challenging the business model of its corporate parent EMC by offering new storage virtualization capabilities with its new Virtual SAN 6 and vSphere Virtual Volumes, which will enable virtualization of third-party storage arrays.
The move comes as VMware seeks to maintain its dominant hold on large datacenter installations looking to move to hybrid and public clouds as giants such as Microsoft, Google, IBM and Amazon Web Services are looking to position their cloud platforms as worthy contenders. In what appeared to be a strategically timed move, Microsoft published its Cloud Platform roadmap, as reported, just hours before the VMware launch event.
With this release, it now remains to be seen whether VMware can successfully leverage the virtualized server stronghold it has with its network and storage virtualization extensions to its public cloud, vCloud Air, as Microsoft tries to lure those customers away with its Cloud OS model consisting of Windows Server, Hyper-V and Microsoft Azure. Despite Microsoft's gains, VMware is still the provider to beat, especially when it comes to large enterprise installations.
"VMware's strength remains their virtualization installed base, and what they're doing through NSX is building that out into cloud environments," said Andrew Smith, an analyst at Technology Business Research. "VMware realizes that they need to gain ground, especially in private cloud deployments, so they're going to use NSX to tie into security along with vSphere, to really try and take the hybrid cloud space by storm. And I think with updates to vSphere and the integration with OpenStack it's all pieces of the puzzle coming together to make that a reality to customers."
OpenStack Cloud Support
The VMware Integrated OpenStack (VIO) distribution, the company's first OpenStack distro, includes APIs that provide access to OpenStack-enabled public and private clouds and VMware vSphere infrastructure. "VIO is free to vSphere Enterprise Plus customers and comes as a single OVA file that can be installed in fewer than 15 minutes from the optional vSphere Web client. VIO support, which includes support for both OpenStack and the underlying VMware infrastructure, is charged on a per-CPU basis," said Tom Fenton, a senior validation engineer and lab analyst with IT analyst firm Taneja Group, in a commentary published on our sister site Virtualizationreview.com.
"VMware brings a lot to the OpenStack table with VIO," according to Fenton. "Many common OpenStack tasks are automated and can be performed from vCenter. vRealize Operations is able to monitor OpenStack, and LogInsight can parse OpenStack logs to separate the considerable amount of log noise from actionable items." The new NSX software-defined networking infrastructure will enable ties to OpenStack-compatible clouds, as well as VMware's own vCloud Air public cloud.
"The company now has offerings that address all layers of the Kusnetzky Group virtualization model, including access, application, processing, storage and network virtualization, as well as both security and management for virtualized environments, sometimes called software-defined environments," wrote analyst Dan Kusnetzky in his Dan's Take blog post.
Hypervisor Improvements with vSphere 6
Martin Yip, a VMware senior product manager, said in a company blog post announcing vSphere 6 that it has more than 650 new features, increased scale, performance and availability, storage efficiencies for virtual machines (VMs) and simplified datacenter management. "vSphere 6 is purpose-built for both scale-up and scale-out applications including newer cloud, mobile, social and big data applications," Yip noted.
Compared with the existing vSphere 5.5, vSphere 6 doubles the hosts per cluster to 64 and doubles the VMs per cluster; it supports 480 CPUs per host versus 320, triples the RAM at 12TB per host and quadruples the VMs per host at 2,048. It also doubles the number of virtual CPUs per VM to 128 and quadruples the amount of virtual RAM per VM to 4TB, according to Yip.
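The multipliers Yip cites pin down the vSphere 5.5 baselines by simple division. A quick sanity check (the 5.5 figures below are back-computed from the ratios in the text, not taken from VMware's configuration-maximums guide):

```python
# (vSphere 5.5 baseline, vSphere 6 maximum) implied by the stated ratios.
limits = {
    "CPUs per host":     (320, 480),   # stated outright
    "RAM per host (TB)": (4, 12),      # "triples the RAM"
    "VMs per host":      (512, 2048),  # "quadruples the VMs per host"
    "vCPUs per VM":      (64, 128),    # "doubles the virtual CPUs per VM"
    "vRAM per VM (TB)":  (1, 4),       # "quadruples the virtual RAM"
}
ratios = {k: v6 // v55 for k, (v55, v6) in limits.items()}
print(ratios["VMs per host"], ratios["vCPUs per VM"])  # 4 2
```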
The starting price for vSphere 6 is $995 per CPU. vSphere with Operations Management 6 starts at $1,745 per CPU and vCloud Suite 6 starts at $4,995 per CPU.
Posted by Jeffrey Schwartz on 02/03/2015 at 12:19 PM0 comments
When Microsoft last month announced that it will offer Windows 10 as a free upgrade to Windows 7, Windows 8 and Windows 8.1 users, the company said the deal doesn't apply to enterprise users. It clarified that point late last week, saying the free upgrade is available to business users running the Pro editions, but that those wanting enterprise management capabilities should stick with, or move to, Software Assurance.
In a blog post outlining how Microsoft will release new Windows 10 features and fixes via its new Windows-as-a-service model, the company elaborated on the free offer. Simply put, it's available to anyone with Windows 7 Pro or Windows 8.x Pro. For those with Windows 7 Enterprise or Windows 8.x Enterprise who require the same "enterprise-grade capabilities, Windows Software Assurance (SA) will continue to offer the best and most comprehensive benefits," wrote Jim Alkove, a leader on the Windows Enterprise Program Management team.
While Microsoft could change what's offered in its forthcoming Pro and Enterprise editions, the latter will offer DirectAccess, BranchCache, AppLocker, support for VDI and the Windows To Go capability, according to a Microsoft description of the differences between the two SKUs. It's clear that Microsoft wants to get as many users as possible onto Windows 10, which requires users to upgrade within a year of its release to qualify for the free offer.
Posted by Jeffrey Schwartz on 02/02/2015 at 1:51 PM0 comments
Allaying concerns that Microsoft wasn't planning to develop any more on-premises versions of SharePoint, the company today said a new server release is scheduled for the second half of 2015. Microsoft's emphasis on SharePoint Online had many wondering at times whether the company was planning a new server release, although the company had indicated back in March that a new version was coming.
Despite its push to the cloud version, Microsoft has acknowledged and promoted the idea that the best way for organizations to transition is via a hybrid architecture providing connectivity between on-premises servers and Microsoft's SharePoint Online services. However, unlike the on-premises version, the Office 365 version of SharePoint Online doesn't support the trusted code and apps developed for SharePoint Server.
"While we've seen growing demand for SharePoint Online, we recognize that our customers have a range of requirements that make maintaining existing SharePoint Server deployments the right decision for some," said Julia White, general manager for Microsoft's Office Products division, in a blog post today. "We remain committed to meeting those needs. We're excited about the next on-premises version of SharePoint and we're sure you will be too. It has been designed, developed and tested with the Microsoft Software as a Service (SaaS) strategy at its core, drawing from SharePoint Online. With this, SharePoint Server 2016 will offer customers enhanced, flexible deployment options, improved reliability and new IT agility, enabled for massive scale."
The company didn't offer any details on what its plans are for SharePoint Server 2016 other than to say it will provide more details at its forthcoming Ignite conference in Chicago in the first week of May, although White insinuated that Microsoft would aim to extend the hybrid capabilities that the current on-premises and cloud versions offer.
"A hybrid SharePoint deployment can provide IT agility by creating a consistent platform spanning datacenters and cloud, simplifying IT and delivering apps and data to users on any device, anywhere," she said. "With SharePoint Server 2016, in addition to delivering rich on-premises capabilities, we're focused on robust hybrid enablement in order to bring more of the Office 365 experiences to our on-premises customers."
Microsoft is already jumpstarting that hybrid effort this year with the rollout of APIs and SDKs that aim to bridge the gap between the on-premises and cloud worlds, as noted in last month's Redmond magazine cover story. The topics to be discussed in sessions at Ignite run the gamut of SharePoint and Office 365 technologies, including Delve, OneDrive, Project, Visio and Yammer.
Posted by Jeffrey Schwartz on 02/02/2015 at 7:33 AM
Microsoft today released versions of its widely used Outlook mail, calendaring and contacts app for iPhones and iPads, along with a preview version for Android devices. More than 80 million iPad and iPhone users have downloaded Office, according to Microsoft.
"We have received tremendous customer request for Outlook across all devices, so we are thrilled to fulfill this for our customers," said Julia White, general manager for the Office product management team. Microsoft was able to develop the new Outlook apps thanks to code developed by Acompli, which Microsoft acquired last month, she noted.
I didn't see it in the Apple App Store earlier today, but a link Microsoft provided enabled an easy download of the Outlook app for the iPad. It looks a lot like the traditional Outlook client, though it clearly is stripped down. In the Settings folder (Figure 1) you can choose swipe options and browsers, and create a signature for each account or for all accounts. However, unlike the Windows Outlook client, you can only create generic signatures.
The mail client has a search button and allows you to see your folders (Figure 2). It also provides access to OneDrive, Dropbox and Google Drive storage accounts (Figure 3). In the configuration setting, it provides access to your Exchange account, Outlook.com, OneDrive, iCloud, Gmail, Yahoo Mail, Dropbox and Box (Figure 4).
Unfortunately, if you want to connect to any other POP- or IMAP-based service, you're out of luck. Microsoft didn't indicate whether that will change, though White noted that "you will see us continue to rapidly update the Outlook app, delivering on the familiar Outlook experience our customers know and love." For my testing purposes, I have a Yahoo Mail account that I use for less-critical purposes, which enabled me to test the new Outlook client.
Microsoft said the new Outlook app replaces Outlook Web Access for iOS and Android, though the company will keep those apps around for now because some advanced Exchange and Office 365 features still aren't available in the new Outlook app.
Microsoft also announced that its Office for Android tablets is now generally available in the Google Play store, replacing the preview versions of Word, Excel and PowerPoint. It joins the already available OneNote for Android. The company said native support for devices running Intel chipsets will come within the quarter.
Posted by Jeffrey Schwartz on 01/29/2015 at 11:26 AM
Microsoft is giving its Power BI analytics service an upgrade, adding connectivity to more data sources and support for iOS, and introducing a $9.99-per-user, per-month premium edition to be called Power BI Pro. The company will also offer a free version with limited functionality that will retain the Power BI name.
Power BI, a cloud-based business analytics service launched a year ago, was aimed at both technical and general business users. The browser-based software-as-a-service (SaaS) tool generates operational dashboards. As noted in our First Look at Power BI last year, this tool adds new functionality to existing Microsoft information management offerings, namely Excel 2013, Office 365 and SharePoint 2013.
Microsoft currently has three pricing tiers for Power BI, running as high as $52 per user per month when included with Office 365 ProPlus, $40 for a standalone version and $33 when added on to an Office 365 E3/E4 subscription. Starting Feb. 1, Microsoft will offer one paid subscription at the substantially reduced $9.99 price, roughly 75 percent less than the $40 standalone tier. The company will offer the free version when the upgrade becomes generally available.
The free version is limited to 1GB of data per month per user, whereas the paid subscription will allow up to 10GB, according to a comparison of the two options. Users of the paid version will also be able to stream 1 million rows of data per hour, compared with 10,000 rows for the free service. The paid Power BI Pro is required for features such as access to live data sources, the data management gateway and various collaboration capabilities, including sharing refreshable team dashboards, creating and publishing customized content packs, using Active Directory groups for sharing and managing access control, and sharing data queries through the data catalog.
With the new preview, users can sign in with any business e-mail address, initially only in the United States. Microsoft said it'll be available for those in other countries in the future. The new data sources supported include GitHub, Marketo, Microsoft Dynamics CRM, Salesforce, SendGrid and Zendesk, noted James Phillips, Microsoft's general manager for data experiences, in a blog post Tuesday. In the pipeline are Inkling Markets, Intuit, Microsoft Dynamics Marketing, Sage, Sumo Logic, Visual Studio Application Insights and Visual Studio Online, among others. "Power BI is 'hybrid' by design, so customers can leverage their on-premises data investments while getting all the benefits of our cloud-based analytics service," he said.
Microsoft also is offering a new tool called Power BI Designer, designed to let business analysts connect with, model and analyze data, he added, letting them easily publish results to any other Power BI user. The company also released a preview of Power BI for iPad, which can be downloaded from the Apple App Store. Phillips noted versions for iPhones, Android and Windows universal apps will be available later this year.
Posted by Jeffrey Schwartz on 01/28/2015 at 3:20 PM
One day after Microsoft delivered a disappointing quarterly earnings report, Apple on Tuesday did the complete opposite, posting its best quarter ever -- far exceeding expectations. In fact, Apple is said to have posted the most profitable quarter of any publicly traded company ever, buoyed by the sale of 74.5 million iPhones between Oct. 1 and Dec. 27.
Apple recorded $74.6 billion in revenues with a record $18 billion profit (a gross margin of 39.9%). This topped the previous record, set by ExxonMobil in the second quarter of 2012. The company's shares soared a day after Microsoft stock sank about 9% (making it a key contributor to a down day for the stock market on Monday).
It's not that Microsoft had a horrible quarter -- revenues of $26.5 billion were up 8% year over year -- but profits were down and the growth outlook for next quarter was considerably slower than Wall Street had anticipated. The results have led many to question whether Satya Nadella's honeymoon is over as he approaches his one-year anniversary as CEO. During his first year, Nadella made some major moves to turn Microsoft around, as Mary Jo Foley noted in her monthly Redmond magazine column posted this week.
In addition, Microsoft posted only a 5% increase in commercial licensing and an overall operating income decline of 2%. Impacting earnings were the restructuring of its Nokia business unit and the transition to cloud computing, the company said.
Microsoft also blamed the strong dollar and weakness in China and Japan, which could further impact the current quarter should the dollar continue to strengthen, CFO Amy Hood warned on the earnings call. Ironically, strong results in China were a key contributor to Apple's growth, while the company indicated only modest impact from the dollar's surge.
Among some other noteworthy improvements that Microsoft reported:
- 9.2 million Office 365 Home and Personal licenses were sold. This is up 30% since last quarter.
- $1.1 billion in Surface sales (24% increase in sales).
- 10.5 million Lumia smartphones sold, up 30 percent. However, Windows Phone still accounts for just 3 percent of the smartphone market.
But despite those improvements, Apple outshined Microsoft in a number of ways. Apple said it sold its 1 billionth iOS device in November. Average selling prices of iPhones increased $50 as customers opted for more expensive devices equipped with more storage. Not that any major market shifts were expected, but its huge spike in iPhone sales aided by the larger form factor of the latest models continues to leave Windows Phone in the dust.
For Microsoft, Windows Pro OEM revenue declined 13 percent, as did non-Pro revenue, while sales of Apple Macintoshes rose 14% to 5.5 million units. Apple App Store revenues also jumped 41 percent. One weak spot for Apple was the decline in iPad sales: 21.6 million units were sold, down from 26 million during the same period last year. The decline is not surprising given the release of the iPhone 6 Plus, which many might choose instead of an iPad Mini because it's only slightly smaller. Also, recent upgrades have offered only modest new features, giving existing users little incentive to replace the iPads they already have.
Still, the iPad is becoming a formidable device in the enterprise, and Apple CEO Tim Cook said on Tuesday's call that the company's partnership with IBM has helped boost its use in the workplace. "I'm really excited about the apps that are coming out and how fast the partnership is getting up and running," he said.
Many are expecting Apple to increase its enterprise push with the release of a larger iPad. For its part, Microsoft is rumored to have a new Surface device coming later this year, though the company has yet to confirm its plans.
Perhaps Cook is having a better week than Nadella but Microsoft has many other fish to fry where Apple is not in its path. Nadella's turnaround for Microsoft is very much still in progress. If the honeymoon is over, as some pundits suggest, then Nadella's success will ride on his ability to keep the marriage strong.
Posted by Jeffrey Schwartz on 01/28/2015 at 12:32 PM
When Microsoft released the newest Windows 10 Technical Preview on Friday, testers saw some major new features the company hopes to bring to an operating system designed to switch seamlessly between PC and tablet modes. Among the key new features are a new Start Menu and Cortana, the digital assistant that, until now, was only available on Windows Phone.
Along with those and other new features, the new Build 9926 takes a key step forward in showing the progress Microsoft is making to remove the split personality that epitomizes Windows 8.x. Microsoft is designing Windows 10 to launch desktop apps from the Windows Store interface and vice versa. Upon downloading the new build you'll want to look at the following:
Start Menu: One of the biggest mistakes Microsoft made when it rolled out Windows 8 was the removal of the popular Start Button. While the company brought some of its capabilities back with Windows 8.1, the new build of the Technical Preview introduces a new Start Menu that Windows 7 users who have avoided Windows 8.x should feel comfortable with. The Start Menu displays the apps you use most on the left side of your screen and lets you customize the rest of the page with tiles that can be sized however the user chooses and grouped based on preferences such as productivity tools and content. It can be viewed in desktop mode (Figure 1) or in the pure tablet interface (Figure 2).
Cortana: The digital voice assistant available to Windows Phone users is now part of the Windows PC and tablet environment, and it was the first thing I wanted to test upon downloading the new build. It wasn't able to answer many questions, though when asked who's going to win the Super Bowl, Cortana predicted the New England Patriots. We'll see how that plays out. Ask her for the weather forecast and she'll give you a brief answer. In other cases, my questions would initiate a Bing query that returned the search results in a browser view. Cortana is also designed to search your system, OneDrive and other sources based on your queries. Microsoft has warned that Cortana for Windows is still in early development, but it could emerge as a useful feature if it works as the company hopes. Like the Start Menu, Cortana works on the traditional desktop (Figure 3) or in tablet mode (Figure 4).
Continuum: A key design goal of Windows 10 is letting users transition between desktop and touch-based tablet modes. In either environment, you should be able to access desktop or Windows Store apps. For example, if you have downloaded Google's Chrome browser as a desktop app, in tablet mode it will appear as an app in that environment. In either case, you're accessing the same browser, just from a different interface.
Farewell Charms: Microsoft introduced Charms with Windows 8 as a hip new way of configuring machines, but many found them cumbersome and confusing. In the new build, Charms are gone, replaced by a new Settings component (Figure 5). As the name implies, Settings offers an easy way to customize the display, connect peripherals and configure networks.
New Windows Store: Microsoft is preparing a new store that has a common design for PC, tablet and phone users as well as those accessing it via the Web. The new Windows Store beta (Figure 6) appears as a gray icon, though the existing Windows Store is still available in green.
File Explorer: Many users have complained that the File Explorer in Windows 8.x doesn't allow for a default folder. Now when opening the File Explorer in the new preview, it can be set to open to a default folder (Figure 7).
Because this is still an early beta, you'll find bugs, and just because you see features here doesn't mean they'll end up in the shipping version this fall. If you've looked at this build, please share your opinions on the latest Windows 10 Technical Preview.
Posted by Jeffrey Schwartz on 01/26/2015 at 3:40 PM
After putting its plans to go public last year on hold, Box's widely anticipated IPO got out of the starting gate today with its shares up as much as 70 percent midday Friday. The company plans to use the estimated $180 million in proceeds to maintain operations and invest in capital infrastructure to grow its enterprise cloud offering.
Founder and CEO Aaron Levie launched the company in 2005 with the aim of offering an alternative to large premises-based enterprise content management systems. Over the years, Levie publicly put a target on the back of SharePoint. Levie's ambitions earlier this decade to establish Box as a SharePoint killer peaked before Office 365 and OneDrive for Business arrived. While Levie still has strong aspirations to become the primary storage and file sharing service for businesses, the market is more challenging now that Office 365 with OneDrive for Business, Google Drive and others are widely entrenched within enterprises.
For its part, Box has always targeted large enterprises, boasting such customers as GE, Toyota, Eli Lilly, Boston Scientific, Procter & Gamble, Chevron, Schneider Electric and Stanford University. Speaking on the New York Stock Exchange trading floor with CNBC, Levie emphasized that the best opportunity for the company lies with targeting large enterprises and mid-size firms with 500 to 1,000 employees.
Amid enthusiasm for Box, there's also plenty of skepticism among analysts. The company incurs large customer acquisition and retention costs, which include "customer success managers" assigned to customers for the life of the relationship, according to the company's S-1 filing with the Securities and Exchange Commission (SEC). Moreover, Box is unprofitable, with no target date for turning a profit in sight. According to the filing, Box recorded a $169 million loss for the fiscal year ended Jan. 31, 2014, with an accumulated deficit of $361 million.
Also in its filing, Box points to competitors including EMC, IBM and Microsoft (Office 365 and OneDrive), Citrix (ShareFile), Dropbox and Google (Google Drive). There are plenty of other rivals, both entrenched players and newcomers such as Acronis, Carbonite, ownCloud and CloudBerry Lab, among numerous others.
Now that Box believes it doesn't have to displace SharePoint and OneDrive for Business in order to succeed, the company last summer forged an agreement to collaborate with Microsoft. The collaboration pact ensured that Office 365 could use Box to store and synchronize files as an alternative to OneDrive, both of which now offer unlimited storage for paying customers. Microsoft and Box rival Dropbox forged a similar arrangement.
Box also offers APIs that let developers build applications using Box as the content layer, enabling content from a given application to be stored centrally and securely within Box's service. Salesforce.com and NetSuite are among those that have used the APIs to tie their offerings together. In addition, Box last month added a new enterprise mobility management service, Box for Enterprise Mobility Management (EMM), which fits into the company's new Box Trust effort. That initiative consists of a string of partnerships with security vendors and providers of data loss prevention tools. Symantec, Splunk, Palo Alto Networks, Sumo Logic and OpenDNS join existing partners Skyhigh Networks, Hewlett-Packard, Okta, MobileIron, CipherCloud, Recommind, Ping Identity, Netskope, OneLogin, Guidance Software and Code Green Networks.
It remains to be seen if Box and its chief rival Dropbox can go it alone or if they'll become attractive takeover candidates. Of course that will depend on their ability to grow and ultimately turn a profit. Do you see Box or Dropbox becoming your organization's primary file store and sharing platform?
Posted by Jeffrey Schwartz on 01/23/2015 at 12:27 PM
One of the surprises at yesterday's Windows 10 prelaunch event and webcast came when Microsoft showed off slick-looking eyewear designed to bring holography to the mainstream. Whether Google got word of it days earlier when it pulled its own failed Google Glass experiment off the market is unknown. But the irony of the timing notwithstanding, Microsoft's new HoloLens appears to have more potential.
Microsoft Technical Fellow Alex Kipman, who works in Microsoft's operating systems group and is known as the "father of Kinect," made the surprise introduction of HoloLens at the Windows 10 event. While he didn't say when it would come out or how much it will cost, he positioned it as a product that's designed around Windows 10 and the suggestion is we'll see it sometime later this year.
"Holographic computing enabled by Windows 10 is here," said Kipman. "Every Windows 10 device has APIs focused on humans and environment understanding. Holographic APIs are enabled inside every Windows 10 build from the little screens to the big screens to no screens at all." It has a built-in CPU and GPU, but doesn't require external markers, cameras, wires or a computer connection, he added, and it will blend the physical and digital worlds.
HoloLens is designed to overlay holograms on the wearer's existing environment, giving a 3D-like visual experience. While it surely will enhance Microsoft's gaming and entertainment portfolio, including Xbox and Minecraft, the company also underscored practical uses for HoloLens. In a video, the company described how HoloLens can let workers share ideas, collaborate, teach and learn in a more visually immersive way.
Unlike Google Glass, HoloLens appears to have practical use cases and may actually offer broader appeal. How broad will depend on price and how useful it ultimately is. But Microsoft has the advantage of seeing where Google Glass fell short, and it potentially has a larger ecosystem behind it. Perhaps it could even be the catalyst that brings developers of modern apps for other platforms into the fold.
Either way, Google hasn't thrown in the towel in this segment and it could prove to be a burgeoning market alongside other wearable gadgets. Kipman said Microsoft has worked on HoloLens in its research labs for many years, suggesting the demo wasn't just vaporware. It's yet another way Microsoft could draw demand for Windows 10 if users find HoloLens appealing. That could be the case if the price is right and it works as advertised.
Posted by Jeffrey Schwartz on 01/22/2015 at 12:28 PM
It's great that Microsoft will let Windows 7 and Windows 8.x users upgrade their systems to the new Windows 10 for free when it comes out this fall. But before you cheer too loudly, beware of the fine print: the deal doesn't apply to Windows Enterprise editions.
A Microsoft official earlier in the week told me that the company will have an event in March emphasizing the enterprise features of Windows 10. Hopefully Microsoft will reveal whether it will offer the free upgrade or some other incentive for enterprise users to upgrade. In the fine print discovered by my colleague Kurt Mackie, Microsoft noted the exclusions, which also include the small number of Windows RT users.
"It is our intent that most of these devices will qualify, but some hardware/software requirements apply and feature availability may vary by device," according to the explanation. "Devices must be connected to the Internet and have Windows Update enabled. ISP fees may apply. Windows 7 SP1 and Windows 8.1 Update required. Some editions are excluded: Windows 7 Enterprise, Windows 8/8.1 Enterprise, and Windows RT/RT 8.1. Active Software Assurance customers in volume licensing have the benefit to upgrade to Windows 10 Enterprise outside of this offer. We will be sharing more information and additional offer terms in coming months."
In many instances, new system rollouts could negate this issue. Will a free upgrade make or break your organization's decision to move to Windows 10?
Posted by Jeffrey Schwartz on 01/22/2015 at 12:22 PM
Microsoft potentially removed a crucial barrier to the future of its Windows franchise by saying it will offer the next version -- Windows 10 -- as a free upgrade to existing Windows 7 and Windows 8.x users. The company is also adding some compelling new features that may make the upgrade worth the effort if these new capabilities live up to their promise.
Speaking at the anticipated launch event in Redmond today, Terry Myerson, Microsoft's executive vice president of operating systems, announced the free upgrade. The caveat is that users must install Windows 10 within a year of its release, though it remains to be seen whether that deadline will hold. Perhaps for consumers, at whom today's event was aimed, that won't be a big deal. But businesses and enterprises do things on their own clocks, based on need and compatibility.
In an earlier post, I thought it would be a wise move to offer the free upgrade, though I had no knowledge Microsoft would ultimately do so. As part of the release, Microsoft is also shifting to what it calls Windows as a service, where it will provide continuous upgrades.
"When it comes to Windows as a service, it's a pretty profound change," Microsoft CEO Nadella said at today's event. "For customers, they're going to get a continuous stream of innovation. Not only a continuous stream of innovation but also the assurance their Windows devices are secure and trusted. For developers, it creates the broadest opportunity to target. For our partners, hardware and silicon partners, they can coincident with our software innovation, drive hardware innovation. We want people to love Windows on a daily basis."
Microsoft gave a number of reasons to "love" Windows besides the free upgrade. The company announced the rumored Spartan Web browser, which has a rendering engine better suited for modern Web applications, Myerson said. Microsoft will also offer Cortana, the digital assistant released for Windows Phone last year, for Windows running on PCs and tablets.
Officials also demonstrated universal apps such as Word, Excel, PowerPoint, OneNote and Outlook, which are optimized for each form factor yet consistent across devices, designed to let users stop working on one device and quickly pick up where they left off on another. Also announced was an Xbox app for Windows that will let Xbox users run games on their PCs or Windows tablets.
Microsoft also revealed its vision for augmented reality and took the wraps off HoloLens, which, ironically, resembles Google Glass and, Microsoft said, has a built-in CPU, GPU and an array of sensors. Microsoft described it as the world's first holographic computer. Its APIs are designed to work with the new Windows 10 environment.
More hardware is in the works from third parties and Microsoft. The event showcased the new Surface Hub, a Windows 10-based 84-inch Ultra HD display with Skype for Business built-in, sensors, cameras and the ability to mark up content with any phone or device. The company will also offer a 55-inch version and indicated other Surface hardware is in the works.
The company will release a new Windows 10 technical preview next week with a Windows 10 build for phones scheduled for release in early February. Many of the new features Microsoft demonstrated today will work their way into builds of the technical preview over the next three to five months, said Joe Belfiore, a vice president in the operating system group. Microsoft also plans to reveal more features for enterprises in March, according to a company official. The company still plans for the commercial release of Windows 10 in the fall timeframe.
Posted by Jeffrey Schwartz on 01/21/2015 at 2:29 PM
Mike Culver, who served in a number of strategic roles at Amazon Web Services from the launch of the company's popular public cloud, lost his battle with pancreatic cancer this week. He was 63. Culver, who before joining AWS was a technical evangelist at Microsoft in the early days of the .NET Framework rollout, was deeply respected in Redmond and throughout the industry.
In his roles at AWS, Culver trained early adopters in the emerging cloud industry in how to build and deploy apps on EC2 and scale them using the company's Simple Storage Service (S3). "Mike was well known within the AWS community," wrote AWS evangelist Jeff Barr, who had shared an office with Culver years back. "He joined my team in the spring of 2006 and went to work right away. Using his business training and experience as a starting point, he decided to make sure that his audiences understood that the cloud was as much about business value as it was about mere bits and bytes."
Culver spoke at many AWS and industry events, including Visual Studio Live. I met Culver at Visual Studio Live in 2008, where he gave a session on how to scale ASP.NET applications with cloud-based content delivery. At the time Culver was head of developer relations at AWS. Keep in mind, this was before Microsoft officially announced Azure, and AWS S3 was still new. I was quite impressed by his presentation and sat down with him. Though that was the only time we met, we became friends on Facebook and occasionally commented on one another's posts. I'm quite saddened that he lost his battle, both for his wife, grown children and siblings, and for the many colleagues who clearly had deep admiration and respect for him.
When he was diagnosed with pancreatic cancer in 2013, Culver was quite candid about his treatment but kept an upbeat yet realistic worldview about his battle. Pancreatic cancer is among the deadliest of cancers; I lost my father nearly a decade ago to it. Culver was accepted a few weeks ago to take part in a trial of a new therapy to battle the disease, though in the end, the disease was too far advanced. Culver entered hospice last week. RIP Michael.
Posted by Jeffrey Schwartz on 01/21/2015 at 2:30 PM
However you feel about the emerging wearables market, many rightfully have found the notion of Google Glass over the top. Given its obvious potential to distract the wearer's attention, it arguably should be illegal to wear on the streets and certainly when driving.
Google's announcement yesterday that it will end the Google Glass experiment on Jan. 19 was inevitable since all experiments come to an end. On the other hand, Google has a history of labeling new products or services either tests or beta for an extended amount of time -- remember when Gmail was a beta product for more than five years despite the fact that millions were using it?
Certainly millions weren't using Google Glass, and given its $1,500 price tag, it's also not surprising that Jan. 19 is the last day Google will offer it. The company's announcement yesterday that it is moving Google Glass from the Google X research labs, where it was headed by Glass chief Ivy Ross, into the Nest unit run by Tony Fadell makes sense.
Nest, which Google acquired last year for $3.2 billion, manufactures and sells network-enabled smart thermostats. A few months after the acquisition, Nest also bought Dropcam, a provider of cameras that, like Nest's thermostats, have built-in Wi-Fi connectivity, for $555 million.
Some reports are cheering the demise of Google Glass, though the company seems to have future plans for it. Hopefully the Nest division will focus Google Glass on practical uses: vertical and specialty functions that can give medical practitioners and all kinds of field workers a tool to do useful things they can't do today.
Posted by Jeffrey Schwartz on 01/16/2015 at 12:32 PM
It appears President Obama's forthcoming legislative proposal to crack down on cybercrime could impose additional liabilities on IT pros, with potential penalties for failing to put in place proper policies and auditing practices, and for failing to report breaches.
The President this week discussed his plans to propose new legislation aimed at stiffening the penalties for all forms of cybercrime that put the nation's critical information infrastructure, as well as individual privacy, at risk, he said in a speech Tuesday. Obama will emphasize the legislative proposal to Congress in his annual State of the Union address.
"We want to be able to better prosecute those who are involved in cyberattacks, those who are involved in the sale of cyber weapons like botnets and spyware," Obama said in Tuesday's speech. "We want to be sure we can prosecute insiders who steal corporate secrets or individuals' private information. We want to expand the authority of courts to shut down botnets and other malware. The bottom line: we want cyber criminals to feel the full force of American justice because they are doing as much if not more these days as folks who are involved in conventional crime."
The White House also announced it will host a cybersecurity and consumer protection summit at Stanford University on Feb. 13, which will include speeches, panel discussions and a number of topic-specific workshops. Stanford said it is still finalizing details of the summit.
In addition to calling for better information sharing, the legislation will call for compliance with "certain privacy restrictions such as removing unnecessary personal information and taking measures to protect personal information that must be shared in order to qualify for liability protection." According to an outline on the White House Web site, the President will also propose giving law enforcement the tools they need to "investigate, disrupt and prosecute cybercrime."
The administration has also revised an existing proposal pertaining to security breach reporting "by simplifying and standardizing the existing patchwork of 46 state laws (plus the District of Columbia and several territories) that contain these requirements into one federal statute, and putting in place a single clear and timely notice requirement to ensure that companies notify their employees and customers about security breaches."
Over the next five years, the Department of Energy will also provide $25 million in grants to fund the training of cybersecurity professionals. The move, of course, comes amidst growing concerns about high-profile breaches over the past year including Target, Home Depot and most recently Sony, among others.
Yet the President is sure to face a battle, especially as it relates to information sharing, where the IT industry is fighting to ensure customer privacy and civil rights. For its part, Microsoft has led that fight in its battle to protect data residing on servers in Dublin, despite last year's court order mandating the release of that information. The Electronic Frontier Foundation, the non-profit organization focused on protecting civil liberties, swiftly denounced the President's proposal.
"President Obama's cybersecurity legislative proposal recycles old ideas that should remain where they've been since May 2011: on the shelf," according to a statement it released following Obama's proposal. "Introducing information sharing proposals with broad liability protections, increasing penalties under the already draconian Computer Fraud and Abuse Act, and potentially decreasing the protections granted to consumers under state data breach law are both unnecessary and unwelcome."
But the White House isn't alone in its effort to crack down on cybercrime. New York State Attorney General Eric Schneiderman yesterday said he plans to propose legislation that would require companies to inform customers and employees following any type of cyberattack or breach. The legislation would also broaden the scope of data companies would be required to protect, impose tighter technical and physical security protection and offer a safe harbor for organizations meeting certain standards, according to a statement released by the AG's office. "With some of the largest-ever data breaches occurring in just the last year, it's long past time we updated our data security laws and expanded protections for consumers," Schneiderman said.
While it's good that cybercriminals will face harsher penalties for their crimes -- and they should -- it's not likely to thwart those determined to inflict the most harm. Still, no one wants to be the next Target or Sony. As the content of this new legislation is debated, it also puts enterprises on notice that they will need to take measures to protect their critical data -- for their benefit and for everyone else's.
Posted by Jeffrey Schwartz on 01/15/2015 at 9:48 AM
Facebook apparently does intend to enter the enterprise social networking market with its own offering targeted at business users. The new Facebook at Work will let Facebook users establish work accounts that are separate from their personal accounts.
Rumors that Facebook was developing a business network first came to light in November. News that the Facebook at Work pilot would launch today surfaced this morning in a report by Recode. A Facebook spokeswoman confirmed that the company has launched the pilot with some undisclosed participants testing the new service.
"We're not disclosing the handful of companies in the pilot, since it's early and we're still testing," the spokeswoman said in response to an e-mail inquiry. I was able to download the new Facebook at Work app on my iPhone but when searching for it with my iPad and Windows 8.1 PC, the app didn't appear in their respective app stores (as of Wednesday afternoon). The Facebook at Work FAQ indicated that it's also available in the Google Play app store.
"With a Facebook at Work account, you can use Facebook tools to interact with coworkers," according to the FAQ. "Things you share using your work account will only be visible to other people at your company. To set up an account, your company must be using Facebook at Work." The current app allows users to request more information on how employers can establish accounts.
What effect a full-blown launch of Facebook at Work might have on incumbent enterprise social network providers such as Microsoft's Yammer, Salesforce.com's Chatter and Jive remains to be seen. But as SharePoint and Yammer expert Christian Buckley of GTConsult said back in November, "they will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."
Do you see your organization using Facebook at Work?
Posted by Jeffrey Schwartz on 01/14/2015 at 12:44 PM
Microsoft is making a big splash at this year's annual National Retail Federation (NRF) show in New York. The company is showcasing a number of major brand name chains that have kicked off efforts to improve their in-store experiences by using Azure, predictive analytics and new ways of interacting using apps delivered on mobile devices and kiosks.
While Microsoft emphasized at last year's NRF show that many of its customers were rolling out mobile devices for their employees, the apps that various retailers and restaurant chains are rolling out this year use Microsoft Azure in a big way. A number of big chains, including GameStop and McDonald's, are deploying applications built on Azure Machine Learning, Microsoft's predictive analytics tool rolled out last year.
Usage of Azure by retailers has grown exponentially in the past year, Tracy Issel, general manager of Microsoft's retail sector, said in an interview. "It used to be [that] we talked with people about going to the cloud and the perceived risk and their concern about scalability," Issel said. "I haven't had one of those conversations in a long time. Now it's 'what do I move first and when do I do it?' Many are moving to Office 365 and Azure simultaneously."
In a roundtable discussion today, Issel introduced four customers that are in the midst of major new efforts using Azure and/or Windows 8.1-based tablets and kiosks. Here's a brief synopsis of their efforts:
GameStop: Jeff Donaldson, senior vice president of GameStop Technology Institute, outlined a number of initiatives that encourage customers to use the retailer's mobile app in ways that let store employees engage with them when they visit. The app draws on a variety of analytics tools, including Hewlett-Packard's Vertica, SAS for statistical analysis and Azure Machine Learning, to inform a sales rep of past interactions when a customer comes into a store. "When they come into the store, we want to make sure the employees know about the messages we sent to customers so they better understand the intent of the visit," Donaldson said. Another major effort calls for delivering Ultra HD content into each of its 6,400 stores using Azure Media Services.
CKE Restaurant Holdings, aka Carl's Jr. and Hardee's: The popular fast-food chain has concluded that millennials would much rather order their food from a kiosk than from a person, said Thomas Lindblom, senior vice president and chief technology officer. As such, Hardee's is rolling out kiosks that let customers choose and customize their burgers; the app is also designed to upsell. Lindblom is using off-the-shelf 24-inch Dell touch-based PCs to deliver the highly visual application. Lindblom said CKE is "a significant user of Azure" for a number of functions, including storage and disaster recovery. CKE has also rolled out Office 365.
TGI Fridays: Looking to modernize the dining experience, the chain is equipping waiters and waitresses with eight-inch off-the-shelf Windows tablets running apps developed by Micros (now a part of Oracle). The point-of-sale solution is designed to track customer preferences through loyalty cards. "We are cutting training times [and] we are able to deliver this digital touch point in the hands of our servers as they serve their guests," said CIO Tripp Sessions.
McDonald's: Microsoft partner VMob has rolled out an Azure-based application at McDonald's locations in Europe that tracks customer preferences using information gathered from their purchasing patterns. VMob has also started rolling it out at locations in Japan. VMob founder and CEO Scott Bradley, who was demonstrating the solution in Microsoft's booth, indicated he's still working on getting the app into United States locations, but he implied that may take some time and said it's not a done deal. Nevertheless, he said he believes McDonald's eventually will roll it out in the U.S.
Posted by Jeffrey Schwartz on 01/13/2015 at 1:48 PM
At last week's Consumer Electronics Show in Las Vegas, there were robots, smartwatches, driverless cars, ultra-high-definition TVs and home automation systems. Even the traditional PC desktop display got a facelift.
Hewlett Packard was among a number of suppliers showcasing new curved desktop displays, designed to provide a more "immersive" experience, as Ann Lai, director of commercial displays at HP, put it in a briefing prior to the show.
"With a curved desktop display, you're really sitting right in front of it, so the curve is wrapping around you and providing you a very immersive experience that also makes it easier to read and more comfortable to use for longer periods of time," Lai said. "As displays have gotten larger, we've noticed the edges can be harder to read, especially when you're sitting close to it. With a curved display, you're going to have much more comfortable peripheral viewing as well as a much more immersive experience as you're using your display."
I'm reserving judgment as I've never tried one, though my first reaction was that these would hold more cosmetic appeal for an executive's office than practical value in making workers more productive. HP's new Pavilion 27 CM Elite display and Elite display S273 are both 27-inch curved displays priced at $399, a slight premium over displays without a curve.
If you were looking for a new desktop display, would one with a curve be on your checklist?
Posted by Jeffrey Schwartz on 01/12/2015 at 3:32 PM
When Microsoft released Windows Server 2012 R2 back in the fall of 2013, one of the many features we pointed out at the time was "Workplace Join," which is designed to let organizations give single sign-on capability to their bring-your-own-device (BYOD) employees -- or to anything not designed to join an Active Directory domain. Simply put, it lets you register a non-domain-joined device running Windows 8.1, iOS or Android with Active Directory.
Microsoft was especially happy to tout Workplace Join when it launched its Windows RT-based Surface 2 back in September 2013. In an interview at the time, Surface Director of Marketing Cyril Belikoff talked up the Workplace Join capability with me. "Workplace Joins are the access components of a directory service that allows a user to use their ID and password to access their corporate network documents and shares in a secure way," Belikoff said. "It's not a fully domained device but you get the administration of mobile device management and get the access component." Last year, Microsoft added support for Windows 7-based systems as well.
Workplace Join does require Active Directory Federation Services, Active Directory on premises and the Device Registration Service, all part of the "Federation Services Role on Windows Server 2012 R2," as described in the TechNet library.
I've talked to a variety of mobile device management vendors and suppliers of Active Directory management and auditing tools, and I've heard various views: some say customers (or prospective ones) are questioning how these tools will support Workplace Join, while others recommend using the device enrollment features in their own wares.
If Microsoft boosts the functionality of Workplace Join in the forthcoming Windows 10 operating system, that could help build its popularity.
Please share your experience or views on Workplace Join. Is it suitable for your BYOD authentication requirements? Drop me a line at email@example.com.
Posted by Jeffrey Schwartz on 01/08/2015 at 2:31 PM
IT pros apparently plan to give Windows 10 a warmer welcome than they gave Windows 8 when it arrived, according to an online survey of Redmond magazine readers conducted during December and early this month. A respectable 41 percent said they plan to deploy PCs with Windows 10 within a year after it ships and 30 percent will go with Windows 8.1, the survey shows.
To put that in perspective, nearly two years ago only 18 percent of Redmond magazine readers said they planned to deploy Windows 8 in a survey fielded six months after that operating system shipped. The 41 percent now saying they will deploy Windows 10 is all the more respectable given that Windows 10 ranked second when respondents were asked which of Microsoft's "newer" offerings they plan to deploy this year. Topping the list, to no surprise, was Office 365, which nearly 46 percent say they plan to deploy this year. Many respondents also plan to deploy Lync, Azure IaaS and Azure Active Directory this year.
To be sure, in a different question where Windows 10 was not an option (since it's still in Technical Preview), respondents overwhelmingly see Windows 7 as their client platform of choice to replace aging PCs, though not as many as in previous years, which is to be expected. Here's the breakdown of PC replacement plans:
- Windows 7: 57%
- Windows 8.1: 30%
- Virtual desktop/thin client: 6%
- Whatever employee requests: 1%
- Linux: 1%
It stands to reason that those that have started to mix Windows 8.1 devices into their shops will continue doing so. Anecdotally, though, I'm hearing many are awaiting the arrival of Windows 10 before making any firm commitments.
While it's hard to predict how IT decision makers ultimately will choose to replace aging desktops and portable PCs, it appears both Windows 8.1 and Windows 10 for now are the likely favorites among this audience in the coming year.
Drop me a line if you want to share your top IT priorities for 2015. I'm at firstname.lastname@example.org.
Posted by Jeffrey Schwartz on 01/07/2015 at 1:02 PM
As the annual Consumer Electronics Show kicks off today in Las Vegas, you can expect to hear lots of buzz about driverless cars, home automation systems, the so-called "Internet of Things" and of course wearable computing devices (including smartwatches and fitness bands).
Having spent most of December using the new Microsoft Band, as I reported last month, I found it has some nice features but is still buggy and, in my opinion, not worth the steep $199 price tag. When I returned my Microsoft Band, the clerk asked why. I mentioned the buggy Bluetooth synchronization with iOS, which she admitted is a common problem with the Microsoft Band. It was also a problem CNBC On-Air Editor Jon Fortt emphasized while interviewing Matt Barlow, Microsoft's general manager of new devices.
"I'm sorry to hear about the challenges you're running into, but a bunch of other people using those devices are having a great time being fit with Microsoft Band," Barlow responded. Not letting Barlow off the hook, Fortt told Barlow that Microsoft has already acknowledged the Bluetooth connectivity issues with iOS. "With any types of new product rollout, you're going to have updates that need to occur with software," Barlow responded. "We're updating software and we're updating usability, so I'm definitely convinced that [we're] seeing people using the Microsoft Band with all phone types without any issues moving forward."
Barlow went on to tout the unique 24-hour heart-tracking capability of the Microsoft Band, along with its on-board GPS, guided workouts and e-mail, text and Facebook integration. "People are really looking for value, and when I think about what we have with the Microsoft Band ... at a $199 price point, [it] is certainly magical," he argued.
Clearly it is early days for wearable devices and it remains to be seen if they will take off. For its part, Microsoft has only offered its band through its retail stores, further limiting its presence. One could argue that many are waiting for the Apple Watch, due out this quarter, but at a starting price of $349, it's not likely to fly off the shelves either. Results of a survey by Piper Jaffray confirmed that.
Not that its earlier surveys showed pent-up demand either, but now only 7 percent of 968 iPhone users surveyed said they intend to purchase an Apple Watch, down from 8 percent back in September when it was introduced and 10 percent in September 2013.
"We believe that the muted response to the Watch is due to consumer questions including what is the killer feature of the watch?," wrote Gene Munster, senior analyst and known Apple bull, in a Dec. 21 research note. People also want to know, "what applications will be available for the watch? We believe that as we get closer to launch [this] year, Apple will answer many of these questions and demand will increase; however, we still believe expectations for the first year of the watch should remain conservative."
Do you use a wearable such as the Apple Watch, the Microsoft Band, or one of the plethora of similar devices from other players?
Posted by Jeffrey Schwartz on 01/05/2015 at 12:25 PM
Microsoft has taken a beating from critics over this month's security patch; the company initially suggested that Windows 10 Technical Preview testers might need to uninstall Office before it came up with a less invasive workaround. Despite that and numerous other frustrations with the Windows 10 Technical Preview, Microsoft reported that 1.5 million testers have their hands on a preview version of Windows 10, and nearly a third of them are "highly active." The company also claims that more people are testing it than any previous beta release of Windows.
Apologizing for not having a major new build this month, Gabriel Aul, a data and fundamentals team lead at Microsoft's Operating Systems Group, promised it would be worth the wait. "We're really focused on making the next build something that we hope you'll think is awesome," Aul wrote in a Windows Insider blog post Wednesday. "In fact, just so that we have a *daily* reminder to ourselves that we want this build to be great, we even named our build branch FBL_AWESOME. Yeah, it's a bit corny, but trust me that every Dev that checks in their code and sees that branch name gets an immediate reminder of our goal."
Microsoft recently said it will reveal what's in store in the next build on January 21 and Aul indicated it would have some substantive new features. Though Microsoft didn't release a major new build in December, Aul pointed out the company has issued numerous bug fixes as a result of feedback from the 450,000 active testers. Given the poor reception for Windows 8.x, the record number of testers of the Windows 10 Technical Preview is an encouraging sign.
While the large participation doesn't guarantee Windows 10 will be a hit, a sparse number of testers would obviously lower the odds of success. "That hardcore usage will help us fix all the rough edges and bugs," Aul said, noting his favorite was a "very rare" instance when the OneDrive icon in File Explorer could be replaced by an Outlook icon. So far, he noted, testers have helped discover 1,300 bugs. Aul said while most are minor bugs, Microsoft will implement UX changes based on the feedback as well. Many will be small changes and others will be major.
What's on your wish list for the next build and where does the final release of Windows 10 fit in your plans for 2015?
Posted by Jeffrey Schwartz on 12/19/2014 at 10:54 AM
It's been a tough few months for IBM. The company has seen its shares tumble in 2014 amid weak earnings. But looking to show it may be down but not out, Big Blue said it has picked up the pace of building out its cloud footprint after getting off to a slow start several years ago.
To close the year, the company yesterday said it has added 12 new datacenters to the IBM Cloud, including facilities in Frankfurt, Mexico City and Tokyo, plus nine locations hosted through colocation provider Equinix in Australia, France, Japan, Singapore, the Netherlands and the United States.
The IBM Cloud now has datacenters in 48 locations around the world. That's double the number it had last year thanks to its promise to invest $1.2 billion to expand its cloud network, which kicked into high gear with last year's acquisition of SoftLayer. On top of that, IBM invested $1 billion to build its BlueMix platform-as-a-service technology, designed for Web developers to build hybrid cloud applications.
Enabling that expansion, IBM made aggressive moves including its deal with Equinix to connect to its datacenters using the large colocation provider's Cloud Exchange network infrastructure. IBM also inked partnerships with AT&T, SAP and Microsoft. With its recently announced Microsoft partnership, applications designed for Azure will work on the IBM Cloud (and vice versa). As IBM and Microsoft compete with each other, Amazon and numerous other players, the partnership with Microsoft promises to benefit both companies and their customers.
As a result of its aggressive expansion this year, IBM says it, Microsoft and Amazon have the largest enterprise cloud infrastructure and platform offerings. Nevertheless, few would dispute Amazon remains the largest cloud provider.
In its announcement yesterday, IBM also said it recently inked more than $4 billion in long-term enterprise cloud deals with Lufthansa, WPP, Thomson Reuters and ABN Amro, and that its customer base has doubled in the past year to more than 20,000. While many are traditional Big Blue shops, IBM says thousands are new companies, including startups. IBM said its cloud revenues are growing at a 50 percent rate and on pace for $7 billion in 2015.
Posted by Jeffrey Schwartz on 12/18/2014 at 10:53 AM
Like many cloud service providers, Microsoft has identified disaster recovery as a key driver for its hybrid infrastructure-as-a-service (IaaS) offering. This year the company delivered a critical component of its disaster recovery as a service (DRaaS) strategy with Azure Site Recovery.
If you saw Brien Posey's First Look at Azure Site Recovery, you may have quickly lost interest if you're not a Microsoft System Center user. That's because Azure Site Recovery required System Center Virtual Machine Manager. But with last week's Microsoft Azure release upgrade, the company lifted the SCVMM limitation.
The new Azure Site Recovery release allows customers to replicate and recover virtual machines using Microsoft Azure without SCVMM. "If you're protecting fewer VMs or using other management tools, you now have the option of protecting your Hyper-V VMs in Azure without using System Center Virtual Machine Manager," wrote Vibhor Kapoor, director of marketing for Microsoft Azure, in a blog post outlining the company's cloud service upgrades.
Making Azure Site Recovery available without SCVMM brings DRaaS to branch offices and smaller organizations that can't afford Microsoft's systems management platform or simply prefer other tools, explained Scott Guthrie, executive vice president of Microsoft's enterprise and cloud business, in a blog post. "Today's new support enables consistent replication, protection and recovery of Virtual Machines directly in Microsoft Azure. With this new support we have extended the Azure Site Recovery service to become a simple, reliable and cost effective DR Solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure without having to deploy a System Center Virtual Machine Manager on your primary site."
Guthrie pointed out that Azure Site Recovery builds upon Microsoft's Hyper-V Replica technology built into Windows Server 2012 R2 and Microsoft Azure "to provide remote health monitoring, no-impact recovery plan testing and single click orchestrated recovery -- all of this backed by an SLA that is enterprise-grade." Since organizations may have different uses for Azure Site Recovery, Guthrie underscored the One-Click Orchestration using Recovery Plans option, which provides various Recovery Time Objectives depending on the use case. For example, using Azure Site Recovery for test or planned failovers typically requires a different RTO than using it for unplanned disaster recovery failovers.
In addition to Hyper-V Replica in Windows Server 2012 R2, Azure Site Recovery can use Microsoft's SQL Server AlwaysOn feature, and it integrates with SAN replication infrastructure from NetApp, Hewlett-Packard and EMC. Also, according to a comment by Microsoft's Roan Daley in our First Look, Azure Site Recovery protects VMware workloads across VMware hosts using its new InMage option. Acquired back in July, InMage Scout is an on-premises appliance that offers continuous real-time data capture, simultaneously performing local backups or remote replication via a single data stream. Microsoft is licensing Azure Site Recovery with the Scout technology on a per-virtual or per-physical instance basis.
Are you using Microsoft's Azure Site Recovery, planning to do so or are you looking at the various third party alternatives as cloud-based DRaaS becomes a more viable data protection alternative?
Posted by Jeffrey Schwartz on 12/17/2014 at 12:08 PM
Salesforce.com today launched a connector that aims to bridge its cloud-based CRM portfolio of services with enterprise file repositories. The new Salesforce Files Connect will let organizations centralize their customer relationship management content with file stores including SharePoint and OneDrive, with a connector to Google Drive coming in a few months.
The release of the connector to SharePoint and OneDrive was promised back in late May when both Salesforce.com and Microsoft announced a partnership to integrate their respective offerings. While the two companies have a longstanding rivalry, they also share significant overlapping customer bases. The companies at the time said they would enable OneDrive for Business and SharePoint Online as integrated storage options for the Salesforce platform.
In today's announcement, Salesforce claims it's the first to create a repository that natively integrates CRM content and files among popular enterprise file stores. Salesforce.com said it provides a simple method of browsing, searching and sharing files located in various repositories.
Salesforce.com described two simple use cases. One would enable a sales rep to attach a presentation stored on OneDrive for Business to a sales lead in the Salesforce CRM app. The other would allow a service representative to pull FAQ content from OneDrive for Business while working in the Salesforce Service Cloud app.
The connector supports federated search to query repositories simultaneously from any device and lets users attach files to social feeds, groups or records, enabling them to find contextually relevant information in discussions running in Salesforce Chatter. The tool is also designed to enforce existing file permissions.
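The announcement doesn't detail how the simultaneous queries are executed, but a federated search of this kind conceptually fans a single query out to each connected repository in parallel and merges the hits. Here's a minimal Python sketch under that assumption; the adapter functions and file names below are invented for illustration and are not part of the Files Connect API:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub adapters standing in for real repository connectors (hypothetical);
# each returns the file records in its store that match the query.
def search_sharepoint(query):
    return [{"source": "SharePoint", "name": "Q4 deck.pptx"}] if "deck" in query else []

def search_onedrive(query):
    return [{"source": "OneDrive", "name": "Sales deck.pptx"}] if "deck" in query else []

def federated_search(query, adapters):
    """Send the same query to every repository in parallel and merge the results."""
    with ThreadPoolExecutor(max_workers=len(adapters)) as pool:
        result_lists = pool.map(lambda search: search(query), adapters)
    return [hit for hits in result_lists for hit in hits]

hits = federated_search("deck", [search_sharepoint, search_onedrive])
print([h["source"] for h in hits])
```

In the real service, each adapter would call the repository's own search API with the user's credentials so that, as Salesforce.com says, existing file permissions are enforced -- a step this sketch omits.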
For customers and third-party software providers wanting to embed file sharing into their applications, Salesforce.com also is offering the Salesforce Files Connect API.
Posted by Jeffrey Schwartz on 12/17/2014 at 12:26 PM
Microsoft last month entered the wearables market with the Microsoft Band, which, paired with the new Microsoft Health Web site and app for the wrist band, is designed to track your physical activities and bring some productivity features to your wrist.
The Microsoft Band, in my opinion, does a lot of interesting things, though it doesn't really excel at any of them at this point. Among the productivity features are alerts that let you glance at the first sentence or two of e-mail messages, texts, Facebook posts and Facebook Messenger chats; notifications of phone calls and voicemails; your schedule; stock prices; and an alarm clock that vibrates gradually for deep sleepers who don't like to jump out of bed.
Then there's the health component that has a pedometer to track your steps, a monitor for runners as well as one to track general workouts. It also tracks your sleep including your average heart rate and how often you supposedly woke up. You can synchronize whatever physical activity it monitors with Microsoft Health, an app that runs on any iOS, Android and Windows Phone device. If you use it with Windows Phone, you get the added benefit of using Microsoft's Cortana, the digital assistant that responds to spoken commands. You can also look at reports on the Microsoft Health Web site.
My personal favorite: the Starbucks app, which displays your account's scan image for the barista to scan when you make a purchase. Most baristas seeing it for the first time responded with awe, with one saying that "this is the wave of the future."
Though I've been skeptical about wearables like this and others like it from Fitbit, Samsung, Garmin, Nike, Sony and dozens of other providers, it's clearly a growing market and may very well be the wave of the future -- or at least a wave of the future. Market researcher Statista forecasts that the market for these wearables will be close to $5.2 billion this year, more than double last year's figure. In 2015, sales of these gadgets will hit $7.1 billion, and by 2018 they will reach $12.6 billion.
Gartner last month reported that while smart wristbands are poised for growth, its latest survey shows at least half of those surveyed are considering smartwatches. It actually sees the smartwatch market growing from 18 million units this year to 21 million in 2015, while purchases of wristbands will drop from 20 million to 17 million units. Certainly the release of the Apple Watch, despite its hefty starting price of $349, will likely fuel that market, though I already questioned how much demand we'll see for it.
I haven't tested other devices so it's hard to say how the Microsoft Band rates compared to them. But I find the notion of having information on my wrist more compelling than I had thought. However, the performance of my Microsoft Band has been flaky. I've encountered synchronization problems that have required me to uninstall and reinstall the Microsoft Health app on my iPhone on a number of occasions. It has presented realistic heart rates while I'm at the gym, only to suddenly give implausible numbers. When I click on the e-mail button it often says I have nothing new, and even when I can read my messages, they are cryptic and don't always indicate the sender.
I like that the Microsoft Band does synchronize with some other health apps, such as MyFitnessPal, which I use to track my meals these days. By importing that data, it provides more relevant info that I'd otherwise have to figure out and enter manually. The problem is, I don't believe I could have possibly burned 2,609 calories from a recent visit to the gym, though it would be nice if that was indeed the case.
That's why after spending several weeks with it, I can say I like the concept but it's not worth its $199 price tag unless money is no object to you. While I agree with my colleague Brien Posey that the Microsoft Band has some nice features, I think I'd wait for an improved version of the Microsoft Band and a richer Microsoft Health site before buying one of these (unless they become remarkably less expensive).
That stated, I hope Microsoft continues to enhance the Microsoft Band by adding more capacity and battery life to make it a more usable and comfortable device. If we all had accurate readings of our physical activities, maybe it would lead to healthier lifestyles.
Posted by Jeffrey Schwartz on 12/15/2014 at 7:28 AM
Once hailed as the future of in-vehicle communications and entertainment, a partnership between Ford and Microsoft has all but unraveled. Ford this week said it's replacing Microsoft Sync with BlackBerry's QNX software.
Ford yesterday announced its Sync 3 platform, which ushers in significant new features and will show up in 2016 model vehicles next year. Though Ford didn't officially announce it was walking away from Microsoft Sync in favor of BlackBerry QNX, The Seattle Times reported in February that the automaker was on the verge of making the switch. Raj Nair, Ford's CTO of global product development, confirmed in numerous reports yesterday that QNX is the new platform. Some 7 million Ford vehicles are reportedly equipped with Microsoft Sync, but the systems have consistently scored poorly in consumer satisfaction reports due to frequent malfunctions.
Swapping out Microsoft Sync for QNX would also yield cost savings, according to The Seattle Times, which noted that QNX also powers the in-vehicle navigation systems of Audis and BMWs. Apple and Google also have alliances with various car manufacturers. While BlackBerry smartphones may be rapidly disappearing, QNX has gained significant ground in the in-vehicle systems market. Microsoft Sync, based on Windows Embedded, is said to also run the vehicle entertainment systems of some BMW, Kia, Fiat and Nissan models. Ford and Microsoft announced their plans with great fanfare in 2007, rolling out models with the entertainment system as an option.
Microsoft Sync was initially designed to link iPods and Zune music players to entertainment systems, debuting just at the dawn of the smartphone age. At the time, Microsoft founder Bill Gates saw Microsoft Sync as another element of the company's "Windows Everywhere" effort. As we all know, much has changed since then.
If Microsoft has new plans for Sync, the next logical time to announce them would be at next month's annual Detroit Auto Show.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:28 AM
While IP address conflicts are as old as networks themselves, the growing number of employee-owned devices in the workplace is making them a more frequent problem for system administrators. Because PCs and devices now move among many networks, it's not uncommon for a device to still think it's linked to one network, causing an IP address conflict when it tries to connect to another.
SolarWinds is addressing that with its new IP Control Bundle, which identifies and resolves IP address conflicts. The bundle consists of the new SolarWinds IP Address Manager (IPAM) and the SolarWinds User Device Tracker (UDT). There are two parts to the IP address resolution process.
First, IPAM identifies the IP conflicts by subnet, provides a history of where the user and machine connected to the network, and identifies the switch and port on which that system connected. Next, UDT uses that information to disable the switch port; the administrator can then assign a new IP address and update any DNS entries as necessary before reconnecting the device.
IPAM and UDT are typically installed on a separate server, and when a problem arises an administrator can use the software to scan the network and IP address ranges. It also interrogates routers, switches and other network infrastructure to gather relevant troubleshooting information. Rather than using agents, it relies on standard protocols, notably SNMP.
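SolarWinds hasn't published IPAM's internals, but the core of agentless conflict detection can be sketched simply: poll the switches (over SNMP, for example) for which MAC address on which port claims which IP, then flag any IP claimed by more than one MAC. The following is a minimal illustration with hypothetical data and function names, not SolarWinds code:

```python
from collections import defaultdict

def find_ip_conflicts(observations):
    """Flag IP addresses claimed by more than one MAC address.

    observations: iterable of (switch_port, mac_address, ip_address)
    tuples, as might be gathered by polling switches over SNMP.
    Returns {ip: [(switch_port, mac), ...]} for conflicting IPs only.
    """
    claims = defaultdict(set)
    for port, mac, ip in observations:
        claims[ip].add((port, mac))
    return {ip: sorted(owners) for ip, owners in claims.items()
            if len(owners) > 1}

# Hypothetical polling data: two different MACs claim 10.0.0.5.
polled = [
    ("switch1/gi0/1", "aa:bb:cc:00:00:01", "10.0.0.5"),
    ("switch1/gi0/2", "aa:bb:cc:00:00:02", "10.0.0.6"),
    ("switch2/gi0/7", "aa:bb:cc:00:00:03", "10.0.0.5"),
]
conflicts = find_ip_conflicts(polled)
print(conflicts)  # 10.0.0.5 is claimed on two ports
```

The per-port detail is what makes remediation possible: knowing which switch port the offending MAC sits on is exactly what UDT needs in order to shut that port down.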
In addition to troubleshooting and remediating client-based devices, the SolarWinds package can handle IP address conflicts occurring on servers and virtual machines, says Chris LaPoint, vice president of product management at SolarWinds.
"If I'm the owner of that critical app trying to figure out what's going on, I can go to this tool and see that Joe over in another part of the datacenter has spun up a new VM and that's what's creating issues with my application," LaPoint explains. "So now I can probably notify Joe and tell him I'm kicking him off the network because it's actually affecting the availability of a customer-facing application that we need to have running."
Pricing for IPAM starts at $1,995 and UDT begins at $1,795.
Separately, SolarWinds this week said its SolarWinds Web Help Desk now works with DameWare Remote Support. SolarWinds acquired DameWare in 2011 but it operates as a separate business unit. The products are collectively used by 25,000 customers and the combined solution will allow help desk technicians to connect with remote devices or servers, collect support data including chat transcripts and screen shots and generate reports.
The SolarWinds Web Help Desk offering provides automated ticketing, SLA alerts, asset management and reporting while DameWare Remote Support provides remote access to client devices and servers, allowing administrators to take control of those systems and manage multiple Active Directory Domains as well as resetting passwords.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:27 AM
During the Thanksgiving break, I had a number of encounters with PCs in public places still sporting the Windows XP logo, and it got under my skin. Among them was a computer near the checkout area at Home Depot. And within an hour I spotted another on a counter right next to the teller stations at my local Bank of America branch.
Given that Windows XP systems are no longer patched by Microsoft, the sight of them is becoming as uncomfortable as standing near someone with a nasty cold who coughs without covering his or her mouth. Speaking of spreading viruses, I've even been to two different doctors' offices in recent months that were running Windows XP-based PCs -- one used to gather patient information and the other to schedule appointments. In both cases, when I asked if they planned to upgrade those systems, I got the equivalent of a blank stare. I don't think they had any idea what I was talking about.
Nevertheless, seeing a Windows XP PC just after using the self-checkout terminal at Home Depot was especially unsettling given the retailer's massive breach, in which e-mail addresses were reported stolen last month. Home Depot Spokeswoman Meghan Basinger said: "Thanks for reaching out, but this isn't detail we'd discuss."
Now the Bank of America situation is a bit different. The day after the Thanksgiving holiday weekend, InformationWeek announced its IT chief of the year: Cathy Bessant, who heads Bank of America's 100,000-person Global Technology & Operations organization. That's a lot of IT pros and developers.
Bank of America has long appeared to have a strong IT organization, judging by how often the company is first to market with new e-banking features and mobile apps. The bank's systems tend to be reliable, and it hasn't had any major breaches that I can recall. Also, having worked in the past for InformationWeek Editor-in-Chief Rob Preston, who interviewed Bessant and reported on the bank's ambitious IT efforts, I have no doubt the choice was a well-vetted one.
So when he noted, among the bank's many milestones this year, that its IT team completed the largest Windows 7 migration to date (300,000 PCs), I felt compelled to check in with Bank of America Spokesman Mark Pipitone. Perhaps my inquiry sounded petty after the bank had updated so many systems, but I was curious how it was dealing with these stray Windows XP machines. Was it paying the $200-per-system premium support fee, or was the PC just front-ending an embedded system? (Microsoft does still support Windows XP Embedded.) I sent a picture of the system to Pipitone.
"Not knowing exactly what device you took a picture of, the best the team can tell is that it's an excepted device (there are some across our footprint), or it's a device that's powered on but not being used on a regular basis," Pipitone responded.
I made a trip to the branch and asked what the XP machine was used for. A rep there told me that it was used for those needing to access their safe deposit boxes. I informed Pipitone of that, though he declined to comment further. Maybe the lone PC I saw isn't connected to the Internet or it is otherwise protected. But the mere public display of Windows XP machines in so many different places for many tech-aware people is still disconcerting.
I laud Bank of America and others that have undertaken the painful work of modernizing their PC environments. At the same time, I look forward to the day when I don't have to see the Windows XP logo when I walk into a place of business, whether it's a doctor's office, a local restaurant, a major retailer or a bank. Windows XP was a great operating system when it came out, and I know some defenders of the legacy OS will be outraged by my stance -- many of them angered by Microsoft's decision to stop supporting it. But Windows XP machines are vulnerable unless they are not, and never will be, connected to a network.
There is some encouraging news. Waiting in my inbox on December 1, right after the holiday weekend, was a press release from StatCounter reporting that there are now more Windows 8.1 PCs out there than Windows XP PCs. According to the November report, 10.95 percent of systems are running Windows 8.1, while Windows XP still accounts for 10.67 percent. This marks the first time Windows 8.1 alone has surpassed Windows XP, according to its analysis. Back in August, the combination of Windows 8 and Windows 8.1 systems achieved that milestone, so it could be argued the latest report marks only a minor milestone.
Nevertheless, the stragglers will remain for some time, according to Sergio Galindo, general manager of GFI Software, a provider of Web monitoring and patch management software. "I'm aware of several companies that continue running large XP installations -- and even larger IT budgets -- that may have custom XP agreements," Galindo said. "Windows XP will continue to survive as long as it meets people's needs. To keep a network secure, IT admins and computer consultants can 'lock down' the accounts on the XP machines. I strongly advise that machines running XP be allowed only minimal capabilities and have no admin access. I also favor using more secure browsers such as Chrome versus Internet Explorer in these cases. Also, IT admins may want to shut off some of the more common attack vectors such as Adobe Flash. In the case of XP, less (software) is more (secure)."
By the way, just a friendly reminder: there are just over 200 days left before Microsoft stops supporting Windows Server 2003. You'll be hearing a lot about that from us; Redmond magazine's Greg Shields primed the pump last month.
Posted by Jeffrey Schwartz on 12/10/2014 at 12:54 PM
At Microsoft's annual shareholder meeting Wednesday in Bellevue, Wash., CEO Satya Nadella cashed in big. Shareholders approved his proposed $84 million pay package, a reward for a job well done. The package, which includes $59.2 million in stock options and $13.5 million in retention pay, according to Bloomberg, has come under attack as excessive by Institutional Shareholder Services, an investor advisory organization.
Indeed, Nadella ranks among the most highly paid CEOs. According to this year's Wall Street Journal/Hay Group CEO Compensation report, which ranks the 300 largest companies by revenue, the median pay package was $11.4 million, with Oracle CEO Larry Ellison taking the top spot in 2013 by earning $76.9 million.
By that measure, Nadella isn't breaking any records. Oracle's share price rose nearly 29 percent, while Microsoft's share price has jumped 32 percent since Nadella took over in early February. Nevertheless, investor advocates have scrutinized CEO compensation in the wake of the financial crisis.
While Microsoft's prospects look better than they have in a long time, the package may look excessive to some. Others would argue Nadella has plenty of incentive to continue Microsoft's turnaround, which is still in its early stages and certainly not yet a sure thing, given the rapid changes in fortune that can take place in the IT industry.
Do you believe Nadella's compensation is excessive or is it fair?
Posted by Jeffrey Schwartz on 12/05/2014 at 12:09 PM
It was hard to ignore the hype over the Thanksgiving weekend's traditional Black Friday and Cyber Monday barrage of cut-rate deals, including this year's decision by quite a few retailers to open their doors earlier than ever. Many, including the Microsoft Store, opened as early as 6 p.m. on Thanksgiving Day, hoping to lure people away from their turkey dinners to get a jump on holiday shopping.
Content with spending Thanksgiving Day with my family and not a big fan of crowds anyway, I decided to stop by my local Staples at a normal hour on Friday morning. To my surprise, there were just a handful of people in the store. When I asked an employee why the store was so empty on Black Friday, she said the crowds were all there Thanksgiving night.
When I asked her how many people bolted early from their turkey dinners, she said there was a line of about 100 people outside the store prior to its 6 p.m. opening Thursday evening. Apparently a good chunk of them were waiting for the $99 Asus EeeBook X205TA, which normally sells for at least double that price. Truth be told, that's why I popped in, though I had anticipated the allotment would be sold out. I had already looked at the 11.6-inch Windows 8.1 notebook, which can also function as a tablet with its removable keyboard. It comes with an Intel Atom processor, 2GB of RAM and a 32GB SSD.
I asked her how many people in line were waiting for that device and she replied that more than half were. While many Windows 8.1 notebooks and tablets were on sale during the holiday rush, the two prominent $99 deals were the aforementioned Asus device and the HP Stream 7. The latter is a 7-inch Windows 8.1 tablet and it comes with a one-year Office 365 Personal subscription good for the tablet and one other PC. The discounted HP Stream 7 is only available today at the Microsoft Store, which is also offering up to $150 off on the most expensive Surface Pros with Intel Core i7 processors.
The HP Stream 7 is also powered by an Intel Atom processor and has a 32GB SSD, but only 1GB of RAM. While you shouldn't plan on doing much multitasking with this device, it's certainly a viable option if you want an ultra-portable tablet that can quickly access information and serve as an alternative to a Kindle Fire (the Kindle app is among the many apps now available in the Microsoft Store).
Given I already have a Dell Venue 8 Pro with similar specs and 2GB of RAM, the HP Stream 7 was of little interest to me, though it would make a good gift for someone at that price. Back at Staples, I asked the employee if there were any of the Asus convertibles left at the $99 price and to my surprise she said they were all out but I could order one with free delivery from the store's kiosk. It's slated to arrive today. Apparently you can still order one on this Cyber Monday on Staples' Web site (you can probably get a competitor to match the price).
Today the National Retail Federation released a report estimating that sales over the Thanksgiving weekend were down 11 percent overall, and there are a number of theories as to why. The drop suggests that all of those retailers who felt compelled to open their doors on Thanksgiving Day may want to rethink that strategy next year.
Posted by Jeffrey Schwartz on 12/01/2014 at 12:54 PM
Microsoft this week gave developers and IT pros a deep dive on major new features coming to Office 365, which the company has described as the fastest growing new product in its history. The demos, which include several APIs and SDKs aimed at driving existing SharePoint users to Office 365, gave a close look at how building and administering applications for collaboration is going to change dramatically for IT pros, developers and end users alike.
Because Microsoft has made clear that organizations running applications developed in native code for SharePoint won't be able to migrate them to Office 365, the company is trying to convince customers to plan for the eventual move using the company's new app model. Microsoft is betting by offering compelling new capabilities, which it describes as its "Office Everywhere" effort, that organizations will find making the move worthwhile.
The APIs and new Office 365 features demonstrated include the new My Apps user interface, which the company also calls the App Launcher, due out for preview imminently after what the company described as a brief delay. My Apps gives users a customizable interface to the applications they use, such as Word, Excel, PowerPoint, contacts, mail and files. They can also add other Microsoft services and, ultimately, those of third parties.
Jeremy Thake, a senior Microsoft product manager, demonstrated the new Office 365 platform and underlying API model Thursday at the Live! 360/SharePoint Live! conference in Orlando. Thake said the Microsoft Graph demo was the first given in the United States since the company unveiled it two weeks ago at TechEd Europe, where Microsoft also released the preview of the new Office 365 APIs.
"The Microsoft Graph is essentially allowing me to authenticate once and then go to every single endpoint across Microsoft. And not just Office but Dynamics to Azure and anything I've got running Windows, such as Live, Outlook.com and whatnot," Thake said during the demo, noting the plan is to tie it to third-party services that have registered with Microsoft Graph. "It's access to those things from that one endpoint. This is a really powerful thing that isn't out yet. It's in preview; it will be coming next year."
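In practice, the "one endpoint" Thake describes means every request carries the same OAuth bearer token against the same base URL, with only the resource path varying. A rough sketch (the token and paths here are placeholders, and since the unified API was still in preview, the exact URLs may differ):

```python
import urllib.request

# Unified endpoint (preview-era URL may differ).
GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def graph_request(path, token):
    """Build a request against the unified Graph endpoint.

    The same bearer token authorizes every resource path, so no
    per-service authentication is needed. (The request is only
    constructed here, not sent.)
    """
    return urllib.request.Request(
        GRAPH_BASE + path,
        headers={"Authorization": "Bearer " + token},
    )

token = "<access-token-from-azure-ad>"  # placeholder
me = graph_request("/me", token)             # user profile
mail = graph_request("/me/messages", token)  # mail, same token
print(me.full_url, mail.full_url)
```

The point of the sketch is the shape of the model rather than any specific call: one authentication, one host, many resources.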
Consultant Andrew Connell, organizer of the SharePoint Live! track at Live! 360, said the release of the APIs and the Microsoft Graph bode well for the future of Office 365 and SharePoint. "It opens so much more of the company data, not just so much more of our data that we're using in Microsoft services, from a uniform endpoint for other companies to interact with and provide additional value on," he said during the closing conference wrap-up panel. "That's going to be tremendous. That [Microsoft Graph] API is being pushed by the 365 group but it's a Microsoft thing -- it touches everything we do."
Thake demonstrated numerous other APIs, including a discovery service and the new Android and iOS SDKs, among other things. Major changes are coming to Office 365 in 2015, and they will have a big impact on the IT pros and developers who build for and manage it. It was a major topic at SharePoint Live!, and I'll be sharing the implications of what's in the pipeline in the coming weeks.
Posted by Jeffrey Schwartz on 11/21/2014 at 9:02 AM
While Microsoft this year has rolled out extensive additions to its data management portfolio, as well as business intelligence and analytics tools, SQL Server remains its core database platform. Still, Microsoft has unleashed quite a few new offerings that DBAs, developers and IT decision makers need to get their arms around.
"I think Microsoft needs to have the full stack to compete in the big data world," said Andrew Brust, research director at Gigaom Research. Brust on Tuesday gave the keynote address at SQL Server Live!, part of the Live! 360 conference taking place in Orlando, Fla., which, like Redmond, is produced by 1105 Media. Microsoft CEO Satya Nadella has talked of the data culture that's emerging, as noted in the Redmond magazine October cover story.
Brust pointed out that Microsoft has delivered some significant new tools over the past year, including Azure HDInsight, its Apache Hadoop-based cloud service for processing unstructured and semi-structured Big Data. Microsoft recently marked the one-year anniversary of Azure HDInsight with the preview of Azure Machine Learning, which adds predictive analytics to the platform.
"Since the summer, they've added half a dozen new data products, mostly in the cloud but they're significant nonetheless," Brust said in an interview, pointing to a variety of offerings ranging from Stream Analytics, the company's real-time event-processing engine, to Azure Data Factory, which lets customers provision, orchestrate and process on-premises data sources such as SQL Server alongside cloud sources including Azure SQL Database, blobs and tables. It also offers ETL as a service. Brust also pointed to the new Microsoft DocumentDB, the company's NoSQL entry, which natively supports JSON documents.
Microsoft's release of SQL Server 2014, which adds in-memory processing to its flagship database, takes aim at SAP's HANA. "Microsoft is going after it from the point of view that you can have in-memory and just stay in SQL Server instead of having to move to a specialized database," Brust said. "It's a version one, so I don't expect adoption to be huge, but it will be better in the next version. They are definitely still working on it. It's not just a one-off that they threw out there -- it's very strategic for them."
Posted by Jeffrey Schwartz on 11/19/2014 at 1:38 PM
Microsoft today said it has merged Windows code into Docker, allowing administrators to manage Docker containers running on Linux hosts from a Windows client. It's the latest move by Microsoft to jump on the Docker bandwagon, which began earlier this year with its support for Linux containers in the Azure public cloud and continued with last month's pact by the two companies to develop native Docker clients for Windows Server.
The company's contribution takes the form of a command-line interface (CLI) client that can be compiled and run on Windows. "Up 'til today you could only use a Linux-based client CLI to manage your Docker container deployments or use boot2docker to set up a virtualized development environment in a Windows client machine," wrote Khalid Mouss, a senior program manager for the Azure runtime, in a blog post.
"Today, with a Windows CLI you can manage your Docker hosts wherever they are directly from your Windows clients," Mouss added. The Docker client is in the official Docker GitHub repository. Those interested can follow its development under Pull Request #9113.
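The workflow Mouss describes boils down to pointing the Docker client at a remote daemon, typically via the `-H` flag or the `DOCKER_HOST` environment variable. A minimal sketch of how a management script on a single Windows client might target several Linux hosts (host names are hypothetical; 2376 is the conventional TLS port; the commands are built but not executed here):

```python
def docker_cmd(host, *args):
    """Build a docker CLI invocation targeting a remote daemon.

    host: a Linux Docker host, addressed as tcp://name:port. The
    returned list could be handed to subprocess.run(); it is only
    constructed here so the sketch runs without Docker installed.
    """
    return ["docker", "-H", "tcp://%s:2376" % host] + list(args)

# One client managing several Linux hosts centrally.
hosts = ["linux-host-1.example.com", "linux-host-2.example.com"]
for h in hosts:
    print(" ".join(docker_cmd(h, "ps", "--all")))
```

This is the centralized-management point Sanders and Mouss both make: the host-by-host SSH session is replaced by one client addressing each daemon over the wire.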
While noteworthy, this is not Docker containers coming to Windows -- it's a move to enable management of Linux-based Docker hosts from a Windows client, said Andrew Brust, research director at Gigaom Research, who happened to be sitting next to me at our Live! 360 conference in Orlando, Fla., when I saw the news on my phone. "This is simply a client that lets you manage Linux-based Docker containers," Brust said. "It's interesting but it's not a huge deal."
Furthermore, Mouss said that, on the heels of Microsoft open sourcing the .NET Framework last week, the company this week also released a Docker image for ASP.NET on Docker Hub, enabling developers to create ASP.NET-ready containers from that base image.
See this month's Redmond magazine cover story on Microsoft's move toward containers as potentially the next wave of infrastructure and application virtualization.
Posted by Jeffrey Schwartz on 11/18/2014 at 11:52 AM
When Amazon Web Services announced Aurora, its latest database offering, last week, the company put the IT industry on notice that it once again believes it can disrupt a key component of application infrastructure.
Amazon debuted Aurora at its annual AWS re:Invent customer and partner conference in Las Vegas. Amazon said the traditional SQL database for transaction-oriented applications, built to run on monolithic software and hardware, has reached its outer limits. Amazon Web Services' Andy Jassy said in the opening keynote address that the company has spent several years developing Aurora in secrecy.
Built on the premise that AWS' flagship self-managed services -- EC2, S3 and Virtual Private Cloud (VPC) -- are designed for scale-out, service-oriented and multi-tenant architectures, Aurora moves half of the database out of the application tier, said Anurag Gupta, general manager of Amazon Aurora, during the keynote.
"There's a brand new log structured storage system that's scale out, multi-tenant and optimized for database workloads, and it's integrated with a bunch of AWS services like S3," said Gupta, explaining Aurora is MySQL-compatible. Moreover, he added, those with MySQL-based apps can migrate them to Aurora with just several mouse clicks and ultimately see a fivefold performance gain.
"With Aurora you can run 6 million inserts per minute, or 30 million selects," Gupta said. "That's a lot faster than stock MySQL running on the largest instances from AWS, whether you're doing network IOs, local IOs or no IOs at all. But Aurora is also super durable. We replicate your data six ways across three availability zones and your data is automatically, incrementally, continuously backed up to S3, which as you know is designed for eleven nines durability."
Clearly Amazon is trying to grab workloads that organizations have built for MySQL but the company is also apparently targeting those that run on other SQL engines that it now hosts via its Relational Database Service (RDS) portfolio including Oracle, MySQL and Microsoft's SQL Server.
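Because Aurora is MySQL-compatible, the application side of such a migration is typically just a connection-endpoint change: the driver, the SQL and the schema stay the same. A hypothetical before-and-after sketch (all endpoint names and credentials invented for illustration):

```python
# Connection settings for a stock MySQL deployment (values hypothetical).
mysql_conn = {
    "host": "mysql.internal.example.com",
    "port": 3306,          # Aurora listens on the standard MySQL port
    "user": "appuser",
    "database": "orders",
}

# After migrating to Aurora: same driver, same SQL, same schema; only
# the host moves to the Aurora cluster's endpoint.
aurora_conn = dict(
    mysql_conn,
    host="mycluster.cluster-abc123.us-east-1.rds.amazonaws.com",
)

# Verify that nothing but the endpoint changed.
changed = {k for k in mysql_conn if mysql_conn[k] != aurora_conn[k]}
print(changed)  # only the host differs
```

That endpoint-only swap is what makes the "several mouse clicks" migration claim plausible, at least for applications that stick to standard MySQL features.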
Aurora automatically repairs failures in the background, recovering from crashes within seconds, Gupta added. It replicates six copies of data across three Availability Zones and backs up data continuously to S3. Customers can scale an Aurora database instance up to 32 virtual CPUs and 244GB of memory. Aurora replicas can span up to three Availability Zones, with storage starting at 10GB and growing as high as 64TB.
Gupta said the company is looking to price Aurora for wide adoption, with pricing starting at 29 cents per hour for a two-virtual-CPU, 15.25GB instance.
The preview is now available. Do you think Amazon Aurora will offer a viable alternative to SQL databases?
Posted by Jeffrey Schwartz on 11/17/2014 at 12:32 PM
Facebook is secretly developing a social network aimed at enterprise users, according to a report published in today's Financial Times. The report said Facebook at Work could threaten Microsoft's Yammer enterprise social network as well as LinkedIn and Google Drive.
At first glance, it's hard to see how Facebook at Work would challenge both Yammer and LinkedIn. Though both are social networks, they are used in different ways; granted, there's some overlap, but Yammer is a social network for a closed group of users. Facebook has apparently used Facebook at Work internally over the past year and has let other companies test it as well.
The report was otherwise quite vague, and I wonder whether the author even understands the differences among Yammer, LinkedIn and Google Drive. It's not unreasonable to think Facebook would want to offer a business social network similar to Yammer or Salesforce.com's Chatter. But as PC Magazine points out, many businesses might have issues with a service provided by Facebook.
That said, I reached out to SharePoint and Yammer expert Christian Buckley, who recently formed GTConsult, to get his take. Buckley said there's been buzz about Facebook's ambitions for some time but he's skeptical that Facebook could make a serious dent in the enterprise social networking market despite its dominance on the consumer side.
"Honestly I think they're a couple years behind in making any serious move in this space," Buckley said. "They will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."
Buckley also noted that Google, LinkedIn and Yammer have very different value propositions to enterprises. "Each have their own struggles," Buckley said. "LinkedIn may be displacing Yahoo Groups and other public chat forums, but my understanding is that they are having a difficult time translating that moderate growth into additional revenue beyond job postings. Yammer's difficulties may be a closer comparison and highlight Facebook's uphill battle to win over the enterprise by aligning ad hoc social collaboration capabilities with business processes. Microsoft has Yammer at the core of its inline social strategy, and like SharePoint, the individual Yammer brand will fade (in my view) as the core features are spread across the Office 365 platform. Instead of going to a defined Yammer location, the Yammer-like features will happen in association with your content, your e-mail, your CRM activities, and so forth."
What's your take on this latest rumor?
Posted by Jeffrey Schwartz on 11/17/2014 at 11:56 AM
When Microsoft CEO Satya Nadella last month said, "Microsoft loves Linux" and pointed to the fact that 20 percent of its Azure cloud is already running the popular open source platform, he apparently was getting ready to put his money where his mouth is.
At its Connect developer conference this week, Microsoft said it will open source its entire .NET Framework core and bring it to both Linux and the Apple Macintosh platform. It is the latest move by Microsoft to open up its proprietary .NET platform. Earlier this year, the company made ASP.NET and the C# compiler open source. This week the company released the .NET Core development stack and in the coming months, Microsoft will make the rest of .NET Core Runtime and .NET Core Framework open source.
Citing more than 1.8 billion .NET installations and over 7 million downloads of Visual Studio 2013 during the past year, Microsoft Developer Division Corporate Vice President S. Somasegar said in a blog post, "we are taking the next big step for the Microsoft developer platform, opening up access to .NET and Visual Studio to an even broader set of developers by beginning the process of open sourcing the full .NET server core stack and introducing a new free and fully-featured edition of Visual Studio." These were all once unthinkable moves.
Just how big a deal is this? Consider the reaction of Linux Foundation Executive Director Jim Zemlin: "These are huge moves for the company," he said in a blog post. "Microsoft is redefining itself in response to a world driven by open source software and collaborative development and is demonstrating its commitment to the developer in a variety of ways that include today's .NET news."
Zemlin lauded a number of Microsoft's open source overtures including its participation in the OpenDaylight SDN project, the AllSeen Alliance Internet of Things initiative and the Core Infrastructure Initiative.
For IT pros, the move is the latest affirmation of Microsoft's embrace of open source, and of Linux in particular. And while some believe Microsoft is also deemphasizing Windows, the company's plans to provide Docker containers in Windows Server suggest a dual-pronged strategy for datacenter and application infrastructure: bolster the Windows platform with core new capabilities while ensuring it ties into open source platforms and applications as well.
At the same time, it appears that Microsoft is seeking to ensure that its development environment and ecosystem remains relevant in the age of modern apps. Zemlin believes Microsoft has, in effect, seen the light. "We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products," he said. "However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source. Microsoft understands that today's computing markets have changed and companies cannot go it alone the way they once did."
Posted by Jeffrey Schwartz on 11/14/2014 at 11:11 AM
When Microsoft last month announced it has 100-plus partners adopting its burgeoning Cloud OS Network, which aims to provide Azure-compatible third party cloud services, it left out perhaps one of the biggest fishes it has landed: Rackspace.
The two companies are longtime partners, and as I recently reported, Rackspace has extended its Hyper-V-compatible offerings and dedicated Exchange, SharePoint and Lync services. But Rackspace also has a formidable cloud infrastructure as a service that competes with the Azure network. The news that Rackspace now will provide Azure-compatible cloud service, announced on Monday with Rackspace's third-quarter earnings report, signals a boost for both companies.
For Microsoft, it brings one of the world's largest public cloud and dedicated hosting providers into the Azure fold. Even if it's not all-in -- the core of Rackspace's business is still reserved for its own OpenStack-based infrastructure, a healthy VMware offering and the newly launched Google Apps practice -- Rackspace has a lot of Exchange and SharePoint hosting customers who may want to move to an Azure-like model but with the service level that the San Antonio, Texas-based company emphasizes.
"Those who are down in the managed 'colo' world, they don't want to be managing the infrastructure. They want us to do that," said Jeff DeVerter, general manager of Microsoft's Private Cloud business at Rackspace. "They're happy to let that go and get back into the business of running the applications that run that business."
Customers will be able to provision Azure private cloud instances in the Rackspace cloud and use the Windows Azure Pack to manage and view workloads. This is not a multitenant offering like Azure or similar infrastructure-as-a- service clouds, DeVerter pointed out. "These are truly private clouds from storage to compute to the networking layer and then the private cloud that gets deployed inside of their environment is dedicated to theirs. We deploy a private cloud into all of our datacenters [and] it puts the customers' cloud dual homing some of their management and reporting back to us so that we can manage hundreds and then thousands of our customers' clouds through one management cloud."
Microsoft first launched the Cloud OS Network nearly a year ago with just 25 partners. Now with more than 100, Marco Limena, Microsoft's vice president of Hosting Service Providers, claimed in a blog post late last month that there are in excess of 600 Cloud OS local datacenters in 100 companies serving 3.7 million customers. The company believes this network model will address the barriers among customers who have data sovereignty and other compliance requirements.
Among the members of the Cloud OS Network listed in an online directory are Bell Canada, CapGemini, Datapipe, Dimension Data and SherWeb. "Microsoft works closely with network members to enable best-practice solutions for hybrid cloud deployments including connections to the Microsoft Azure global cloud," Limena said.
Asked if it's in the works for Rackspace to enable Cloud OS private cloud customers to burst workloads to the Microsoft Azure service, DeVerter said: "Those are active conversations today that we're having internally and having with Microsoft. But right now our focus is around making that private cloud run the best it can at Rackspace."
Posted by Jeffrey Schwartz on 11/12/2014 at 1:01 PM
If the Microsoft Azure public cloud is going to be the centerpiece of its infrastructure offering, the company needs to bring third-party applications and tools along with it. That's where the newly opened Microsoft Azure Marketplace comes in. The company announced the Microsoft Azure Marketplace at a press and analyst briefing in San Francisco late last month led by CEO Satya Nadella and Scott Guthrie, executive VP of cloud and enterprise. As the name implies, it's a central marketplace in which providers can deliver software to customers as virtual images that run in Azure.
A variety of providers have already ported these virtual images to the marketplace -- some are pure software vendors, while others are providers of vertical industry solutions -- and a number of notable offerings have started appearing. Many providers announced their offerings at last month's TechEd conference in Barcelona.
One that Microsoft gave special attention to at the launch of the Azure Marketplace was Cloudera, the popular supplier of the Apache Hadoop distribution. Cloudera has agreed to port its Cloudera Enterprise distribution, on which many Big Data apps are developed, to Microsoft Azure. That's noteworthy because Microsoft's own Azure HDInsight Hadoop-as-a-service is based on the Hortonworks Apache Hadoop distribution. While the move could cannibalize Azure HDInsight, customers already committed to Cloudera would be far less likely to come to Azure if Cloudera weren't there.
"To date, most of our customers have built large infrastructures on premises to run those systems, but there's increasing interest in public cloud deployment and in hybrid cloud deployment, because infrastructure running in the datacenter needs to connect to infrastructure in the public cloud," said Cloudera Founder and Chief Strategy Officer Mike Olsen, speaking at the Microsoft cloud briefing in San Francisco. "This we believe is, for our customers, a major step forward in making the platform more consumable still."
Also up and running in the Azure Marketplace is Kemp Technologies, a popular provider of Windows Server load balancers and application delivery controllers. The Kemp Virtual LoadMaster for Azure lets customers create a virtual machine (VM) optimized to run natively in the Microsoft cloud, said Maurice McMullin, a Kemp product manager.
"Even though Azure itself does have a load balancer, it's a pretty rudimentary one," McMullin said. "Having the Kemp load balancer in there totally integrated into the Azure environment allows you to script some of those environments and application scenarios. The impact of that is, for an organization that's looking toward the cloud, one of the big challenges is trying to maintain the consistency by having a consistent load balancer from on premises, meaning you get a single management interface and consistent management of apps and policies on premises or in the cloud."
Lieberman Software has made available as a virtual image in the marketplace its Enterprise Random Password Manager (ERPM), which the company said provides enterprise-level access controls over privileged accounts throughout the IT stack, both on premises and now in Azure.
The company says ERPM removes persistent access to sensitive systems by automatically discovering, securing and auditing privileged accounts across all systems and apps within an enterprise. Authorized administrators can delegate to users quick access to specific business applications, as well as corporate social media sites, in a secure environment, and those activities are automatically recorded and audited. ERPM also ensures that access to such identities is temporary, preventing unauthorized or anonymous access to sensitive data.
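The pattern the article describes -- time-limited checkout of a privileged credential, followed by automatic rotation -- can be sketched as follows. This is a conceptual model only, not Lieberman's ERPM; the account name and TTL are made up:

```python
# Sketch of time-limited privileged-account checkout with automatic
# password rotation. Conceptual only -- not Lieberman's ERPM.
import secrets

class PasswordVault:
    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self.accounts = {}  # account -> (password, checkout expiry or None)

    def enroll(self, account):
        self.accounts[account] = (secrets.token_urlsafe(16), None)

    def check_out(self, account, now):
        """Issue the credential and start its expiry clock."""
        password, _ = self.accounts[account]
        self.accounts[account] = (password, now + self.ttl)
        return password

    def expire(self, now):
        """Rotate any account whose checkout window has closed."""
        for account, (password, expiry) in list(self.accounts.items()):
            if expiry is not None and now >= expiry:
                self.accounts[account] = (secrets.token_urlsafe(16), None)

vault = PasswordVault(ttl_seconds=60)
vault.enroll("sql-admin")                 # hypothetical account name
issued = vault.check_out("sql-admin", now=0)
vault.expire(now=61)                      # window closed: rotate
rotated, _ = vault.accounts["sql-admin"]
print(issued != rotated)                  # the old credential is now dead
```

Rotation on expiry is what makes the access genuinely temporary: even a copied password stops working once the window closes.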
Another security tool is available from Waratek Ltd., a supplier of a Java Virtual Machine (JVM) container, which lets enterprises bring their own security to the cloud. Called Runtime Application Self-Protection (RASP), it monitors for key security issues and provides policy enforcement and attack blocking from the JVM.
In the JVM, the company offers a secure container where administrators can remotely control their own security at the application level, said Waratek CEO Brian Maccaba. "This is over and beyond anything the cloud provider can do for you and it's in your control," Maccaba says. "You're not handing it to Microsoft or Amazon -- you're regaining the reins, even though it's on the cloud."
The number of offerings in the Azure Marketplace is still relatively small -- close to 1,000 based on a search of the portal -- though it is growing.
Posted on 11/10/2014 at 12:44 PM
Microsoft got some positive ink yesterday when it announced that Office 365 users on iPhones and iPads can now edit their documents for free and that the same capability was coming to Android tablets. Indeed it is good news for anyone who uses one or more of those devices (which is almost everyone these days).
But before you get too excited, you should read the fine print. As Directions on Microsoft Analyst Wes Miller noted on his blog, "Office is free for you to use on your smartphone or tablet if, and only if you are not using it for commercial purposes [and] you are not performing advanced editing."
If you do fit into the above-mentioned buckets or you want the unlimited storage and new Dropbox integration, it requires either an Office 365 Personal, Home or a commercial Office 365 subscription that comes with the Office 365 ProPlus desktop suite, Miller noted. As Computerworld's Gregg Keizer put it: "What Microsoft did Thursday was move the boundary between free and paid, shifting the line."
In Microsoft's blog post announcing the latest free offering, it does subtly note that this offer may not be entirely free. "Starting today, people can create and edit Office content on iPhones, iPads, and soon, Android tablets using Office apps without an Office 365 subscription," wrote Microsoft Corporate VP for Microsoft Office John Case, though that fine print was at the end of his post. "Of course Office 365 subscribers will continue to benefit from the full Office experience across devices with advanced editing and collaboration capabilities, unlimited OneDrive storage, Dropbox integration and a number of other benefits." Microsoft offers similar wording on the bottom of its press release issued yesterday.
Still, while noting this is great news for consumers, it's going to be problematic for IT organizations, Miller warned, especially those that have loose BYOD policies. "For commercial organizations, I'm concerned about how they can prevent this becoming a large license compliance issue when employees bring their own iPads in to work."
Are you concerned about this as well?
Posted by Jeffrey Schwartz on 11/07/2014 at 11:04 AM
BlueStripe Embeds App Monitor into System Center, Windows Azure Pack
BlueStripe Software is now offering its Performance Center tool as a management pack for Microsoft System Center 2012 R2 Operations Manager. The company earlier this year released the dashboard component of FactFinder, which monitors distributed applications across numerous modern and legacy platforms.
With the addition of Performance Center, the company has embedded its core FactFinder tool into System Center. FactFinder can monitor everything from mainframe infrastructure, including CICS and SAP R/3 transactions, to applications running on Unix, Linux and Windows infrastructures. BlueStripe said it provides visibility into application components and the root causes of performance problems across physical, virtual and cloud environments. It works with third-party public cloud services, as well.
FactFinder integrates with Operations Manager workflows, providing data such as response times, failed connections, application loads and server conditions, the company said. It also maps all business transactions by measuring performance across each hop of a given chain and is designed to drill into the server stack to determine the cause of a slow or failing transaction.
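The per-hop approach can be illustrated with a short sketch (the hop names and timings are hypothetical, and this is not FactFinder's data model): given a response-time measurement for each hop in a transaction chain, the slowest hop tells you where to drill in.

```python
# Sketch of per-hop bottleneck detection for a distributed transaction.
# Hop names and timings are hypothetical illustration data.

def slowest_hop(hops):
    """Return the (hop name, milliseconds) pair with the largest latency."""
    return max(hops.items(), key=lambda item: item[1])

transaction = {
    "web -> app server": 12.0,
    "app server -> database": 480.0,
    "database -> storage": 35.0,
}

name, latency = slowest_hop(transaction)
print(f"Bottleneck: {name} at {latency} ms "
      f"of {sum(transaction.values())} ms total")
```

A monitoring product does the same comparison continuously and against baselines, but the core question -- which hop dominates the end-to-end time -- is this one.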
In addition to the new System Center Management Pack, BlueStripe launched Performance Center for the Windows Azure Pack, which is designed to provide administrators common visibility of their Windows Server and Microsoft Azure environments. This lets administrators and application owners monitor the performance via the Windows Azure Pack.
BlueStripe Marketing Manager Dave Mountain attended last week's TechEd Conference in Barcelona and said he was surprised at the amount of uptake for the Windows Azure Pack. "There's a recognition of the need for IT to operate in a hybrid cloud world," Mountain said. "IT's reason for existing is to ensure the delivery of business services. Tools that allow them to focus on app performance will be valuable and that's what we are doing with FactFinder Performance Center for Windows Azure Pack."
Netwrix Tackles Insider Threats with Auditor Upgrade
Netwrix Corp. has upgraded its auditing software to offer improved visibility into insider threats and warn of data leaks more quickly. The new Netwrix Auditor 6.5 offers deeper monitoring of log files and privileged accounts, which in turn provides improved visibility into changes made across a network, including file servers and file shares.
The new release converts audit logs into more human-readable formats, according to the company. It also lets IT managers and systems analysts audit configurations from any point in time, while providing archives of historical data against which to match. Netwrix said this ensures compliance with security policies and thwarts rogue employees seeking to make unauthorized changes.
In all, Netwrix said it has added more than 30 improvements to the new release of Auditor, resulting in higher scalability and performance.
Riverbed Extends Visibility and Control
Riverbed Technology this week launched the latest version of its SteelHead WAN optimization platform, including a new release of its SteelCentral AppResponse management tool to monitor hybrid environments, including Software-as-a-Service (SaaS) apps.
Core to the new SteelHead 9.0 is its tight integration with SteelCentral AppResponse, which Riverbed said simplifies the ability to troubleshoot applications using the app's analytics engine, making it easier to manage such processes as policy configuration, patch management, reporting and troubleshooting. The SteelCentral dashboard lets administrators track performance of applications, networks, quality of service and reports on how policies are maintained.
SteelCentral AppResponse 9.5 also gives administrators metrics on end-user experiences of traditional and SaaS-based apps, even if they're not optimized by the SteelHead WAN platform. Riverbed said providing this information aims to let IT groups respond to business requirements and to issues causing degraded performance. The new SteelHead 9.0 also is designed to ensure optimized performance of Office 365 mailboxes.
Posted by Jeffrey Schwartz on 11/07/2014 at 10:50 AM
A majority of chief information security officers (CISOs) at some of the largest organizations strongly believe that the sophistication of attackers is outstripping their own ability to fend them off, and that the number of threats has increased markedly. According to IBM's third annual CISO study, 59 percent are concerned about their inability to keep pace, with 40 percent calling it their top security challenge.
Moreover, 83 percent said external threats have increased over the past three years with 42 percent of them saying the increases were dramatic. IBM revealed results of its study at a gathering of CISOs held at its New York offices.
The survey also found CISOs are more frequently questioned by the C-suite and corporate boards, while changes to the global regulatory landscape promise to further complicate efforts to stem threats. Kristin Lovejoy, IBM's general manager of security services, said malware creation is a big business in unregulated countries, where most attacks originate.
"Where we say we're worried about external attackers and we're worried about financial crime data theft, there's a correlation between people getting Internet access in unregulated, unlegislated countries where it's an economic means of getting out," Lovejoy said. "When you interview the criminals, they don't even know they're performing a crime -- they're just building code. We have to be careful here, this external attacker thing, it's not going to get any better, it's going to get worse."
Most are able to exploit the naivety of employees, she added, noting 80 to 90 percent of all security incidents were because of human error. "They're getting in because users are pretty dumb," she said. "They click on stuff all the time. It's going to continue." She added organizations that are most secure are those that have good IT hygiene, automation, configuration management, asset management, especially those that implement ITIL practices.
Posted by Jeffrey Schwartz on 11/05/2014 at 1:23 PM
A survey of small and medium enterprises found that only 8 percent are prepared to recover from an unplanned IT outage, while 23 percent of them report it would take more than a day to resume operations.
Underscoring the risk to companies with fewer than 1,000 employees, a vast majority of the 453 organizations surveyed have experienced a major IT outage in the past two years. Companies with 50 to 250 employees were especially at risk. A reported 83 percent have gone through a major IT failure, while 74 percent of organizations with 250 to 1,000 employees have experienced a significant outage.
One-third are using cloud-based disaster recovery as a service, which has rapidly started to gain momentum this year, according to the survey, conducted by Dimensional Research and sponsored by DRaaS provider Axcient. Daniel Kuperman, director of product marketing at Axcient, said the results confirmed what the company had suspected. "In a lot of cases, companies still don't put emphasis on disaster recovery," he said.
Axcient didn't reveal whose DRaaS offerings the organizations were using, though Kuperman said Dimensional selected the companies it polled from the researcher's own resources. DRaaS is one of the leading use cases for organizations making their foray into cloud services.
A survey by cloud infrastructure provider EvolveIP last month found that nearly 50 percent benefitted from a recovery cloud service by avoiding outages from a disaster. Nearly three quarters, or 73 percent, cited the ability to recover from an outage as the prime benefit of using a cloud service. As a result, 42 percent of those responding to EvolveIP's survey have increased their cloud spending budgets this year, while 54 percent plan to do so in 2015.
Posted by Jeffrey Schwartz on 11/05/2014 at 12:01 PM
In its latest bid to offer better failover and replication in its software and cloud infrastructure, Microsoft demonstrated its new Storage Replica technology at last week's TechEd conference in Barcelona.
Microsoft Principal Program Manager Jeff Woolsey demonstrated Storage Replica during the opening TechEd keynote. Storage Replica, which Microsoft sometimes calls Windows Volume Replication (or WVR), provides block-level, synchronous replication between servers or clusters for disaster recovery, according to a Microsoft white paper published last month. The new replication engine is storage-agnostic, and Microsoft says it can also stretch a failover cluster for high availability.
Most notable is that Storage Replica provides synchronous replication, which, as Microsoft describes it, enables organizations to mirror data within the datacenter with "crash-consistent volumes." The result, says Microsoft, is zero data loss at the file system level. By comparison, asynchronous replication, which Microsoft added to Windows Server 2012 via Hyper-V Replica and updated in last year's Windows Server 2012 R2 release, allows site extension beyond the limitations of a local metropolitan area. Asynchronous replication carries a higher possibility of data loss or delay and may not be suited for scenarios where instantaneous real-time availability is a requirement, though for general purposes it's considered adequate.
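The distinction between the two modes comes down to when a write is acknowledged, and it can be sketched in a few lines (a conceptual model, not Storage Replica's implementation): a synchronous write is acknowledged only after both the source and the replica have committed it, so an acknowledged write can never be lost; an asynchronous write is acknowledged as soon as the source commits, leaving a window where acknowledged data exists only on the source.

```python
# Conceptual model of synchronous vs. asynchronous replication -- just
# the acknowledgment rule that yields "zero data loss" in the sync case.
# Not Storage Replica's implementation.

class ReplicatedVolume:
    def __init__(self, synchronous):
        self.synchronous = synchronous
        self.source = []   # blocks committed on the source volume
        self.replica = []  # blocks committed on the replica

    def write(self, block):
        self.source.append(block)
        if self.synchronous:
            # Wait for the partner to commit; only then acknowledge.
            self.replica.append(block)
        # Asynchronous: acknowledge now, ship the block later.
        return "ack"

    def lost_on_failover(self):
        """Acknowledged blocks missing from the replica if the source dies."""
        return [b for b in self.source if b not in self.replica]

sync_vol = ReplicatedVolume(synchronous=True)
async_vol = ReplicatedVolume(synchronous=False)
for vol in (sync_vol, async_vol):
    vol.write("block-1")
    vol.write("block-2")

print(sync_vol.lost_on_failover())   # []
print(async_vol.lost_on_failover())  # both blocks, until they are shipped
```

The trade-off the article describes falls out of the same rule: waiting for the remote commit caps how far apart the sites can be, which is why asynchronous mode is what stretches beyond a metropolitan area.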
In the TechEd demo, Woolsey simulated a scenario with four server nodes, two in New York and the other across the river in New Jersey. The goal is to ensure that if users are unable to access data on the two nodes in New York, they automatically and transparently fail over to New Jersey without losing any data, Woolsey explained. It also uses a new feature in the Microsoft Azure service called Cloud Witness.
"To do a stretch cluster you need to have a vote for the cluster quorum," Woolsey explained. "In the past, this meant extra hardware, extra infrastructure, extra cost. Now we're just making this part of Azure as well. So that's an option to take advantage of the Cloud Witness. As you can see, we're baking hybrid capabilities right into Windows Server."
In the demo, Woolsey accessed the file share data to enable replication via the new storage replication wizard. From there he selected the source log disk, then the destination storage volume and log disk. "Literally in just a few clicks, that's it, I've gone ahead and I've set up synchronous replication," he said.
According to the recently published white paper, the following features are implemented in the Windows Server Technical Preview:

- Synchronous replication: yes (server to server only)
- Storage hardware agnostic
- Windows Server stretch cluster creation
- Write order consistency across volumes
- Transport over TCP/IP or RDMA
- Replication network port firewall requirements: a single IANA port (TCP 445 or 5445)
- Over-the-wire encryption and signing
- Per-volume failovers allowed
- Dedup and BitLocker volume support
- Management UI in-box: Windows PowerShell and Failover Cluster Manager
Microsoft also has emphasized that Storage Replica is not intended for backup and recovery scenarios. And because it is a general-purpose technology, the company noted, it may not suit the behavior of specific applications. Microsoft warns that organizations could see feature gaps with Storage Replica and hence may be better served by app-specific replication technologies.
What's your take on Microsoft's latest efforts to embed disaster recovery into Windows Server and Azure?
Posted by Jeffrey Schwartz on 11/03/2014 at 1:26 PM
Microsoft used its TechEd conference in Barcelona this week to give customers a first look at the new Azure cloud in a box. The so-called Cloud Platform System (CPS), announced at an event held last week in San Francisco led by CEO Satya Nadella and Executive VP for Cloud and Enterprise Scott Guthrie, is Microsoft's effort to let customers or hosting providers run their own Azure clouds.
The first CPS is available from Dell, though describing the company as the "first" to provide one implies that other major hardware providers may have plans for their own iterations -- or perhaps it's only at the wishful thinking stage. At any rate, CPS has been a long time coming.
As you may recall, Microsoft first announced plans to release such an offering more than four years ago. At the time, Dell, Hewlett Packard and Fujitsu were planning to offer what was then coined the Windows Azure Platform Appliance, and eBay had planned to run one. Though Microsoft took it on a roadshow that year, it suddenly disappeared.
Now it's back and Corporate VP Jason Zander showcased it in his TechEd Europe opening keynote, inviting attendees to check it out on the show floor. "This is an Azure-consistent cloud in a box," he said. "We think this is going to give you the ability to adopt the cloud with even greater control. You energize it, you hook it up to your network and you're basically good to go."
The CPS appears more modest than the original Windows Azure Platform Appliance in that it is sold as a converged rack-based system and doesn't come in prefabricated containers with air conditioning and cooling systems. The racks are configured with Dell PowerEdge servers, storage enclosures and network switches. Each rack includes 32 CPU nodes and up to 282TB of storage. On the software side, customers get Windows Server 2012 R2 with Hyper-V, configured in a virtualized multi-tenant architecture, System Center 2012 R2 and the Windows Azure Pack to provide the Azure-consistent functionality within a customer's datacenter.
So far, the first two known customers of the CPS are NTTX, which will use it to provide its own Azure infrastructure as a service in Japan, and CapGemini, which will provide its own solutions for customers running in the Azure cloud.
CapGemini is using it for an offering called SkySight, which will run a variety of applications including SharePoint and Lync, as well as a secure, policy-driven orchestration service based on its own implementation of Azure. "SkySight is a hybrid solution where we will deliver a complete integrated application store and a developer studio all using Microsoft technologies," said CapGemini Corporate VP Peter Croes, in a pre-recorded video presented by Zander during the keynote. "CPS for me is the integrated platform for public and private cloud. Actually it's the ideal platform to deliver the hybrid solution. That is what the customers are looking for."
Microsoft last week tried to differentiate itself from Amazon Web Services and Google in its hybrid approach. CPS could become an important component of Azure's overall success.
Posted by Jeffrey Schwartz on 10/31/2014 at 1:00 PM
Andy Rubin is leaving Google to join a technology incubator dedicated to startups building hardware, according to published reports. Whether you are an Android fan or not, it's hard to argue that Google's acquisition of the company Rubin founded wasn't one of the most significant deals made by the search giant.
Rubin had led the Android team since Google acquired the company in 2005, but Google reassigned him last year to lead its moves into the field of robotics, which included overseeing the acquisition of numerous startups.
Rubin's departure comes a week after Google CEO Larry Page promoted Sundar Pichai to head up all of Google's product lines except for YouTube. Page in a memo to employees published in The Wall Street Journal said he's looking to create a management structure that can make "faster, better decisions."
That effectively put Pichai in charge of the emerging robotics business as well. A spokesman told The New York Times that James Kuffner, who has worked with Google's self-driving cars, will lead the company's robotics efforts.
The move comes ironically on the same day that Google sold off the hardware portion of its Motorola Mobility business to Lenovo. The two companies yesterday closed on the deal, announced in January. Lenovo said it will continue to leverage the Motorola brand.
As for Rubin, now that he's incubating startups, he'll no doubt be on the sell-side of some interesting companies again.
Posted by Jeffrey Schwartz on 10/31/2014 at 12:58 PM
Microsoft kicked off what looks to be its final TechEd conference with the launch of new services designed to simplify the deployment, security and management of apps running in its cloud infrastructure. In the opening keynote presentation at TechEd, taking place in Barcelona, officials emphasized new capabilities that enable automation and the ability to better monitor the performance of specific nodes.
A new feature called Azure Operational Insights will tie the cloud service and Azure HDInsight to Microsoft's System Center management platform. The service will use HDInsight, the Apache Hadoop-based Big Data analytics service, to monitor and analyze machine data from cloud environments and determine where IT pros need to reallocate capacity.
Azure Operational Insights, which will be available in preview mode next month (a limited preview is currently available), initially will address four key functions: log management, change tracking, capacity planning and update assessment. It uses the Microsoft Monitoring Agent, which incorporates an application performance monitor for .NET apps and the IntelliTrace Collector in Microsoft's Visual Studio development tooling, to collect complete application-profiling traces. Microsoft offers the Monitoring Agent as a standalone tool or as a plugin to System Center Operations Manager.
Dave Mountain, vice president of marketing at BlueStripe Software, was impressed with the amount of information it gathers and the way it's presented. "If you look at it, this is a tool for plugging together management data and displaying it clearly," Mountain said. "The interface is very slick, there's a lot of customization and it's tile-based."
On the heels of last week's announcement that it will support the more-robust G-series of virtual machines, which boast up to 32 CPU cores based on Intel's newest Xeon processors, 448GB of RAM and 6.5TB of local SSD storage, Microsoft debuted Azure Batch, which officials say is designed to let customers use Azure for jobs that require "massive" scale out. The preview is available now.
Azure Batch is based on the job scheduling engine used by Microsoft internally to manage the encoding of Azure Media Services and for testing the Azure infrastructure itself, said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, in a blog post today.
"This new platform service provides 'job scheduling as a service' with auto-scaling of compute resources, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure," Guthrie said. "You submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down as work completes."
The new Azure Batch SDK is based on the application framework from GreenButton, a New Zealand-based company that Microsoft acquired in May, Guthrie noted. "The Azure Batch SDK makes it easy to cloud-enable parallel, cluster and HPC applications by describing jobs with the required resources, data and one or more compute tasks," he said. "With job scheduling as a service, Azure developers can focus on using batch computing in their applications and delivering services without needing to build and manage a work queue, scaling resources up and down efficiently, dispatching tasks, and handling failures."
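The loop Guthrie describes -- "you submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down" -- can be sketched as a minimal scheduler. This is a conceptual sketch of the pattern, not the Azure Batch SDK; the task definitions and retry limit are made up:

```python
# Minimal "job scheduling as a service" sketch: queue tasks, size a
# worker pool to the work, retry failures, finish when the queue drains.
# Conceptual only -- not the Azure Batch SDK.
from collections import deque

def run_batch(tasks, max_workers=4, max_retries=2):
    queue = deque((task, 0) for task in tasks)
    workers = min(max_workers, len(queue))  # "we start the VMs"
    completed, failed = [], []
    while queue:
        task, attempts = queue.popleft()
        try:
            completed.append(task())        # "run your tasks"
        except Exception:
            if attempts < max_retries:      # "handle any failures"
                queue.append((task, attempts + 1))
            else:
                failed.append(task)
    return completed, failed, workers       # pool winds down as work completes

state = {"calls": 0}
def flaky():                                # fails once, then succeeds
    state["calls"] += 1
    if state["calls"] == 1:
        raise RuntimeError("transient failure")
    return "ok"

done, failed, pool = run_batch([flaky, lambda: "done"])
print(done, failed, pool)   # ['done', 'ok'] [] 2
```

The service's value is running this loop at datacenter scale -- provisioning real VMs, dispatching across them and scaling the pool up and down -- so the developer writes only the tasks.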
Microsoft also said it has made its Azure Automation service generally available. The tool is designed to automate repetitive cloud management tasks that are time consuming and prone to error, the company said. It's designed to use existing PowerShell workflows or IT pros can deploy their own.
Also now generally available is WebJobs, the component of Microsoft Azure Websites designed to simplify the running of programs, services or background tasks on a Web site, according to a post today on the Microsoft Azure blog by Product Marketing Manager Vibhor Kapoor.
"WebJobs inherits all the goodness of Azure Websites -- deployment options, remote debugging capabilities, load balancing and auto-scaling," Kapoor noted. "Jobs can run in one instance, or in all of them. With WebJobs all the building blocks are there to build something amazing or, small background jobs to perform maintenance for a Web site."
Posted by Jeffrey Schwartz on 10/28/2014 at 10:59 AM
Will Microsoft's final TechEd conference this week in Barcelona go out with a bang? We'll have a better sense of that over the next two days as the company reveals the next set of deliverables for the datacenter and the cloud. Microsoft has kept a tight lid on what's planned but we should be on the lookout for info pertaining to the next versions of Windows Server, System Center and Hyper-V, along with how Microsoft sees containers helping advance virtualization and cloud interoperability.
In case you missed it, this week's TechEd, the twice-yearly conference Microsoft has held for nearly two decades, will be the last. Instead, Microsoft said earlier this month it will hold a broader conference for IT pros and developers called Ignite, to be held in Chicago during the first week of May. Ignite will effectively envelop TechEd and the SharePoint and Exchange conferences.
Given the company's statements about faster release cycles, if officials don't reveal what's planned for the next releases of Windows Server, System Center and the so-called "Cloud OS" tools that enable it to provide an Azure-like infrastructure within the datacenter, partner cloud services and its own public cloud, I'd be quite surprised.
If you caught the presentations made last week by CEO Satya Nadella and Scott Guthrie, EVP of Microsoft's cloud and enterprise group, it was clear that besides some noteworthy announcements, they were aimed at priming the pump for future ones. For example, Microsoft announced the Azure Marketplace, where ISV partners can offer virtual machine images designed to accelerate the use of Azure as a platform. Also revealed was the Azure G-series of virtual machines powered by the latest Intel Xeon processors, which Guthrie claimed will be the largest VMs available in the public cloud -- at least for now. Guthrie said the new VMs provide twice the memory of the largest Amazon cloud machine.
As Microsoft steps up its moves into containerization with the recent announcement that it's working with Docker to create Docker containers for Windows Server, it will be interesting to hear how that will play into the next release of the server operating system. It will also be interesting to learn to what extent Microsoft will emphasize capabilities in Windows Server and Azure that offer more automation as the company moves to build on the evolving software-defined datacenter.
The opening keynote is tomorrow, when we'll find out how much Microsoft intends to disclose about what's next for its core enterprise datacenter and cloud platforms. I'd be surprised and disappointed if it wasn't substantive.
Posted by Jeffrey Schwartz on 10/27/2014 at 3:17 PM
While almost every part of Microsoft's business faces huge pressure from disruptive technology and competitors, the software that put the company on the map -- Windows -- continues to show it's not going to go quietly into the night. Given Microsoft's surprise report that Surface sales have surged and the company promising new capabilities in the forthcoming release of Windows 10, expectations of the operating system's demise are at least premature and potentially postponed indefinitely.
Despite the debacle with its first Surface rollout two years ago, this year's release of the Surface Pro 3 and the resulting impressive performance shows that Windows still has a chance to remain relevant despite the overwhelming popularity of iOS and Android among consumers and enterprises. Granted, we now live in a multiplatform world, which is a good thing that's not going to change. The only question still to play out is where Windows will fit in the coming years and this will be determined by Microsoft making the right moves. Missteps by Apple and Google going forward will play a role as well of course, but the ball is in Microsoft's court to get Windows right.
Amid yesterday's impressive results for the first quarter of Microsoft's 2015 fiscal year were increases along key lines including Office 365, enterprise software and the cloud business, along with the disclosure of $908 million in revenue for its Surface business. That's more than double what the Surface line brought in a year earlier, and the report covers the first full quarter the new Surface Pro 3 has been on the market. Presuming there wasn't significant channel stuffing, this is promising news for the future of Windows overall.
Indeed, while encouraging, the latest report on Surface sales doesn't mean Windows is out of the woods. Despite the surge in revenues, Microsoft didn't reveal Surface unit sales. And while the company said its Surface business is now showing "positive gross margins" -- a notable milestone given the $900 million charge the company took five quarters ago due to poor device sales -- Microsoft didn't say how profitable the devices are, said Patrick Moorhead, principal analyst with Moor Insights & Strategy.
Moorhead noted that marketing costs and the expense of building out Microsoft's much-improved global channel and distribution reach likely consumed much of that gross margin. "I can say with 99 percent confidence they are losing money on Surface still," he predicted. "That may not be bad for two reasons. They need to demonstrate Windows 8 can provide a good experience and second of all it puts additional pressure on traditional OEMs that they need to be doing a better job than what they do."
Also worth noting: the $908 million in Surface revenue was about 17 percent of the $5.3 billion Apple took in for iPads during the same period (revenue for Macintoshes, which are in many ways more comparable to the Surface Pro 3, was $6.6 billion, Apple said). Apple's iPads, which often displace PCs for many tasks, are also hugely profitable, though ironically sales of the tablets have declined for the past three quarters amid the sudden surge in Surface sales. Naturally the devices have different capabilities, but the point is to underscore the positive signs that the growth of Surface portends for the future of Windows.
Moorhead said the current quarter, and notably holiday sales of all Windows devices led by an expected onslaught of dirt-cheap Windows tablets (possibly as low as $99), could be an inflection point, though he warned that Microsoft will need to execute. "If Microsoft continues to operate the way they are operating, they will continue to lose considerable consumer relevance," he said. "If during the holidays, they make news and sell big volumes, I would start to think otherwise."
Key to the quarter's turnaround was the company's expanded global distribution and extended sales to corporations through its channel partners, though that effort is still at a formative stage. Despite his skeptical warning, Moorhead believes Google's failure to displace Windows PCs with large Android devices and Chromebooks gives Microsoft a strong shot at keeping Windows relevant.
"Google had this huge opportunity to bring the pain on Microsoft with larger devices and eat into notebooks," he said. "They never did it. They really blew their opportunity when they had it. While Android may have cleaned up with phones, when you think about it what they did was just blocking Microsoft as opposed to going after Microsoft, which would be in larger form factor devices in the form of Android notebooks and Chromebooks. The apps are designed for 4-inch displays, not a 15-inch display or 17-inch display. And with Chrome, its offline capabilities just came in too slowly and there really aren't a lot of apps. They just added the capability to add real apps."
Meanwhile, Moorhead pointed out that Apple this month has delivered on what Microsoft is aiming to do: provide an experience that lets a user begin a task on say an iPhone and resume that task on an iPad or Mac.
Hence keeping Windows relevant, among other things, may rest on Microsoft's ability to deliver a Windows 10 that can do that and improve on the OS' general lack of apps, which in the long run would incent developers to come back. The promising part of that is the renewed focus on the desktop, Moorhead said. "When they converge the bits in a meaningful way, I think they can hit the long tail because of the way they're doing Windows 10 with Windows apps and the ability to leverage those 300 million units to a 7-inch tablet and a 4-inch phone. I think that is an enticing value proposition for developers."
Given the target audience of business users for the Surface Pro 3, it also is a promising signal for the prospects of Windows holding its own in the enterprise. Do you find the new surge in Surface sales, coupled with the design goals of Windows 10, to be encouraging signs for the future of Windows, or do you see it more as one last burst of energy?
Posted by Jeffrey Schwartz on 10/24/2014 at 12:48 PM
Microsoft may be trying to compete with IBM in the emerging market for machine learning-based intelligence, but like many rivals, the two companies, which share a storied past, have their share of mutual interests even as they tout competing public enterprise clouds. Hence they are the latest to forge a cloud compatibility partnership.
The companies said today they are working together to ensure some of their respective database and middleware offerings can run on both the IBM Cloud and Microsoft Azure. Coming to Microsoft Azure is IBM's WebSphere Liberty application server platform, MQ middleware and DB2 database. IBM's Pure Application Service will also run on Microsoft Azure the two companies said.
In exchange, Windows Server and SQL Server will work on the IBM Cloud. Both companies are collaborating to provide Microsoft's .NET runtime for IBM Bluemix, the company's new cloud development platform. While the IBM Cloud already has support for Microsoft's Hyper-V, IBM said it will add expanded support for the virtualization platform that's included in Windows Server. It was not immediately clear how they will improve Hyper-V support on the IBM Cloud.
Andrew Brust, a research director at Gigaom Research, said that the IBM Cloud, which is based on the SoftLayer public cloud IBM acquired last year for $2 billion, runs a significant amount of Hyper-V instances. "They explained to me that they have a 'non-trivial' amount of Windows business and that they support Hyper-V VMs," Brust said.
"With that in mind, the announcement makes sense, especially when you consider [Microsoft CEO] Satya's [Nadella] comment on Monday that Azure will 'compose' with other clouds," Brust added. Nadella made the comment Monday while articulating Microsoft's strategy to build Azure into a "hyperscale" cloud. "We are not building our hyperscale cloud in Azure in isolation," Nadella said. "We are building it to compose well with other clouds."
Nadella spelled out recent efforts to do that including last week's announcement that Microsoft is working with Docker to develop Docker containers for Windows Server, its support for native Java via its Oracle partnership (which, like IBM, includes its database and middleware offerings) as well as broad support for other languages including PHP, Python and Node.js. "This is just a subset of the open source as well as other middle-tier frameworks and languages that are supported on Azure," Nadella said at the event.
Most analysts agree that Amazon, Microsoft and Google operate the world's largest cloud infrastructures but with SoftLayer, IBM has a formidable public cloud as well. Both IBM and Microsoft are seeing considerable growth with their respective cloud offerings but have reasonably sized holes to fill as well.
Nadella said Monday that Microsoft has a $4.4 billion cloud business -- still a small fraction of its overall revenues but rapidly growing. For its part, IBM said on its earnings call Monday that its public cloud infrastructure is at a $3.1 billion run rate and its overall cloud business is up 50 percent, though the company's spectacular earnings miss has Wall Street wondering if IBM has failed to move quickly enough. The company's shares have tumbled in recent days and analysts are questioning whether the company needs a reboot similar to the one former CEO Lou Gerstner gave it two decades ago.
"Overall, this looks like a marriage of equals where both stand to gain by working harmoniously together," said Pund-IT analyst Charles King. Forrester Research analyst James Staten agreed. "IBM and Microsoft both need each other in this regard so a nice quid pro quo here," he said.
For Microsoft, adding IBM to the mix is just the latest in a spate of cloud partnerships. In addition to its partnership with Oracle last year, Microsoft recently announced a once-unthinkable cloud partnership with Salesforce.com and just tapped Dell to deliver its latest offering, the new Cloud Platform System, which the company describes as an "Azure-consistent cloud in a box" that it will begin offering to customers next month.
It also appears that IBM and Microsoft held back some of their crown jewels in this partnership. There was no mention of IBM's Watson or Big SQL, which is part of its InfoSphere Platform on Hadoop, based on a Hadoop Distributed File System (HDFS). During a briefing last week at Strata + Hadoop World in New York, IBM VP for Big Data Anjul Bhambhri described the recent third release of Big SQL in use with some big insurance companies. "Some of their queries which they were using on Hive, were taking 45 minutes to run," she said. "In Big SQL those kinds of things are 17 rejoins is now less than 5 minutes."
Likewise, the announcement doesn't seem to cover Microsoft's Azure Machine Learning or Azure HDInsight offerings. I checked with both companies; while both are looking into it, there was no response as of this posting. It also wasn't immediately clear when the offerings announced would be available.
Update: A Microsoft spokeswoman responded to some questions posed on the rollout of the services on both companies' cloud. Regarding the availability of IBM's software on Azure: "In the coming weeks, Microsoft Open Technologies, a wholly owned subsidiary of Microsoft, will publish license-included virtual machine images with key IBM software pre-installed," she stated. "Customers can take advantage of these virtual machines to use the included IBM software in a 'pay-per-use' fashion. Effective immediately, IBM has updated its policies to allow customers to bring their own license to Microsoft Azure by installing supported IBM software on a virtual machine in Azure."
As it pertains to using Microsoft's software in the IBM cloud, she noted: "Windows Server and SQL Server are available for use on IBM Cloud effective immediately. IBM will be offering a limited preview of .NET on IBM Cloud in the near future." And regarding plans to offer improved support for Hyper-V in the IBM Cloud: "Hyper-V is ready to run very well on IBM SoftLayer to provide virtualized infrastructure and apps. IBM is expanding its product support for Hyper-V."
Posted by Jeffrey Schwartz on 10/22/2014 at 2:22 PM
The launch today of the new Apple Pay service for users of the newest iPhone and iPad -- and ultimately the Apple Watch -- is a stark reminder that Microsoft has remained largely quiet about its plans to pursue this market when it comes to Windows Phone or through any other channels.
If smartphone-based payments or the ability to pay for goods with other peripherals such as watches does take off in the coming year, it could be the latest reason consumers shun Windows Phone, which despite a growing number of apps, still is way behind the two market leaders.
So if payments become the new killer app for smartphones, is it too late for Microsoft to add them to Windows Phone? The bigger question: is it too late for Microsoft as a company? Perhaps the simplest way to jump in would be to buy PayPal, which eBay last month said it will spin off. The problem there is that eBay has an estimated market valuation of $65 billion -- too steep even for Microsoft.
If Microsoft still wants to get into e-payment -- which, in addition to boosting Windows Phone, could benefit Microsoft in other ways including its Xbox, Dynamics and Skype businesses, among others -- the company could buy an emerging e-payment company such as Square, which is said to be valued at a still-steep (but more comfortable) $6 billion.
Just as Microsoft's Bill Gates had visions of bringing Windows to smartphones nearly two decades ago, he also foresaw an e-payments market similar to the one now emerging. Gates was reminded of the fact that he described possible e-payment tech in his book, "The Road Ahead," by Bloomberg Television's Erik Schatzker in an interview released Oct. 2.
"Apple Pay is a great example of how a cellphone that identifies its user in a pretty strong way lets you make a transaction that should be very, very inexpensive," Gates said. "The fact that in any application I can buy something, that's fantastic. The fact I don't need a physical card any more -- I just do that transaction and you're going to be quite sure about who it is on the other end -- that is a real contribution. And all the platforms, whether it's Apple's or Google's or Microsoft, you'll see this payment capability get built in. That's built on industry standard protocols, NFC and these companies have all participated in getting those going. Apple will help make sure it gets critical mass for all the devices."
Given his onetime desire to lead Microsoft in offering digital wallet and payment technology, Schatzker asked Gates why Microsoft hasn't entered this market already? "Microsoft has a lot of banks using their technology to do this type of thing," Gates said. "In the mobile devices, the idea that a payment capability and storing the card in a nice secret way, that's going to be there on all the different platforms. Microsoft had a really good vision in this." Gates then subtly reminded Schatzker the point of their interview was to talk about the work of the Bill and Melinda Gates Foundation.
But before shifting back to that topic, Schatzker worked in another couple of questions, notably should Microsoft be a player the way that Apple is looking to become (and Google has) with its digital wallet? "Certainly Microsoft should do as well or better but of all the things that Microsoft needs to do in terms of making people more productive in their work, helping them communicate in new ways, it's a long list of opportunities," he said. "Microsoft has to innovate and taking Office and making it dramatically better would be really high on the list. That's the kind of thing I'm trying to help make sure they move fast on."
For those hoping Microsoft does have plans in this emerging segment, there's reason for optimism. Iain Kennedy last month left Amazon.com, where he managed the company's local commerce team, to take on the new role of senior director of product management for Microsoft's new commerce platform strategy, according to his LinkedIn profile. Before joining Amazon, Kennedy spent four years at American Express.
Together with Gates' remarks, it's safe to presume that Microsoft isn't ignoring the future of digital payments and e-commerce. One sign: Microsoft is getting ready to launch a smartwatch within the next few weeks that is focused on fitness. While that doesn't address e-payments, it's certainly a reasonable way to get into the game. According to a Forbes report today, the watch will work across multiple operating systems.
It's unclear what role Windows Phone will play in bringing a payments service to market, but it's looking less likely that it will have a starring role. And while Gates shifted the conversation to Office, Microsoft's positioning as a "productivity and platforms" company doesn't necessarily mean it has no plans for the e-payments market. If the company moves soon, it may not be too late.
Posted by Jeffrey Schwartz on 10/20/2014 at 1:49 PM
Apparently stung by his remarks last week that women shouldn't ask for raises but instead look for "karma," Microsoft CEO Satya Nadella said he is putting controls in place that will require all employees to attend diversity training workshops to ensure not just equal pay for women but opportunities for advancement regardless of gender or race.
Nadella quickly apologized for his remarks at last week's Grace Hopper Celebration of Women in Computing in Phoenix, and now he's putting his money where his mouth is. Microsoft's HR team reported to Nadella that women in the U.S. last year earned 99.7 percent of what men earned at the same title and rank, according to an e-mail sent to employees Wednesday that was obtained by GeekWire.
"In any given year, any particular group may be slightly above or slightly below 100 percent," he said. "But this obscures an important point: We must ensure not only that everyone receives equal pay for equal work, but that they have the opportunity to do equal work."
Given the attention Microsoft's CEO brought to the issue over the past week, it raises the question: do women in your company earn the same amount as men for the same job title and responsibilities, and have the same opportunities for advancement, or is there a clear bias? IT is an industry dominated by men, though educators are trying to convince more women to take up computer science.
Posted by Jeffrey Schwartz on 10/17/2014 at 12:59 PM
Nearly a year after launching its Hadoop-based Azure HDInsight cloud analytics service, Microsoft believes it's a better and broader solution for real-time analytics and predictive analysis than IBM's widely touted Watson. Big Blue this year has begun commercializing its Watson technology, made famous in 2011 when it came out of the research labs to appear and win on the television game show Jeopardy.
Both companies had a large presence at this year's Strata + Hadoop World Conference in New York, attended by 5,000 Big Data geeks. At the Microsoft booth, Eron Kelly, general manager for SQL Server product marketing, highlighted some key improvements to Microsoft's overall Big Data portfolio since last year's release of Azure HDInsight including SQL Server 2014 with support for in-memory processing, PowerBI and the launch in June of Azure Machine Learning. In addition to bolstering the offering, Microsoft showcased Azure ML's ability to perform real-time predictive analytics for the retail chain Pier One.
"I think it's very similar," in terms of the machine learning capabilities of Watson and Azure ML, Kelly said. "We look at our offering as a self-service on the Web solution where you grab a couple of predictive model clips and you're in production. With Watson, you call in the consultants. It's just a difference fundamentally [that] goes to market versus IBM. I think we have a good advantage of getting scale and broad reach."
Not surprisingly, Anjul Bhambhri, vice president of Big Data for IBM's software group, disagreed. "There are certain applications which could be very complicated which require consulting to get it right," she said. "There's also a lot of innovation that IBM has brought to market around exploration, visualization and discovery of Big Data which doesn't require any consulting." In addition to Watson, IBM offers its InfoSphere BigInsights for Hadoop and Big SQL offerings.
As it broadens its approach with a new "data culture," Microsoft has come on strong with Azure ML, noting it shares many of the real-time predictive analytics of the new personal assistant in Windows Phone called Cortana. Now Microsoft is looking to further broaden the reach of Azure ML with the launch of a new app store-type marketplace where Microsoft and its partners will offer APIs consisting of predictive models that can plug into Azure Machine Learning.
Kicking off the new marketplace, Joseph Sirosh, Microsoft's corporate VP for information management and machine learning, gave a talk at the Strata + Hadoop conference this morning. "Now's the time for us to try to build the new data science economy," he said in his presentation. "Let's see how we might be able to build that. What do data science and machine learning people do typically? They build analytical models. But can you buy them?"
Sirosh said that with Microsoft's new data section of the Azure Marketplace, developers and IT pros can search for predictive analytics components. It consists of APIs developed both by Microsoft and partners. Among those APIs from Microsoft are Frequently Bought Together, Anomaly Detection, Cluster Manager and Lexicon Sentiment Analysis. Third parties selling their APIs and models include Datafinder, MapMechanics and Versium Analytics.
Microsoft's goal is to build up the marketplace for these data models. "As more of you data scientists publish APIs into that marketplace, that marketplace will become just like other online app stores -- an enormous selection of intelligent APIs. And we all know as data scientists that selection is important," Sirosh said. "Imagine a million APIs appearing in a marketplace and a virtuous cycle like this that us data scientists can tap into."
Also enabling real-time predictive analytics is support for Apache Storm clusters, announced today. Though the capability is in preview, Kelly said Microsoft is adhering to its SLAs for Apache Storm, which enables complex event processing and stream analytics, providing much faster responses to queries.
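Storm itself is a JVM-based system, but the kind of windowed computation it enables over event streams can be illustrated with a minimal sketch (plain Python, not Storm code): a counter that tracks event frequencies over a sliding window of the most recent events, the sort of building block behind real-time queries over a stream.

```python
# Storm is a JVM-based system; this plain-Python sketch only illustrates
# the kind of windowed computation stream processors perform: counting
# event frequencies over a sliding window of the most recent events.
from collections import Counter, deque

class SlidingWindowCounter:
    def __init__(self, window_size):
        self.window_size = window_size  # how many recent events to keep
        self.events = deque()
        self.counts = Counter()

    def add(self, key):
        """Record an event; evict the oldest once the window is full."""
        self.events.append(key)
        self.counts[key] += 1
        if len(self.events) > self.window_size:
            oldest = self.events.popleft()
            self.counts[oldest] -= 1
            if self.counts[oldest] == 0:
                del self.counts[oldest]

    def top(self, n=1):
        """Most frequent keys within the current window."""
        return self.counts.most_common(n)
```

Because only incremental state is kept per window, queries like "top keys right now" answer in constant time rather than rescanning stored data, which is the responsiveness gain stream processing offers over batch queries.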
Microsoft will also support the forthcoming Hortonworks Data Platform, which has automatic backup to Azure Blob storage, Kelly said. "Any Hortonworks customer can back up all their data to an Azure Blob in a real low cost way of storing their data, and similarly once that data is in Azure, it makes it real easy for them to apply some of these machine learning models to it for analysis with Power BI [or other tools]."
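The backup path Kelly describes relies on Hadoop's Azure Blob storage connector, which addresses blobs through wasb:// URIs. A sketch of the relevant core-site.xml entry (the storage account name is a placeholder, and the key would come from the Azure portal):

```xml
<!-- core-site.xml sketch: "mystorageaccount" is a placeholder -->
<property>
  <name>fs.azure.account.key.mystorageaccount.blob.core.windows.net</name>
  <value>STORAGE_ACCOUNT_ACCESS_KEY</value>
</property>
```

With that in place, data can be referenced by paths of the form wasb://mycontainer@mystorageaccount.blob.core.windows.net/path, which is what lets the same blobs later be picked up in Azure for machine learning or Power BI analysis.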
Hortonworks is also bringing HDP to Azure Virtual Machines as an Azure certified partner. This will bring Azure HDInsight to customers who want more control over it in an infrastructure-as-a-service model, Kelly said. Azure HDInsight is currently a platform as a service that is managed by Microsoft.
Posted by Jeffrey Schwartz on 10/17/2014 at 9:54 AM
In perhaps its greatest embrace of the open source Linux community to date, Microsoft is teaming up with Docker to develop Docker containers that will run on the next version of Windows Server, the two companies announced today. Currently Docker containers, which are designed to enable application portability using code developed as micro-services, can only run on Linux servers.
While Microsoft has stepped up its efforts to work with Linux and other open source software over the years, this latest surprise move marks a key initiative to help make containers portable among respective Windows Server and Linux server environments and cloud infrastructures. It also underscores a willingness to extend its ties with the open source community as a key contributor to make that happen.
In addition to making applications portable, proponents say containers could someday supersede the traditional virtual machine. Thanks to their lightweight composition, containers can provide the speed and scale needed for next generation applications and infrastructure components. Those next generation applications include those that make use of Big Data and processing complex computations.
Containers have long existed, particularly in the Linux community and from third parties such as Parallels, but each had its own implementation. Docker has taken the open source and computing world by storm over the past year since the company, launched less than two years ago, released a container format that has become a de facto standard for how applications can extend from one platform to another running as micro-services.
Many companies have jumped on the Docker bandwagon in recent months including Amazon, Google, IBM, Red Hat and VMware, among others. Microsoft in May said it would enable Docker containers to run in its Azure infrastructure as a service cloud. The collaboration between Docker and Microsoft was a closely held secret.
Microsoft Azure CTO Mark Russinovich had talked about the company's work with Docker to support its containers in Azure during a panel at the Interop show in New York Sept. 30 and later in an interview. Russinovich alluded to Microsoft's own effort to develop Windows containers, called Drawbridge. Describing it as an internal effort, Russinovich revealed the container technology is in use within the company and is now available for customers that run their own machine learning-based code in the Azure service.
"Obviously spinning up a VM for [machine learning] is not acceptable in terms of the experience," Russinovich said during the panel discussion. "We are figuring out how to make that kind of technology available publicly on Windows."
At the time, Russinovich was tight-lipped about Microsoft's work with Docker and the two companies' stealth effort. Russinovich emphasized Microsoft's support for Linux containers on Azure and, when pressed about Drawbridge, described it as a superior container technology, arguing its containers are more secure for deploying micro-services.
As we now know, Microsoft has been working quietly behind the scenes with Docker to enable the Docker Engine, originally architected to run only on Linux servers, to operate with the next version of Windows Server as well.
Microsoft is working to enable Docker Engine images for Windows Server that will be available in Docker Hub, an open source repository housing more than 45,000 Docker applications via shared developer communities. As a result, Docker images will be available for both Linux and Windows Server.
Furthermore, the Docker Hub will run in the Microsoft Azure public cloud, accessible via the Azure Management Portal and Azure Gallery. This will allow cloud developers, including Microsoft's ISV partners, to access the images. Microsoft also said it will support the Docker orchestration APIs, which will let developers and administrators manage applications across both Windows and Linux platforms using common tooling. This will provide portability across different infrastructures, such as from on-premises servers to the cloud. It bears noting that individual containers remain tied to the operating system they are derived from.
The Docker Engine for Windows Server will be part of the Docker open source project where Microsoft said it intends to be an active participant. The result is that developers will now be able to use preconfigured Docker containers in both Linux and Windows environments.
Microsoft is not saying when it will appear, noting that's in the hands of the open source community, according to Ross Gardler, senior technology evangelist for Microsoft Open Technologies. To what extent Microsoft will share the underlying Windows code is not clear. Nor would he say to what extent, if any, work from the Drawbridge project will appear in this effort, other than to say the company has gained deep knowledge from it.
"This announcement is about a partnership, the bringing of Docker to Windows Server to ensure we have interoperability between Docker containers," Gardler said. "The underlying implementation of that is not overly important. What is important is the fact that we'll have compatibility in the APIs between the Docker containers on Linux, and the Docker container on Windows."
David Messina, vice president of marketing at Docker, said the collaboration and integration between the two companies on the Docker Hub and the Azure Gallery will lead to the merging of the best application content from both communities.
"If I'm a developer and I'm trying to build a differentiated application, what I want to focus on is a core service that's going to be unique to my enterprise or my organization and I want to pull in other content that's already there to be components for the application," Messina said. "So you're going to get faster innovation and the ability to focus on core differentiating capabilities and then leveraging investments from everybody else."
In addition to leading to faster development cycles, it appears containers will place less focus on the operating system over time. "It's less about dependencies on the operating system and more about being able to choose the technologies that are most appropriate and execute those on the platform," Microsoft's Gardler said.
Microsoft Azure Corporate VP Jason Zander described the company's reasoning and plan to support Docker in Windows Server and Azure in a blog post. Zander explained how the containers will work:
Windows Server containers provide applications an isolated, portable and resource controlled operating environment. This isolation enables containerized applications to run without risk of dependencies and environmental configuration affecting the application. By sharing the same kernel and other key system components, containers exhibit rapid startup times and reduced resource overhead. Rapid startup helps in development and testing scenarios and continuous integration environments, while the reduced resource overhead makes them ideal for service-oriented architectures.
The Windows Server container infrastructure allows for sharing, publishing and shipping of containers to anywhere the next wave of Windows Server is running. With this new technology millions of Windows developers familiar with technologies such as .NET, ASP.NET, PowerShell, and more will be able to leverage container technology. No longer will developers have to choose between the advantages of containers and using Windows Server technologies.
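Microsoft hasn't published a format for Windows Server container definitions, so it's only possible to speculate how the Dockerfile conventions familiar to Linux developers might carry over. The sketch below is purely illustrative: the base image name, the PowerShell-based build step and the feature name are all assumptions, not anything Microsoft has confirmed.

```dockerfile
# Purely speculative sketch of a Windows Server container definition.
# The base image name, RUN syntax and feature names are assumptions
# carried over from how Dockerfiles work on Linux today.

# Assumed Windows Server base image (hypothetical name)
FROM windowsservercore

# Install IIS at build time via PowerShell, mirroring how Linux
# Dockerfiles invoke package managers
RUN powershell -Command "Install-WindowsFeature Web-Server"

# Copy an ASP.NET site into the image
COPY site/ C:/inetpub/wwwroot/

# Document the port the containerized site listens on
EXPOSE 80
```

If Windows containers do adopt this model, the appeal Zander describes follows directly: the same build-once, ship-anywhere workflow, but with .NET, ASP.NET and PowerShell as the building blocks.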
IDC Analyst Al Hilwa said in an e-mail that Microsoft has taken a significant step toward advancing container technology. "This is a big step for both Microsoft and the Docker technology," he said. "Some of the things I look forward to figuring out is how Docker will perform on Windows and how easy it will be to run or convert Linux Docker apps on Windows."
Posted by Jeffrey Schwartz on 10/15/2014 at 2:10 PM
Satya Nadella's comments suggesting that women shouldn't ask for pay raises or promotions have prompted outrage on social media. But to his credit, he swiftly apologized, saying he didn't mean what he said.
To be sure, Nadella's answer to the question of what advice he'd give women uncomfortable asking for a raise was, indeed, insulting to women. Nadella suggested "karma" is the best way for women to receive a salary increase or career advancement, a comment the CEO couldn't have made at a worse place: the Grace Hopper Celebration of Women in Computing in Phoenix, where he was interviewed onstage by Maria Klawe, president of Harvey Mudd College in Claremont, Calif. Even more unfortunate, Klawe is a Microsoft board member, one of the people Nadella reports to.
Here's exactly what Nadella said:
"It's not really just asking for the raise but knowing and having faith that the system will give you the right raises as you go along. And I think it might be one of the additional super powers that, quite frankly, women who don't ask for a raise have. Because that's good karma, it will come back, because somebody's going to know that 'that's the kind of person that I want to trust. That's the kind of person that I want to really give more responsibility to,' and in the long-term efficiency, things catch up."
Accentuating his poor choice of words was Klawe's immediate and firm challenge when she responded, "This is one of the very few things I disagree with you on," which was followed by rousing applause. But it also gave Klawe an opportunity to tell women how not to make the same mistake she has made in the past.
Klawe explained that she was among those who could easily advocate for someone who works for her but not for herself. Klawe related how she got stiffed on getting fair pay when she took a job as dean of Princeton University's engineering school because she didn't advocate for herself. Instead of finding out how much she was worth, when the university asked Klawe how much she wanted to be paid, she told her boss, who was a woman, "Just pay me what you think is right." Princeton paid her $50,000 less than the going scale for that position, Klawe said.
Now she's learned her lesson and offered the following advice: "Do your homework. Make sure you know what a reasonable salary is if you're being offered a job. Do not be as stupid as I was. Second, roleplay. Sit down with somebody you really trust and practice asking for the salary you deserve."
Certainly, the lack of equal pay and advancement for women has been a problem as long as I can remember. On occasion, high-profile lawsuits, often in the financial services industry, will bring it to the forefront and politicians will address it in their campaign speeches. The IT industry, perhaps even more so than the financial services industry, is dominated by men.
Nadella's apology appeared heartfelt. "I answered that question completely wrong," he said in an e-mail to employees almost immediately after making the remarks. "Without a doubt I wholeheartedly support programs at Microsoft and in the industry that bring more women into technology and close the pay gap. I believe men and women should get equal pay for equal work. And when it comes to career advice on getting a raise when you think it's deserved, Maria's advice was the right advice. If you think you deserve a raise, you should just ask. I said I was looking forward to the Grace Hopper Conference to learn, and I certainly learned a valuable lesson. I look forward to speaking with you at our monthly Q&A next week and am happy to answer any question you have."
Critics may believe Nadella's apology was nothing more than damage control. It's indeed the first major gaffe committed by the new CEO, but I'd take him at his word. Nadella, who has two daughters of his own, has encouraged employees to ask if they feel they deserve a raise. If Nadella's ill-chosen comments do nothing else, they'll elevate the discussion within Microsoft and throughout the IT industry and business world at large.
Of course, actions speak louder than words, and that's where the challenge remains.
Posted by Jeffrey Schwartz on 10/10/2014 at 3:00 PM
In its bid to replace the traditional Windows and client environment with virtual desktops, VMware will release major new upgrades of its VMware Workstation and VMware Player desktop virtualization offerings in December. Both will offer support for the latest software and hardware architectures and cloud services.
The new VMware Workstation 11, the company's complete virtual desktop offering and its flagship product, launched 15 years ago, is widely used by IT administrators, developers and QA teams. VMware Workstation 11 will support the new Windows 10 Technical Preview for enterprise and commercial IT testers and developers who want to put Microsoft's latest PC operating system through its paces in a virtual desktop environment.
Built with nested virtualization, VMware Workstation can run other hypervisors inside a VM, including Microsoft's Hyper-V and VMware's own vSphere and ESXi. In addition to running the new Windows 10 Technical Preview, VMware Workstation 11 will add support for other operating systems including Windows Server 2012 R2, Ubuntu 14.10, RHEL 7, CentOS 7, Fedora 20, Debian 7.6 and more than 200 others, the company said.
Also new in VMware Workstation 11 is support for the most current 64-bit x86 processors including Intel's Haswell (released late last year). VMware claims that based on its own testing, using Haswell's new microprocessor architecture with VMware Workstation 11 will offer up to a 45 percent performance improvement for functions such as encryption and multimedia. It will let IT pros and developers build VMs with up to 16 vCPUs, 8TB virtual disks and up to 64GB of memory. It will also connect to vSphere and the vCloud Air public cloud.
Aimed at more mainstream users is the new VMware Player 7. Since it's targeted at everyday users rather than just IT pros and administrators, it has fewer bells and whistles, but it gains support for the current Windows 8.1 operating system, as well as offering continued support for Windows XP and Windows 7 in desktop virtual environments. "Our goal is to have zero-base support," said William Myrhang, senior product marketing manager at VMware.
VMware Player 7 adds support for the latest crop of PCs and tablets and will be able to run restricted VMs, which, as the name implies, are secure clients that are encrypted, password restricted and can shut off USB access. VMware said the restricted VMs, which can be built with VMware Workstation 11 or VMware Fusion 7 Pro, run in isolation between host and guest operating systems and can have time limits built in.
Posted by Jeffrey Schwartz on 10/08/2014 at 2:25 PM
Veeam today said it will offer a free Windows endpoint backup and recovery client. The move is a departure from its history of providing replication, backup and recovery software for virtual server environments. However, company officials said the move is not a change in focus, which will remain the protection of server virtual machines, but rather a recognition that most organizations are not entirely virtual.
The new Veeam Endpoint Backup software will run on Windows 7 and later OSes (though not Windows RT) and will back up to an internal or external disk (such as a USB drive), a flash drive, or a network-attached storage (NAS) share within the Veeam environment. The company will issue a beta in the coming weeks and the product is due to be officially released sometime next year. The surprise announcement came on the closing day of the company's inaugural customer and partner conference, VeeamON, held in Las Vegas.
Enterprise Strategy Group Analyst Jason Buffington, who follows the data protection market and has conducted research for Veeam, said offering endpoint client software was unexpected. "At first, I was a little surprised because it didn't seem congruent with that VM-centric approach to things," Buffington said. "But that's another great example of them adding a fringe utility. In this first release, while it's an endpoint solution, primarily, there's no reason you technically couldn't run it on a low-end Windows Server. I'm reasonably confident they are not going to go hog wild into the endpoint protection business. This is just their way to kind of test the code, test customers' willingness for it, as a way to vet that physical feature such that they have even a stronger stranglehold on that midsize org that's backing up everything except a few stragglers."
At a media briefing during the VeeamON conference, company officials emphasized that the company remains focused on its core business of protecting server VMs as it plots its growth toward supporting various cloud environments as backup and disaster recovery targets. Doug Hazelman, Veeam vice president of product strategy, indicated that the client could be used for various physical servers as well. Hazelman said that the company is largely looking to see how customers use the software, which can perform file-level recoveries. Furthermore, he noted that the endpoint software doesn't require any of the company's other software and vowed it would remain free as a standalone offering.
"We are not targeting this at an enterprise with 50,000 endpoints," Hazelman said. "We want to get it in the hands of the IT pros and typical Veeam customers and see how we can expand this product and see how we can grow it."
Indeed, the VeeamON event was largely held to launch a major new release of the company's flagship suite, to be called the Data Availability Suite v8. Many say Veeam is the fastest-growing provider in its market since the company's launch in 2006. In his opening keynote address in the partner track, CEO Ratmir Timashev said that Veeam is on pace to post $500 million in booked revenue (non-GAAP) and is aiming to double that to $1 billion by 2018.
In an interview following his keynote, Timashev said the company doesn't have near-term plans for an initial public offering (IPO) and insisted the company is not looking to be acquired. "We're not looking to sell the company," he said. "We believe we can grow. We have proven capabilities to find the next hot market and develop a brilliant product. And when you have this capability, you can continue growing, stay profitable and you don't need to sell."
Timashev added that Veeam can reach those fast-growth goals without deviating from its core mission of protecting virtual datacenters. Extending to a new network of cloud providers will be a key enabler, according to Timashev. The new Data Availability Suite v8, set for release next month (he didn't give an exact date), will incorporate a new interface called Cloud Connect that will let customers choose from a growing network of partners who are building cloud-based and hosted backup and disaster recovery services.
The new v8 suite offers a bevy of other features including what it calls "Explorers" that can now protect Microsoft's Active Directory and SQL Server and provides extended support for Exchange Server and SharePoint. Also added is extended WAN acceleration introduced in the last release to cover replication and a feature called Backup IO, which adds intelligent load balancing.
Posted by Jeffrey Schwartz on 10/08/2014 at 11:30 AM
Hewlett-Packard for decades has resisted calls by Wall Street to break itself into multiple companies, but today it heeded the call. The company said it would split itself into two separate publicly traded businesses next year. The two companies will leverage the existing storied brand, calling the PC and printing business HP Inc. and the infrastructure and cloud business HP Enterprise.
Once the split is complete, Meg Whitman will lead the new HP Enterprise as CEO and serve only as chairman of HP Inc. The move comes less than a week after eBay said it would spin off its PayPal business unit into a separately traded company. Ironically, Whitman, HP's current CEO, was the longtime CEO of eBay during the peak of the dotcom bubble, and eBay too was recently under pressure by activist investors who believed both companies would fare better apart to spin off PayPal.
HP has long resisted calls by Wall Street to split itself into two or more companies. The pressure intensified following the early 2005 departure of CEO Carly Fiorina, whose disputed move to acquire Compaq remains controversial to this day. The company had strongly considered selling off or divesting its PC and printing businesses under previous CEO Leo Apotheker. When he was abruptly dismissed after just 11 months and Whitman took over, she continued the review but ultimately decided that a "One HP" would make it a stronger company.
In deciding not to divest back in 2011, Whitman argued that remaining together gave HP more scale and would put it in a stronger position to compete. For instance, she argued HP was the largest buyer of CPUs, memory, drives and other components.
Now she's arguing that the market has changed profoundly. "We think this is the best alternative," she told CNBC's David Faber in an interview this morning. "The market has changed dramatically in terms of speed. We are in a position to position these two companies for growth."
Even though the question of HP divesting its businesses has come up over the years, today's news was unexpected and comes after the company was rumored to be eyeing an acquisition of storage giant EMC and, earlier, cloud provider Rackspace. It's not clear how serious those talks, if there were any, were.
HP's decision to become smaller rather than larger reflects growing pressure on large companies to become more competitive against nimbler rivals. Many of the large IT giants have faced similar pressure. Wall Street has been pushing EMC to split itself from VMware; IBM last week completed the sale of its industry-standard server business to Lenovo; and Dell, led by its founder and CEO Michael Dell, has become a private company.
Microsoft has also faced pressure to split itself up over the years, dating back to the U.S. government's antitrust case. Investors have continued to push Microsoft to consider splitting off or selling its gaming and potentially its devices businesses since former CEO Steve Ballmer announced he was stepping down last year. The company's now-controversial move to acquire Nokia's handset business for $7.2 billion and the selection of insider Satya Nadella as its new CEO have made that appear less likely. Nadella has said that Microsoft has no plans to divest any of its businesses. But HP's move shows how things can change.
Despite its own "One Microsoft" model, analysts will surely step up the pressure for Microsoft to consider its options. Yet Microsoft may have a better argument that it should keep its businesses intact, with the exception of perhaps Nokia if it becomes a drag on the company.
But as Whitman pointed out, "before a few months ago, we weren't positioned to do this. Now the time is right." And that's why never-say-never is the operative term in the IT industry.
Posted by Jeffrey Schwartz on 10/06/2014 at 11:42 AM
Microsoft may have succeeded in throwing a curve ball at the world by not naming the next version of its operating system Windows 9. But as William Shakespeare famously wrote in Romeo and Juliet, "A rose by any other name would smell as sweet." In other words, if Windows 10 is a stinker, it won't matter what Microsoft calls it.
In his First Look at the preview, Brien Posey wondered if trying to come up to speed with Apple's OS X had anything to do with the choice of names, a theory that quickly occurred to many others wondering what Microsoft is up to (the company apparently didn't say why it chose Windows 10). Perhaps Microsoft is trying to appeal to the many Windows XP loyalists?
The name notwithstanding, I too downloaded the Windows 10 Preview, a process that was relatively simple. Posey's review encapsulates Microsoft's progress in unifying the modern interface with the desktop and gives his thoughts on where Microsoft needs to improve before bringing Windows 10 to market. One thing he didn't touch upon is the status of the Charms feature. Introduced in Windows 8, Charms were intended to provide shortcuts for managing your device.
In the preview, the Charms are no longer accessible with a mouse, only with touch. If you got used to using the Charms with a mouse, you'll have to readjust to using the Start button again, though those who use the Charms with their fingers won't be affected. Would you like to see Microsoft make the Charms available when using a mouse? What would be the downside?
Meanwhile, have you downloaded the Windows 10 Preview? Keep in mind, this is just the first preview and Microsoft is looking for user feedback to decide which features make the final cut as it refines Windows 10. We'd love to hear your thoughts on where you'd like Microsoft to refine Windows 10 so that it doesn't become a stinker.
Posted by Jeffrey Schwartz on 10/03/2014 at 12:35 PM
As the capabilities of virtual machines reach their outer limits in the quest to build cloud-based software-defined datacenters, containers are quickly emerging as their potential successor. Though containers have long existed, notably in Linux, the rise of the Docker open source container has created a standard for building portable applications in the form of micro-services. As they become more mature, containers promise portability, automation, orchestration and scalability of applications across clouds and virtual machines.
Since Docker was released as an open source container for Linux, just about every major platform company, including IBM, Google, Red Hat, VMware and even Microsoft, has announced support for it in its operating systems, virtual machines or cloud platforms. Microsoft in May said it would support Linux-based Docker containers in the infrastructure-as-a-service (IaaS) component of its Azure cloud service. Docker is not available in the Microsoft platform as a service (PaaS) because the PaaS doesn't yet support Linux, though it appears only a matter of time before that happens.
"We're thinking about it," said Mark Russinovich, who Microsoft last month officially named CTO of its Azure cloud. "We hear customers want Linux on PaaS on Azure."
Russinovich confirmed that Microsoft is looking to commercialize its own container technology, code-named "Drawbridge," a library OS effort kicked off in 2008 by Microsoft Research Partner Manager Galen Hunt, who in 2011 detailed a working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In the desktop prototype, Microsoft said the securely isolated library operating system instances worked via the reuse of networking protocols. In a keynote address at the August TechMentor conference (which, like Redmond magazine, is produced by 1105 Media) on the Microsoft campus, Redmond magazine columnist Don Jones told attendees about the effort and questioned its future.
During a panel discussion at the Interop conference in New York yesterday, Russinovich acknowledged that Drawbridge is alive and well. While he couldn't speak to plans for the Windows client, he also stopped short of saying Microsoft plans to include it in Windows Server and Hyper-V. But he left little doubt that it's in the pipeline for Windows Server and Azure. Russinovich said Microsoft has already used the Drawbridge container technology in its new Azure-based machine learning technology.
"Obviously spinning up a VM for them is not acceptable in terms of the experience," Russinovich said. "So we built with the help of Microsoft Research our own secure container technology, called Drawbridge. That's what we used internally. We are figuring out how to make that kind of technology available publicly on Windows." Russinovich wouldn't say whether it will be discussed at the TechEd conference in Barcelona later this month.
Sam Ramji, who left his role as leader of Microsoft's emerging open source and Linux strategy five years ago, heard about Drawbridge for the first time in yesterday's session. In an interview he argued that if Windows Server is going to remain competitive with Linux, it needs to have its own containers. "It's a must-have," said Ramji, who is now VP of strategy at Apigee, a provider of cloud-based APIs. "If they don't have a container in the next 12 months, I think they will probably lose market share."
Despite Microsoft's caginess on its commercial plans for Drawbridge and containers, reading between the lines it appears they're a priority for the Azure team. While talking up Microsoft's support for Docker containers for Linux, Russinovich seemed to position Drawbridge as a superior container technology, arguing its containers are more secure for deploying micro-services.
"In a multi-tenant environment you're letting untrusted code from who knows where run on a platform and you need a security boundary around that," Russinovich said. "Most cloud platforms use the virtual machines as a security boundary. With a smaller, letter-grade secure container, we can make the deployment of that much more efficient," Russinovich said. "That's where Drawbridge comes into play. "
Ramji agreed that the ability to provide secure micro-services is a key differentiator between the open source Docker and Drawbridge. "It's going to make bigger promises for security, especially for third-party untrusted code," Ramji said.
Asked if cloud platforms like the open source OpenShift PaaS, led by Red Hat, can make containers more secure, Krishnan Subramanian argued that's not their role. "They are not there to make containers more secure. Their role is for the orchestration side of things," Subramanian said. "Security comes with the underlying operating system that the container uses. If they're going to use one of those operating systems in the industry that are not enterprise ready, probably they're not secure."
Russinovich said customers do want to see Windows-based containers. Is that the case? How do you see them playing in your infrastructure and how imperative is it that they come sooner rather than later?
Posted by Jeffrey Schwartz on 10/01/2014 at 2:25 PM
Microsoft Research today opened its new online Prediction Lab in a move it said aims to reinvent the way polls and surveys are conducted. The new lab, open to anyone in the format of a game, seeks to provide more accurate predictions than current surveys can deliver.
Led by David Rothschild, an economist at Microsoft Research and also a fellow at the Applied Statistics Center at Columbia University, the Prediction Lab boasts it already had a credible track record prior to its launch. In one example released today, the lab predicted an 84 percent chance that Scottish voters would reject independence in the referendum held to decide whether Scotland should secede from the United Kingdom.
The predictions, published in a blog post called "A Data-Driven Crystal Ball," also included the winners of all 15 World Cup knockout games this year. And in the 2012 presidential election, the lab got the Obama-versus-Romney results right in 50 of 51 jurisdictions (the states plus Washington, D.C.). The new interactive platform, released to the public, aims to gather more data and sentiment from the general population.
"We're building an infrastructure that's incredibly scalable, so we can be answering questions along a massive continuum," Rothschild said in the blog post, where he described the Prediction Lab as "a great laboratory for researchers [and] a very socialized experience" for those who participate.
"By really reinventing survey research, we feel that we can open it up to a whole new realm of questions that, previously, people used to say you can only use a model for," Rothschild added. "From whom you survey to the questions you ask to the aggregation method that you utilize to the incentive structure, we see places to innovate. We're trying to be extremely disruptive."
Rothschild also explained why traditional polling technology is outdated and why new methods like the Prediction Lab need to be researched in the era of big data. "First, I firmly believe the standard polling will reach a point where the response rate and the coverage is so low that something bad will happen. Then, the standard polling technology will be completely destroyed, so it is prudent to invest in alternative methods. Second, even if nothing ever happened to standard polling, nonprobability polling data will unlock market intelligence for us that no standard polling could ever provide. Ultimately, we will be able to gather data so quickly that the idea of a decision-maker waiting a few weeks for a poll will seem crazy."
Microsoft is hoping to keep participants engaged with its game-like polling technique, where participants can win or lose points based on making an accurate prediction (if you're wrong, you lose points). This week's "challenge" looks to predict whether President Obama will name a new attorney general before Oct. 5. The second question asks if the number of U.S. states recognizing gay marriages will change next week and the final poll asks if there will be American active combat soldiers in Syria by Oct. 5.
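Microsoft hasn't disclosed how the Prediction Lab actually computes points, but a win-or-lose-points mechanic like the one described is commonly implemented with a proper scoring rule, which rewards honest probability estimates and penalizes confident wrong ones. Here is a minimal sketch in Python assuming a logarithmic scoring rule; the function name, scale constant and zero-at-50/50 baseline are illustrative choices, not Microsoft's design:

```python
import math

def log_score(prob_yes: float, outcome: bool, scale: float = 100.0) -> float:
    """Points for a probability forecast under a logarithmic scoring rule.

    prob_yes: the participant's stated probability that the event happens.
    outcome:  True if the event actually happened.
    Scored relative to a 50/50 guess, so an uninformative forecast earns
    zero points, a confident correct forecast gains points, and a
    confident wrong forecast loses even more (log of a small probability
    is very negative).
    """
    p = prob_yes if outcome else 1.0 - prob_yes
    return scale * (math.log(p) - math.log(0.5))

# An 84 percent forecast gains points if the event happens...
print(round(log_score(0.84, True), 1))
# ...and loses a larger number of points if it doesn't.
print(round(log_score(0.84, False), 1))
```

The asymmetry is the point of the design: because losses from overconfidence outweigh gains, a participant's best strategy is to report what they genuinely believe, which is exactly the behavior a prediction game wants to elicit.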
Whether the Microsoft Prediction Lab will gain the status of more popular surveys such as the Gallup polls remains to be seen. But the work in Microsoft Research shows an interesting use of applied quantitative research. Though Microsoft didn't outline plans to extend the Prediction Lab, perhaps some of its technology will have implications for the company's offerings such as Cortana, Bing and even Delve, the new Office Graph technology formerly code-named "Oslo" for SharePoint and Office 365. Now in preview, Delve is built on Microsoft's FAST enterprise search technology and is designed to work across Office 365 app silos.
Posted by Jeffrey Schwartz on 09/29/2014 at 11:54 AM
Rackspace, which over the past few years tried to transform itself into the leading OpenStack cloud provider, is shifting gears. The large San Antonio-based service provider last week began emphasizing a portfolio of dedicated managed services that let enterprises run their systems and applications on their choice of virtual platforms -- Microsoft's Hyper-V, VMware's ESX or the open source OpenStack platform.
The new Hyper-V-based managed services include for the first time the complete System Center 2012 R2 stack, including Windows Server and Storage Spaces. The orchestrated services are available as dedicated single-tenant managed services for testing and are set for general availability in the United States in November, followed by the United Kingdom and Sydney in the first quarter of next year. Insiders had long pushed the company to offer Hyper-V-based managed services but, until recently, they were met with resistance from leadership that wanted to stick with VMware for hosted services.
Months after Rackspace CEO Lanham Napier stepped down in February and the company retained Morgan Stanley in May to seek a buyer, Rackspace last week said it didn't find one and decided to remain independent, naming President Taylor Rhodes as the new CEO. A day later, the company said it's no longer emphasizing OpenStack alone. Instead, it is promoting a best-of-breed approach with Hyper-V, VMware and OpenStack.
I caught up with CTO John Engates at a day-long analyst and customer conference in New York, where he explained the shift. "The vast majority of IT is still done today in customer-run datacenters by IT guys," Engates explained. "The small fraction of what's going to the cloud today is early stage applications and they're built by sometimes a sliver of the IT organization that's sort of on the bleeding edge. But there are a lot of applications that still move to the cloud as datacenters get old, as servers go through refresh cycles. But they won't necessarily be able to go to Amazon's flavor of cloud. They will go to VMware cloud, or a Rackspace-hosted VMware cloud, or a Microsoft-based cloud."
In a way, Rackspace is going back to its roots. The company was born as a hosting provider; a few years ago it decided to bet its growth on competing with large cloud providers, building out its entire cloud infrastructure on OpenStack to offer an alternative to Amazon Web Services, with the added benefit of its so-called "fanatical support." Rackspace co-developed OpenStack with NASA as an open source means of offering portable, Amazon-compatible cloud services. While it continued to offer other hosting services, including Microsoft-centric Exchange, Lync and SharePoint managed services, that business was a relatively small portion of the total, ran only on VMware and remained in the shadow of Rackspace's OpenStack push.
A report released by 451 Research shows OpenStack, though rapidly growing, accounts for $883 million of the $56 billion market for cloud and managed services. "Rackspace for a long time was seen part and parcel of the Amazon world with public cloud and they're clearly repositioning themselves around the fastest growing, most important part of the market, which is managed cloud and private cloud," said 451 Research Analyst and Senior VP Michelle Bailey. "They can do cloud with customer support, which is something you don't typically get with the larger public providers. They have guarantees around availability, and they'll sign a business agreement with customers, which is what you'll see from traditional hosting and service providers."
Like others, Bailey said there's growing demand for managed services based on Hyper-V-based single-tenant servers. "With the Microsoft relationship they're able to provide apps," she said. "So you're able to go up the stack. It's not just the infrastructure piece that you're getting with Microsoft but specifically Exchange, SharePoint and SQL Server. These are some of the most commonly used applications in the market and Microsoft has made it very good for their partners to be able to resell those services now. When you get into an app discussion with a customer, it's a completely different discussion."
Jeff DeVerter, general manager for the Microsoft private cloud practice at Rackspace, acknowledged it was a multi-year effort to get corporate buy-in for offering services on Hyper-V and ultimately the Microsoft Cloud OS stack. "I had to convince the senior leadership team at Rackspace that this was the right thing to do," said DeVerter. "It was easier to do this year than it was in previous years because we were still feeling our way from an OpenStack perspective. If you look at the whole stack that is Microsoft's Cloud OS, it really is a very similar thing to what the whole stack of OpenStack is. Rackspace has realized the world is not built on OpenStack because there really are traditional enterprise applications [Exchange, Lync and SharePoint] that don't fit there. They're not written for the OpenStack world."
DeVerter would know, having come to the company six years ago as a SharePoint architect and helped grow the SharePoint, Exchange and ultimately Lync business to $50 million in revenues. Aiding that growth was the Feb. 2012 acquisition of SharePoint 911, whose principals, Shane Young and Todd Klindt, helped make the case for moving those platforms from VMware to Hyper-V.
Posted by Jeffrey Schwartz on 09/26/2014 at 7:46 AM
Within moments of last week's news that Larry Ellison has stepped down as Oracle's CEO to become CTO, social media lit up. Reaction such as "whoa!" and "wow!" preceded every tweet or Facebook post. In reality, it seemed like a superficial change in titles.
For all intents and purposes, the new CEOs, Mark Hurd and Safra Catz, were already running the company, while Ellison had final say in technical strategy. Hence it's primarily business as usual with some new formalities in place. Could it be a precursor to some bombshell in the coming days and weeks? We'll see, but there's nothing obvious to suggest that.
It seems more likely Ellison will fade away from Oracle over time, rather than have a ceremonial departure like Bill Gates did when he left Microsoft in 2008. Don't be surprised if Ellison spends a lot more time on his yacht and on Lanai, the small Hawaiian island he bought in 2012 that he is seeking to make "the first economically viable, 100 percent green community," as reported by The New York Times Magazine this week.
For now, Hurd told The Times that "Larry's not going anywhere." In fact, Hurd hinted that despite focusing his wrath on SAP and Salesforce.com over the past decade, Ellison may revisit his old rivalry with Microsoft, where he and Scott McNealy, onetime CEO of Sun Microsystems (ironically now a part of Oracle), fought hard and tirelessly to end Microsoft's Windows PC dominance. They did so in lots of public speeches, lawsuits and a strong effort to displace traditional PCs with their network computers, which were ahead of their time. Ellison also did his part in helping spawn the Linux server movement by bringing new versions of the Oracle database to the open source platform first, and only much later to Windows Server.
While Oracle fiercely competed with Microsoft in the database market and the Java versus .NET battle, over the past decade, with little left to fight about, Ellison largely focused his ire on IBM, Salesforce and SAP. Nevertheless, Oracle's agreement with Microsoft last year to support native Java, the Oracle database and virtual machines on the Microsoft Azure public cloud was intriguing, given what an unlikely move it would have been at one time.
Now it appears Ellison, who years ago mocked cloud computing, has Azure envy, given that Azure has one of the more built-out PaaS portfolios. Hurd told The Times that in his keynote at next week's Oracle OpenWorld conference in San Francisco, Ellison will announce Oracle's own platform as a service (PaaS), aimed at competing with the Azure PaaS and SQL Server. Suggested (but not stated) was that Ellison will try to pitch it as optimized for Microsoft's .NET platform.
Oracle's current pact with Microsoft is primarily focused on Azure's infrastructure-as-a-service (IaaS) offerings, not PaaS. Whether Oracle offers a serious alternative to running its wares on IaaS remains to be seen. If indeed Oracle aims to compete with Microsoft on the PaaS front, it's likely to be an offering that gives an alternative cloud to existing Oracle customers who would never port their database and Java apps to Azure.
However Ellison positions this new offering, unless Oracle has covertly been building a few dozen datacenters globally that match the scale and capacity of Amazon, Azure, Google and Salesforce.com -- which would set off a social media firestorm -- it's more likely to look like a better-late-than-never service, well suited to and awaited by many of Oracle's customers.
Posted by Jeffrey Schwartz on 09/24/2014 at 11:14 AM
Apple today said it has sold 10 million of its new iPhones over the first three days since they arrived in stores and at customers' doorsteps Friday. This exceeds analysts' and the company's forecasts. In the words of CEO Tim Cook, sales of its new iPhone 6 and iPhone 6 Plus models have led to the "best launch ever, shattering all previous sell-through records by a large margin."
Indeed, analysts are noting that the figures are impressive, especially considering the new iPhones haven't yet shipped in China, where demand is large. Whether the initial results will bolster the iPhone and iOS, in a market that has ceded market share leadership to Android, remains to be seen. But the company's unwillingness to deliver a larger phone earlier clearly cost it share among those with pent-up demand for larger smartphones. Most Android phones and Windows Phones are larger than the previous 4-inch iPhone 5s, and the majority of devices these days are even larger than 4.5 inches.
The company didn't break out sales of the bigger iPhone 6 versus the even larger 5.5-inch iPhone 6 Plus, but some believe the latter may have had an edge even though it's in shorter supply. If the sales Apple reported are primarily from existing iPhone users, that will only stabilize its existing share, not extend it. However, as the market share for Windows Phone declines, demand for the iPhone will grow on the back of heavily covered features such as Apple Pay and the forthcoming iWatch. This won't help the market for Microsoft-based phones (and could bring some Android users back to Apple).
It doesn't appear the new Amazon Fire phones, technically Android-based devices, are gaining meaningful share. Meanwhile, BlackBerry is readying its first new phone since the release of the BlackBerry 10 last year. Despite minuscule market share, BlackBerry CEO John Chen told The Wall Street Journal that the 4.5-inch display on its new Passport will help make it an enterprise-grade device targeted at productivity. It will also boast a battery that can power the phone for 36 hours and a large antenna designed to provide better reception. In addition to enterprise users, the phone will be targeted at medical professionals.
With the growing move by some to so-called phablets, which the iPhone 6 Plus arguably is (some might say devices over 6 inches better fit that description), these larger devices are also expected to cut into sales of 7-inch tablets. In Apple's case, that includes the iPad mini. But given the iPad mini's price and the fact that not all models have built-in cellular connectivity, the iPhone 6 Plus could bolster Apple more than hurt it.
Despite Microsoft's efforts to talk up improvements to Windows Phone 8.1 and its emphasis on Cortana, it appears the noise from Apple and the coverage surrounding it is all but drowning it out. As the noise from Apple subsides in the coming weeks, Microsoft will need to step up the volume.
Posted by Jeffrey Schwartz on 09/22/2014 at 12:03 PM
Microsoft earlier this week said two more longtime board members are stepping down, and the company has already named their replacements, effective Oct. 1.
Among them are David Marquardt, a venture capitalist who was an early investor in Microsoft, and Dina Dublon, the onetime chief financial officer of J.P. Morgan. Dublon was on Microsoft's audit committee and chaired its compensation committee, The Wall Street Journal noted. The paper also raised an interesting question: Will losing the two board members now, and others over the past two years, result in a gap in "institutional knowledge"? Marquardt, with his Silicon Valley ties, played a key role in helping Microsoft "get off the ground and is a direct link to the company's earliest days."
The two new board members are Teri List-Stoll, chief financial officer of Kraft Foods, and Visa CEO Charles Scharf, who once ran J.P. Morgan Chase's retail banking and private investment operations. Of course, the moves come just weeks after former CEO Steve Ballmer stepped down from the board.
Nadella will be reporting to a board with six new members out of a total of 10 since 2012. Indeed that should please Wall Street investors who were pining for new blood during last year's search for Ballmer's replacement and didn't want to see an insider get the job.
But the real question is whether these new voices, teamed with Nadella, will strike the balance Microsoft needs to thrive in the most fiercely competitive market it has faced to date.
Posted by Jeffrey Schwartz on 09/18/2014 at 3:32 PM
IBM's New M5 Servers Include Editions for Hyper-V and SQL Server
In what could be its last major rollout of new x86 systems if the company's January deal to sell its commodity server business to Lenovo for $2.3 billion goes through, IBM launched the new System x M5 line. The new lineup of servers includes systems designed to operate the latest versions of Microsoft's Hyper-V and SQL Server.
IBM said the new M5 line offers improved performance and security. With a number of models for a variety of solution types, the new M5 includes various tower, rack, blade and integrated systems that target everything from small workloads to infrastructure for private clouds, big data and analytic applications. The two systems targeting Microsoft workloads include the new IBM System x Solution for Microsoft Fast Track DW for SQL Server 2014 and an upgraded IBM Flex System Solution for Microsoft Hyper-V.
The IBM System x Solution for Microsoft SQL Data Warehouse on X6 is designed for data warehouse workloads running the new SQL Server 2014, which shipped in April. IBM said it offers rapid response to data queries and enables scalability as workloads increase. Specifically, the new systems are powered by Intel Xeon E7-4800/8800 v2 processors. IBM said the new systems offer 100 percent faster database performance than the prior release, with three times the memory capacity and a third of the latency of PCIe-based flash. The systems can also place up to 12TB of memory-channel flash storage near the processor.
As a Microsoft Fast Track partner, IBM also added its IBM System x3850 X6 Solution for Microsoft Hyper-V. Targeting business-critical systems, the two-node configuration uses Microsoft Failover Clustering, aimed at eliminating any single point of failure, according to IBM's reference architecture. The Hyper-V role is installed on each clustered server, which hosts the virtual machines.
Unitrends Adds Reporting Tools To Monitor Capacity and Storage Inventory
Backup and recovery appliance supplier Unitrends has added new tools to track storage inventory and capacity, designed to help administrators more accurately gauge their system requirements in order to lower costs.
The set of free tools provides views of storage, file capacity and utilization. Unitrends said the tools deliver single point-in-time snapshots of storage and files distributed throughout organizations' datacenters, letting administrators calculate how much storage they need to make available for backups and prioritize which files get backed up based on the capacity available.
There are two separate tools. The Unitrends Backup Capacity Tool provides a snapshot of all files distributed throughout an organization, on both storage systems and servers, offering file-level views of data for planning backups and reports that outline file usage. The other, the Unitrends Storage Inventory Tool, provides a view of an organization's entire storage infrastructure, which the company says offers detailed inventories of the storage assets in use and where there are risks of exceeding available capacity.
These new tools follow July's release of BC/DR Link, a free online tool that helps organizations create global disaster recovery plans. It includes 1GB of centralized storage in the Unitrends cloud, where customers can store critical documents.
CloudLink Adds Microsoft BitLocker To Secure Workloads Running in Amazon
Security vendor CloudLink's SecureVM, which provides Microsoft BitLocker encryption, has added Amazon Web Services to its list of supported cloud platforms. The tool lets customers apply the native Windows encryption to their virtual machines -- both desktops and servers -- in the Amazon cloud. In addition to Amazon, it works in Microsoft Azure and VMware vCloud Air, among other public clouds.
Because virtual and cloud environments can't utilize the BitLocker encryption keys typically stored on TPM or USB hardware, SecureVM emulates that functionality by providing centralized management of the keys. It also lets customers encrypt their VMs outside of AWS.
Customers can start their VMs only in the intended environment: the tool is policy-based to ensure VMs launch only when authorized. It also provides an audit trail tracking when VMs are launched, along with information such as IP addresses, hosts and operating systems.
Data volumes assigned to an instance are encrypted, and customers can encrypt additional data volumes as well, according to the company. Enterprises maintain control of encryption key management, including the option to store keys in-house.
Posted by Jeffrey Schwartz on 09/18/2014 at 12:09 PM
As the IT industry looks at the future of the virtual machine, containers have jumped out as the next big thing, and every key player with an interest in the future of the datacenter is rallying around Silicon Valley startup Docker. That includes IBM, Google, Red Hat, VMware and even Microsoft. Whether it is cause or effect, big money is following Docker as well.
Today Sequoia Capital pumped $40 million into the company in a Series C round, bringing Docker's total funding to $66 million and its estimated valuation to $400 million. Early investors in Docker include Benchmark, Greylock Partners, Insight Ventures, Trinity Ventures and Yahoo cofounder Jerry Yang. Docker aims to move beyond the traditional virtual machine with its open source platform for building, shipping and running distributed applications.
As Docker puts it, the limitation of virtual machines is that they include not only the application, but the required binaries, libraries and an entire guest OS, which could weigh tens of gigabytes, compared with just a small number of megabytes for the actual app. By comparison, the Docker Engine container consists of just the application and its dependencies.
"It runs as an isolated process in user space on the host operating system, sharing the kernel with other containers," according to the company's description. "Thus, it enjoys the resource isolation and allocation benefits of VMs but is much more portable and efficient."
With the release of Docker 1.0 in June, Microsoft announced support for the Linux-based containers by updating Azure's command-line interface, allowing customers to build and deploy Docker-based containers in Azure and manage the virtual machines with the Docker client. As reported here by John Waters, Microsoft's Corey Sanders, manager of the Azure compute runtime, demonstrated the capability at DockerCon at the time.
On the Microsoft Open Technologies blog, Evangelist Ross Gardler outlined how to set up and use Docker on the Azure cloud service. According to Gardler, common use cases for Docker include:
- Automating the packaging and deployment of applications
- Creation of lightweight, private PaaS environments
- Automated testing and continuous integration/deployment
- Deploying and scaling web apps, databases and backend services
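To make the packaging use case concrete, a container image is typically described in a Dockerfile that layers just the app and its dependencies on a base image. The following is a minimal sketch, not drawn from any vendor's documentation; the app file and its Flask dependency are hypothetical stand-ins:

```dockerfile
# Sketch of a container image: only the app and its dependencies are layered
# on a base image; the host kernel is shared rather than duplicated in a guest OS.
FROM ubuntu:14.04                  # base image supplying binaries and libraries
RUN apt-get update && apt-get install -y python python-pip
RUN pip install flask              # hypothetical app dependency
COPY app.py /opt/app/app.py        # the application itself -- megabytes, not gigabytes
CMD ["python", "/opt/app/app.py"]
```

Building the image (`docker build -t myapp .`) and running it (`docker run -d myapp`) launches the app as an isolated process on the host OS, which is why the resulting footprint is so much smaller than a full VM.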
At VMworld last month, VMware talked up its support for Docker, saying it has teamed with the company, joined by its sister company Pivotal as well as Google, to enable their collective enterprise customers to run and manage apps in containers in public, private and hybrid cloud scenarios as well as on existing VMware infrastructure.
Containers are a technology to watch, whether you're a developer or an IT pro. The entire IT industry has embraced them (at least publicly) as the next generation of virtual infrastructure. And for now, Docker seems to be setting the agenda for this new technology.
Posted by Jeffrey Schwartz on 09/16/2014 at 11:57 AM
Apple today said preorders of its new iPhone 6 and its larger sibling, the 6 Plus, totaled a record 4 million in the first 24 hours, which doubled the preorders that the iPhone 5 received two years ago. But those numbers may suggest a rosier outlook than they actually portend.
Apple didn't release similar figures for last year's iPhone 5s -- for the most part an incremental upgrade over its then year-old predecessor -- or the lower-end 5c, which suggests customers are sitting on a number of aging iPhones. That includes earlier models now reaching the point of sluggishness, as newer iOS releases tax their slower processors, and which can only run on 3G networks.
Perhaps boosting demand was the fact that Apple for the first time offered an attractive promotion: a $200 trade-in for an older iPhone in working condition. For now, that offer is only good through this Friday, but it wouldn't be surprising if Apple extended it or reintroduced it through the holiday season.
Also, while I visited a local Verizon-owned-and-operated store on Saturday, I noticed a few customers interested in switching out their larger Android phones for the new 6 Plus. But most of the orders Apple reported must have come from online, because the vast majority of customers in the store were looking at Android-based phones. Those wanting a new iPhone, especially the larger one, will have to wait at least a month or more, although Apple always seems to play up those supply shortages early on when releasing new devices.
Many customers these days are less loyal to any one phone platform and are willing to switch if another has hardware specs that meet their needs -- perhaps the size, the camera or even the design. For example, I observed one woman who wanted to replace her damaged iPhone with an iPhone 6 Plus, but when the rep told her she'd have to wait a few weeks, she said she'd just take an Android phone since she needed a new one right away. I saw another man warning his son that if he switched to an Android phone, he'd lose all his iOS apps. The teenager was unfazed and also bought an Android phone.
Meanwhile, no one was looking at the Nokia Lumia 928s or Icons, and the store employees told me they sell few Windows Phones, which they estimated account for less than 5 percent of phone sales. Perhaps that will change following Microsoft's deal to acquire Mojang, purveyor of the popular Minecraft game? That's a discussion for my post about today's $2.5 billion acquisition announcement.
For those who did purchase a new iPhone, it appears there was more demand for the larger unit with the 5.5-inch display than for its junior counterpart, which measures 4.7 inches (still larger than the earlier 5/5s models), though Apple didn't provide a breakdown. If you're an iPhone user, do you plan to upgrade to a newer one or switch to another platform? What size do you find most appealing?
Posted by Jeffrey Schwartz on 09/15/2014 at 12:08 PM
Microsoft has pulled the trigger on a $2.5 billion deal to acquire Mojang, the developer of the popular Minecraft game. Rumors that a deal was in the works surfaced last week, though the price tag was initially said to be $2 billion. It looks like the founders of the 5-year-old startup squeezed another half-billion dollars out of Microsoft over the weekend.
Minecraft is the largest-selling game on Microsoft's popular Xbox platform. At first glance, this move could be a play to make it exclusive to Microsoft's gaming system and keep it out of the hands of the likes of Sony. It could even signal a bid to boost Microsoft's declining Windows Phone business, or even its Windows PC and tablet software. However, if you listen to Xbox head Phil Spencer and Microsoft's push to support all device platforms, that doesn't appear to be the plan.
"This is a game that has found its audience on touch devices, on phones, on iPads, on console and obviously its true home on PC. Whether you're playing on an Xbox, whether you're playing on a PlayStation, an Android or iOS device, our goal is to continue to evolve with and innovate with Minecraft across all those platforms," Spencer said in a prerecorded announcement on his blog.
If you consider CEO Satya Nadella's proclamation in July that Microsoft is the "productivity and platforms company," spending more than double on Mojang what it cost to acquire enterprise social media company Yammer may have you wondering how this fits into that focus. In the press release announcing the deal, Nadella stated: "Minecraft is more than a great game franchise -- it is an open world platform, driven by a vibrant community we care deeply about, and rich with new opportunities for that community and for Microsoft."
That could at least hint that the platform and community the founders of Mojang created could play a role in UI design that doesn't rely on Windows or even Xbox. Others have speculated that this is a move to make Microsoft's gaming business ripe for a spinoff or sale -- something investors want but a move Nadella has indicated he is not looking to make.
"The single biggest digital life category, measured in both time and money spent, in a mobile-first world is gaming," Nadella said in his lengthy July 10 memo announcing the company's focus moving forward. "We also benefit from many technologies flowing from our gaming efforts into our productivity efforts -- core graphics and NUI in Windows, speech recognition in Skype, camera technology in Kinect for Windows, Azure cloud enhancements for GPU simulation and many more. Bottom line, we will continue to innovate and grow our fan base with Xbox while also creating additive business value for Microsoft."
The deal also makes sense from another perspective: Minecraft is hugely popular, especially with younger people -- a demographic that is critical to the success of any productivity tool or platform. Clearly Nadella is telling the market, and especially critics, that it's not game over for Microsoft.
Posted by Jeffrey Schwartz on 09/15/2014 at 12:24 PM
Hewlett Packard's surprising news that it has agreed to acquire Eucalyptus potentially throws a monkey wrench into Microsoft's recently stepped-up push to enable users to migrate workloads from Amazon Web Services to the Microsoft Azure public cloud.
As I noted last week, Microsoft announced its new Migration Accelerator, which migrates workloads running on the Amazon Web Services cloud to Azure. It's the latest in a push to accelerate its public cloud service, which analysts have recently said is gaining ground.
By acquiring Eucalyptus, HP gains a tool sanctioned by Amazon for running AWS-compatible workloads in private clouds. Eucalyptus signed a compatibility pact with Amazon in March 2012 that enables its open source private cloud software to use Amazon APIs, including Amazon Machine Images (AMIs).
The deal, announced late yesterday, also means Eucalyptus CEO Marten Mickos will become general manager of HP's cloud business and report to HP Chairman and CEO Meg Whitman. Mickos, the onetime CEO of MySQL, has become a respected figure in infrastructure-as-a-service (IaaS) cloud circles. But the move certainly raised eyebrows.
Investor and consultant Ben Kepes in a Forbes blog post questioned whether Eucalyptus ran out of money and was forced into a fire sale or if the acquisition was a desperate move by HP to give a push to its cloud business. HP has had numerous management and strategy shifts with its cloud business.
"One needs only to look at the employee changes in the HP cloud division," Kepes wrote. "Its main executive, Biri Singh, left last year. Martin Fink has been running the business since then and now Mickos will take over -- he'll apparently be reporting directly to CEO Meg Whitman, but whether anything can be achieved given Whitman's broad range of issues to focus on is anyone's guess."
Ironically, on Redmond magazine's sister site Virtualization Review, well-known infrastructure analyst Dan Kusnetzky had just talked with Bill Hilf, senior vice president of product and service management for HP Cloud, who shared his observations on HP's new Helion cloud offering prior to the Eucalyptus deal announcement. Helion is built on OpenStack, though HP also has partnerships with Microsoft, VMware and CloudStack.
"The deal represents HP's recognition of the reality that much of what enterprise developers, teams and lines of business do revolves around Amazon Web Services," said Al Sadowski, a research director at 451 Research.
"HP clearly is trying to differentiate itself from Cisco, Dell and IBM by having its own AWS-compatible approach," Kusnetzky added. "I'm wondering what these players are going to do once Eucalyptus is an HP product. Some are likely to steer clients to OpenStack or Azure as a way to reduce HP's influence in their customer bases."
It also raises questions about the future of Helion, which, despite HP's partnerships, emphasizes OpenStack -- something Mickos has long and vocally supported. "We are seeing the OpenStack Project become one of the largest and fastest growing open source projects in the world today," Mickos was quoted as saying in an HP blog post.
Hmm. According to a research report released by 451 Research, OpenStack accounted for $883 million, or about 13 percent, of IaaS revenues. That figure is forecast to roughly double to $1.7 billion of a $10 billion IaaS market in 2016, or 17 percent. Not trivial, but not a huge chunk of the market either.
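Those percentages imply a quick back-of-the-envelope calculation; the figures are as cited, while the implied total market size is my arithmetic, not 451 Research's:

```python
# Back-of-the-envelope check of the 451 Research figures cited above.
openstack_2014 = 0.883   # $883 million, expressed in billions
share_2014 = 0.13        # ~13 percent of IaaS revenues

# Implied size of the total IaaS market today, roughly $6.8 billion.
iaas_2014 = openstack_2014 / share_2014

openstack_2016 = 1.7     # forecast OpenStack revenues, in billions
iaas_2016 = 10.0         # forecast IaaS market, in billions
share_2016 = openstack_2016 / iaas_2016   # 0.17, i.e. 17 percent

print(round(iaas_2014, 1))      # ~6.8
print(round(share_2016 * 100))  # 17
```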
HP clearly made this move to counter IBM's apparent headway with its cloud service, even though it appears the three front runners are Amazon, Microsoft and Google. In order to remain a player, HP needs to have compatibility with all three, as well as OpenStack, and acquiring Eucalyptus gives it a boost in offering Amazon compatibility, even if it comes at the expense of its server business, as noted by The New York Times.
In my chats with Mickos over the years, he hadn't ruled out Azure compatibility for Eucalyptus, but he admitted it hadn't gone very far the last time we spoke, over a year ago. Time will tell if this becomes a greater priority for both companies.
Regardless, 451 Research's Sadowski indicated that HP's move likely targets IBM more than Microsoft. "HP is hoping to capture more of the enterprise market as organizations make their way beyond Amazon (and Azure) and build out their private and hybrid cloud deployments," he said. "We would guess that in acquiring Eucalyptus, the company is seeking to replicate the story that IBM has built with its SoftLayer buy and simultaneous OpenStack support."
Posted by Jeffrey Schwartz on 09/12/2014 at 12:51 PM
Apple's much-anticipated launch event yesterday "sucked the air out of the room" during the opening keynote session at the annual Tableau customer conference taking place in Seattle, as my friend Ellis Booker remarked on Facebook.
Regardless of how you view Apple, the launch of its larger iPhone 6 and 6 Plus models, along with the new payment service and smartwatch, was hard to ignore. Positioned on par with the launches of the original Macintosh 30 years ago, the iPod, the iPhone and the iPad, the new Apple Watch and Apple Pay made for the largest launch event the company has staged in over four years, and arguably the largest number of new products showcased at a single Apple event. Despite all the hype, it remains to be seen whether it will be remembered as being as disruptive as Apple's prior launches. Yet given the company's history, I wouldn't quickly dismiss the potential in what Apple announced yesterday.
The Apple Watch was the icing on the cake for many fans looking for the company to show it can create new markets. But critics were quickly disappointed when they learned the new Apple Watch will only work if linked to a new iPhone (or last year's iPhone 5s). Many were surprised to learn that it will be some time before the circuitry providing all the communications functionality of a phone or tablet can be miniaturized for a watch.
This is not dissimilar to other smartwatches on the market. But this technology is not yet the smartwatch Dick Tracy would get excited about. No one knows if the Apple Watch (or any smartwatch) will be the next revolution in mobile computing or if it will be a flop. And even if it is the next big thing, it isn't a given that Apple, or any other player, will own the market.
Yet Apple does deserve some benefit of the doubt. Remember when the first iPod came out and it only worked with a Mac? Once Apple added Windows compatibility, everything changed. Likewise, when the first iPhone came out in 2007, it carried a carrier-subsidized price tag of $599. After few sold, Apple quickly reduced the price to $399, but it wasn't until the company offered $199 iPhones that the smartphone market took off. It's reasonable to expect there will be affordable smartwatches if the category does become a mass market. I'd think the sweet spot will be under $150, but who knows for sure.
While Apple was expected to introduce a watch, the variety of watches it will offer was surprising. Those pining for an Apple Watch will have to wait until early next year, and it's hard to see these flying off the shelves in their current iteration. I am less certain Apple will open up its watch to communicate with Android and Windows Phones, though that would open the market for its new devices just as adding Windows support to the iPod did.
Still, it's looking more likely that those with Android and Windows Phones will choose watches running on the same platforms. Indeed, there are a number of Android-based alternatives, such as Motorola Mobility's new Moto 360 for $249 and the LG G Watch, available now for $179 in the Google Play store. They, too, require connectivity to an Android phone.
For its part, Microsoft is planning its own smartwatch with a thin form factor resembling many of the fitness-oriented watches. It will have 11 sensors and will be cross-platform, according to a Tom's Hardware report.
As I noted, the Apple Watch was icing on an event intended to announce the pending availability of the two new iPhones. Arriving as expected, one measures 4.7 inches and the other is a 5.5-inch phablet that is almost the size of an iPad mini. They're poised to appeal to those who want something like Samsung's large Galaxy line of phones but aren't enamored of Android. The new iPhones will also put pressure on Microsoft to promote its large 6-inch Nokia Lumia 1520, which sports a 20-megapixel camera and a 1920x1080 display. Though Apple says its camera has only 8 megapixels, the company emphasized a new burst mode (60 fps) that can intelligently pull the best photo from a series of images. The new iPhone 6 Plus also has a 1920x1080 display and starts at $299 for a 16GB model (the standard iPhone 6 starts at $199).
Besides some other incremental improvements in the new iPhones, perhaps the most notable new capability is their support for the company's new Apple Pay service. It will allow individuals to make payments at supporting merchants using the phone's fingerprint-recognition interface, called Touch ID, which accesses an NFC chip on the phone that stores encrypted credit card and shipping information.
If Apple (and ultimately all smartphone suppliers) can convince customers and merchants that this is a more secure way of handling payments than traditional credit cards, we could see the dawn of a new era in how transactions are made. A number of high-profile breaches, including this week's acknowledgment by Home Depot that its payment systems were compromised, could over the long run hasten demand for this technology if it's proven to be more secure. Of course, Apple has to convince skeptics that last week's iCloud breach was an isolated incident.
Regardless of your platform preference, or whether you mix and match best-of-breed devices, we now have a better picture of Apple's next generation of wares. We can expect to hear what's in the pipeline for the iPad in the next month or two. Reports suggest a larger one is in the works.
Posted by Jeffrey Schwartz on 09/10/2014 at 12:07 PM
In critiquing the lack of availability of apps for Windows Phone last week, dozens of readers took issue with my complaints, effectively describing them as trivial and ill-informed. The good news is there are a lot of passionate Windows Phone users out there, but, alas, not enough -- at least for now -- to make it a strong enough No. 3 player against the iPhone and Android-based devices. Though the odds for it becoming a solid three-horse race appear to be fading, I really do hope that changes.
While I noted the increased number of apps for Windows Phone, many readers pushed back, saying I overstated how many apps are still missing. "I am not sure what all the griping is about," Sam wrote. "I personally think that this whole 'app thing' is a step backward from a decade ago when we were moving toward accessing information using just a browser. Now you buy dumb devices and apps are a must."
Charles Sullivan agreed. "I have a Windows Phone 7, which is now nearly 4 years old and I rarely think about apps," he said. "Apps, for the most part, are for people that cannot spell their own name, which is a lot of people." Regarding the lack of a Starbucks app, Sam added, "Seriously, if you need an app to buy coffee, your choice of a phone is the least of your problems." Several critics said the apps I cited were missing were indeed available. "Yeah, I took two seconds and searched on my Windows Phone and found most of the apps. Seems like zero research went into this," Darthjr said.
Actually, I did search the Windows Store for every app I listed, some on numerous occasions; I even called a few of the providers. If I overlooked some apps that are actually there, I apologize. But the truth remains that a vast number of apps available for the iPhone and Android aren't available for Windows Phone, and in many cases they aren't even in the pipeline. Just watch a commercial or look at an ad in print or online and you'll often be directed to download an app for either iOS or Android.
In some instances, there are similar apps offered by different developers. "Sorry, but your examples are poor and just continue to perpetuate the notion that if you have a Windows Phone, you are helpless," reader The WinPhan lamented. "Everything you listed as your 'daily go-to's' in app usage can be found in the Windows Phone Store, maybe not by same developers/name, but with same functionality and end result, with the exception of your local newspaper and cable. Please do better research."
Acknowledging the lead iOS and Android have, Curtis8 believes the fragmentation of Android will become a bigger issue and Windows Phone will incrementally gain share. "As Windows Phone gets better, with more markets and coverage, devs will start to support it more," Curtis8 noted, adding that iPhone will likely remain a major player for some time. "But I do see Windows Phone gaining traction regardless of the numbers people try to make up. One of our biggest fights is not the consumer, it is the carriers and sales people. Walk into any store and ask about Windows Phone: selection is crap, stock is crap and the sales people will try to convince you on an iPhone or Samsung device. Many people do not feel the missing apps as much if they stay on Windows Phone and find what we have."
That's absolutely true. As a Verizon customer, I've gone into a number of company-owned stores and the story remains the same. In the several Verizon stores I've visited, you're lucky to find Windows Phones even when you're looking for them; forget about it if you're not. Ask a sales rep about Windows Phone and they'll effectively say that no one's buying them. The selection of Windows Phones on AT&T is better thanks to a partnership with Nokia.
Some respondents argued that the question of apps will become a moot point as Cortana, the voice-activated feature in the latest version of Windows Phone 8.1, catches on, especially if it's also included in the anticipated Windows 9, code-named "Threshold," as is rumored.
Prabhujeet Singh, who has a Nokia Lumia 1020, said he loves its 41-megapixel camera (indeed a key differentiator) and its voice assistant: "Cortana is amazing. I tell it to set an alarm, set a reminder and when I am driving, I just speak the address. It understands my accent and I am not a native speaker of English. Do I miss apps? Nope. Not at all."
Tomorrow could very well mark an inflection point as Apple launches its next generation of mobile phones -- the larger iPhone 6 models -- and its widely anticipated iWatch. While that's a topic for another day, a report in The Wall Street Journal Saturday said the new Apple iWatch will work with HealthKit, a health app integrated into iOS 8. Several key health care institutions, including Memorial Sloan Kettering Cancer Center in New York, insurance giant Kaiser Permanente and the Mayo Clinic, are on board in some fashion.
If Apple has -- or can gain -- the same broad support from the health care industry that it had from the music industry when it introduced the iPod 13 years ago, despite a crowded market of MP3 players, it could spawn a new market. To date, others -- including Google with Google Health and Microsoft with its HealthVault community -- have seen only limited success. On the other hand, some argue the health care market is a much smaller fish to reel in than the huge music and entertainment market was over a decade ago. Nevertheless, Apple's success in the health and fitness market would be a boon to the company at the expense of others.
As long as Microsoft says it's committed to its mobile phone business, it would be foolish to write off the platform. Much of the future of its Nokia Lumia phone business could be tied to Microsoft's commitment to hardware (including its Surface business). As Mary Jo Foley noted in her Redmond magazine column this month: "Like the Lumia phones, the role of the Surface in the new Microsoft will be a supporting, not a starring one. As a result, I could see Microsoft's investment in the Surface -- from both a monetary and staffing perspective -- being downsized, accordingly."
Even if that turns out to be the case, it doesn't portend the end for Windows and Windows Phone, especially if Microsoft can get OEMs on board. For now, that's a big if, despite the company's removal of licensing fees for OEMs building devices smaller than nine inches. Fortunately for Microsoft, as recently reported, the company's mobility strategy doesn't depend merely on Windows but on its support for all platforms, as well. The good news is this market is still evolving. For now, though, as long as apps matter, Windows Phone -- despite some unique niceties -- faces an uphill battle.
Posted by Jeffrey Schwartz on 09/08/2014 at 2:52 PM
Back in the 1990s when America Online ruled the day, Microsoft's entry with MSN followed in AOL's footsteps. Microsoft is hoping its latest MSN refresh, tailored for the mobile and cloud era, will take hold, and its new interface is quite compelling.
Launched as a public preview today, the new MSN portal is a gateway to popular apps such as Office 365, Outlook.com, Skype, OneDrive and Xbox Music, as well as some outside services, notably Facebook and Twitter. The interface to those services is the Service Stripe. After spending just a short time with it, I'm already considering replacing My Yahoo, my longtime default browser home page.
"We have rebuilt MSN from the ground up for a mobile-first, cloud-first world," wrote Brian MacDonald, Microsoft's corporate VP for information and content experiences, in a blog post. "The new MSN brings together the world's best media sources along with data and services to enable users to do more in News, Sports, Money, Travel, Food & Drink, Health & Fitness, and more. It focuses on the primary digital daily habits in people's lives and helps them complete tasks across all of their devices. Information and personalized settings are roamed through the cloud to keep users in the know wherever they are."
If anything, MacDonald downplayed the new MSN's ability to act as an interface to some of the most widely used services while also providing a rich offering of its own information and personal services. The preview is available now for Windows and will be coming shortly to iOS and Android. Could the new MSN be the portal that gives Microsoft's Bing search engine the boost it needs and someday brings more users to Cortana?
Posted by Jeffrey Schwartz on 09/08/2014 at 2:22 PM
Microsoft is readying a tool that it says will "seamlessly" migrate physical and virtual workloads to its Azure public cloud service. A limited preview of the new Migration Accelerator, released yesterday, moves workloads to Microsoft Azure from physical machines, VMs (both VMware and Hyper-V-based) and those running in the Amazon Web Services public cloud.
The launch of the new migration tool comes as Microsoft officials are talking up the growth of its Azure cloud service at the expense of Amazon Web Services. Mark Russinovich, a Microsoft Technical Fellow in the Cloud and Enterprise division, emphasized that point in a speech at last month's TechMentor conference, which, like Redmond magazine, is produced by 1105 Media.
Migration Accelerator "automates all aspects of migration including discovery of source workloads, remote agent installation, network adaptation and endpoint configuration," wrote Srinath Vasireddy, a lead principal program manager for enterprise and cloud at Microsoft, in a post on the Microsoft Azure Blog yesterday. "With MA, you reduce cost and risk of your migration project."
The technology enabling the workload migrations comes from Microsoft's July acquisition of InMage, whose Scout software appliances for Windows and Linux physical and virtual instances capture data continuously as changes occur, then simultaneously perform local backups or remote replication via a single data stream. A week after announcing the acquisition, Microsoft said the InMage Scout software will be included in its Azure Site Recovery subscription licenses.
While the tool gives Microsoft a better replication story, Migration Accelerator appears to be Microsoft's push to get customers using Azure for more than just disaster recovery and business continuity, although those have emerged as popular uses for public clouds generally.
For example, Vasireddy pointed to the Migration Accelerator's ability to migrate multitier production systems, which he said keeps applications consistent as they're orchestrated across all tiers. "This ensures multitier applications run the same in Azure, as they ran at the source," he said. "Application startup order is even honored, without the need for any manual configuration."
Vasireddy outlined in his blog post how the Migration Accelerator works and its components:
- Mobility Service: A lightweight, centrally deployed agent installed in the guest on the source servers (on-premises physical or virtual) to be migrated to target virtual machines on Azure. It is responsible for real-time data capture and synchronization of the selected volumes from source servers to target servers.
- Process Server (PS): A physical or virtual server installed on-premises. It facilitates communication between the Mobility Service and the target virtual machines in Azure, providing caching, queuing, compression, encryption and bandwidth management.
- Master Target (MT): A target for replicating the disks of on-premises servers. It is installed on a dedicated Azure VM in your Azure subscription, and disks are attached to it to maintain duplicate copies.
- Configuration Server (CS): Manages communication between the Master Target and the MA Portal. It is installed on a dedicated Azure VM in your Azure subscription and synchronizes regularly with the MA Portal.
- MA Portal: A multitenant portal used to discover source servers, configure protection and migrate your on-premises workloads into Azure.
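Taken together, the components form a one-way replication pipeline from source server to Azure replica. Here is a toy sketch of that data path; all class and method names are invented for illustration (the real components are distributed services, not Python classes):

```python
# Toy model of the Migration Accelerator data path described above.
# Names are illustrative only, not part of any Microsoft API.

class MobilityService:
    """Guest agent: captures volume changes on a source server."""
    def __init__(self, server_name):
        self.server_name = server_name

    def capture_changes(self, volumes):
        # The real agent captures changes continuously, in real time;
        # here we return a single batch of (volume, data) records.
        return [(vol, "delta-from-" + self.server_name) for vol in volumes]

class ProcessServer:
    """On-premises relay: caches, compresses, encrypts and forwards."""
    def forward(self, changes):
        # Stand-in for the caching/compression/encryption pipeline.
        return [(vol, data, "compressed+encrypted") for vol, data in changes]

class MasterTarget:
    """Azure VM that maintains replica disks for the source servers."""
    def __init__(self):
        self.replica_disks = {}

    def apply(self, packets):
        for vol, data, _envelope in packets:
            self.replica_disks.setdefault(vol, []).append(data)

def replicate(source_server, volumes, process_server, master_target):
    """Push one round of changes from a source server to its Azure replica."""
    agent = MobilityService(source_server)
    packets = process_server.forward(agent.capture_changes(volumes))
    master_target.apply(packets)

mt = MasterTarget()
replicate("web01", ["C:", "D:"], ProcessServer(), mt)
print(sorted(mt.replica_disks))  # → ['C:', 'D:']
```

The point of the layering is that only the Process Server talks across the network boundary, which is where bandwidth management and encryption live in the real product.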
Microsoft's new cloud migration tool also offers automated asset discovery, cutovers to Azure within minutes using in-memory change tracking, target VMs that remain dormant during migration to lower compute costs, automated provisioning, lightweight agents that enable continuous replication, and automated network adaptation and endpoint reconfiguration, he said. Those interested in testing the Migration Accelerator must sign up here for the preview.
Posted by Jeffrey Schwartz on 09/05/2014 at 12:33 PM
In my monthly Redmond View column I raised the question, "Can Windows Phone Defy Odds and Gain Share?" It's a valid question given the platform's recent falloff in shipments and the lack of widespread enthusiasm for Windows Phone.
Some of you disagree, but the numbers speak for themselves. Nevertheless, reader John Fitzgerald feels I need to go deeper. "Your article is pretty short on specifics," he wrote. "I have over 100 apps installed, including many useful ones and have no app gap in my life."
Given that was my monthly column for the print magazine, the word count didn't leave room to elaborate. But Fitzgerald raised a valid point: If all the apps (or reasonable substitutes) you use on your iPhone or Android phone are available for Windows Phone, you're in business. Indeed, if Tyler McCabe, the millennial son of SMB Group analyst Laurie McCabe, objected to the lack of apps on the Lumia 1520 (which he evaluated and described on his mother's blog), he didn't mention it. The computer engineering student is accustomed to iPhones and Android devices.
In today's mobile consumer-driven world, apps do rule, at least for now. As such, looking at my iPhone, there are a number of apps I use daily that are not available on Windows Phone or whose functionality is different. Among them are the Starbucks app, which I use to pay for purchases and manage my account, and the MLB app to track baseball scores and news. When I board flights I use my iPhone to access my boarding pass. That's currently not an option with Windows Phone, though Delta and Southwest offer apps for major phones. When I want to check the train schedules, New York's MTA offers an app for iOS and Android but not for Windows Phone. The United States Post Office app I use to check rates and track packages is available only for the iPhone, Android and BlackBerry. Google recently released a version of its Chrome Web browser for iOS, and said in March it was investigating bringing it to Windows Phone as well.
An app I use throughout the day to access articles on Newsday, my local newspaper, isn't available for Windows Phone and I was told no plans are under way to develop one. Nor does Newsday's parent company, Cablevision Systems, offer one to provide mobile DVR controls and other capabilities.
The bottom line is if I were to switch to a Windows Phone, I'd be giving up a lot of convenience I've grown accustomed to over the years. If others feel the same way, that could explain why, despite some otherwise compelling features, Windows Phone faces an uphill battle to gain substantial share. Alas, I believe that ship has sailed.
Posted by Jeffrey Schwartz on 09/03/2014 at 12:55 PM
Two key suppliers of enterprise tools yesterday said they're being acquired and another well-known software-as-a-service provider is reportedly in play. BeyondTrust and Compuware have agreed to be acquired by private equity firms, while shares of SaaS provider Concur traded higher following rumors that several software vendors, including Microsoft, Oracle and SAP, have approached it after the company hired an investment bank to gauge interest.
Veritas Capital said it will acquire cybersecurity software vendor BeyondTrust, which provides software for managing user access and privileges, along with risk assessment tools. BeyondTrust's PowerBroker software is especially popular among those looking to manage access rights in Microsoft's Active Directory.
BeyondTrust said it has 4,000 customers in a variety of industry sectors including government, technology, aerospace and defense, media/entertainment, telecommunications, healthcare/pharmaceutical, education and financial services. In addition to managing user access, BeyondTrust offers tools designed to defend against cyber attacks. Terms of the deal weren't disclosed.
Application performance management software supplier Compuware yesterday said it has agreed to be acquired by private equity firm Thoma Bravo for $2.5 billion. The deal represents a 12 percent premium over the company's Sept. 2 closing price and 17 percent over Friday's close.
Compuware is a leading provider of APM software used to monitor the performance of mobile and enterprise application infrastructure, as well as a line of network testing and management tools. The company also offers a line of mainframe management tools.
Concur, whose popular SaaS software is used to manage business travel and entertainment expenses, saw its stock price increase 6.3 percent yesterday, giving it a market cap of $6.1 billion. According to a Bloomberg report, Oracle has decided to pass, while Microsoft and SAP are said to be interested, though none of the companies commented on the possible acquisition.
Posted by Jeffrey Schwartz on 09/03/2014 at 12:54 PM
Steve Ballmer's decision to step down from Microsoft's board six months after "retiring" as the company's CEO (announced a year ago tomorrow) has once again put him in the limelight in the IT world, at least for a few days. He has already drawn the spotlight in the sports world, having shelled out an unprecedented $2 billion to purchase the Los Angeles Clippers and delivered a characteristically boisterous rallying cry earlier this week on how his team will light the NBA on fire.
More power to him. I hope he brings the Clippers a championship ring -- maybe they become a dynasty. But at the same time, Ballmer's decision to step down from Microsoft's board appears to be a good move, whatever may have actually motivated him. Some may argue it's symbolic. Even though Ballmer no longer holds a board seat and its voting rights, he's still the company's largest shareholder, and if he were to align himself with activist investors, he could make his already audible voice heard. As such, it would be foolish to write him off.
Following the news of his resignation from the board, I've seen a number of tweets and commentaries saying critics of Ballmer don't appreciate all he has done for Microsoft. Indeed Microsoft performed well during most of his 13-year reign, with consistent year-over-year revenue and profit growth. And products such as SharePoint grew from nothing to multi-billion-dollar businesses. Yet under Ballmer's watch, Microsoft seemed to have lost its way and the company that cut off Netscape's "air supply" lost its cachet to companies like Apple and Google.
To those who think Ballmer is unappreciated, did Microsoft perform well because of or in spite of him? Consider this: Joe Torre's teams performed abysmally when he managed the Atlanta Braves, New York Mets and St. Louis Cardinals. Yet his four World Series rings and year-over-year playoff appearances as skipper of the Yankees brought him praise as a great manager and a place in the Hall of Fame. As CEO of Novell, Eric Schmidt was unable to return the one-time networking giant to its former glory. As CEO of Google, he was a rock star. If Novell had hired Ballmer away from Microsoft, could he have saved the company from its continued decline? We'll never know for sure.
In my view, it all started with Ballmer's ill-advised attempt to acquire Yahoo in 2008 for $45 billion. Yahoo CEO Jerry Yang's foolhardy refusal to accept that astronomically generous offer at the time probably saved Microsoft from disaster. In retrospect, perhaps Ballmer was desperate, whether he knew it or not at the time. In 2008, Microsoft was late to the party with a hypervisor, delivering the first iteration of Hyper-V even though VMware had already set the standard for server virtualization.
Also at the time, maybe he realized, consciously or otherwise, that shrugging off the iPhone a year earlier sealed Microsoft's fate in the mobile phone business. Tablets and mobile phones were a market Microsoft could have owned or at least become a major player in had there been more vision. I remember Microsoft talking up technologies like the WinPad in the mid to late '90s and making feeble attempts to offer tablets that were effectively laptops which accepted pen input.
I wasn't planning to weigh in on Ballmer stepping down from the board until I read a review of the latest Windows Phone from HTC by Joanna Stern of The Wall Street Journal. The headline said it all: "HTC One for Windows: Another Great Phone You Probably Won't Buy." While praising the new phone, which actually is based on its popular Android-based device, Stern effectively said while Windows Phones are good, so were Sony Betamaxes (that's my analogy, not hers). Stern noted IDC's latest smartphone market share report, which shows Windows Phone sitting at a mere 2.5 percent and not growing; in fact, its share declined this quarter.
That got me thinking, once again, that the missteps with Windows Phone and Windows tablets all happened on the watch of Ballmer, who also let Apple and Google dominate the tablet market. To his credit, Ballmer acknowledged earlier this year that missing that market shift was his biggest regret. That still brings little consolation. Now Ballmer's successor, CEO Satya Nadella, is trying to move Microsoft forward with the hand he was dealt. Founder and former CEO Bill Gates sits on the board and is working with Nadella closely on strategic technical matters. Does Nadella also need the person whose mess he's cleaning up (including the fiefdoms that exist in Redmond) looking over his shoulder as a board member?
No one can overlook the passion Ballmer had at Microsoft, but his "developers, developers, developers" rant only went so far. Today most developers would rather program for iOS, Android and other platforms. Hence, it's fortunate that Ballmer has found a new passion. Maybe he can do with the Clippers what he couldn't do for Microsoft.
Posted by Jeffrey Schwartz on 08/22/2014 at 12:51 PM
NetApp Adds Hybrid Cloud Storage for Azure
NetApp Inc. is now offering a private storage solution that supports Microsoft private clouds and the Microsoft Azure cloud service. The new NetApp Private Storage for Microsoft Azure is designed to provide storage for hybrid clouds, letting organizations elastically extend storage capacity from their own datacenters to the public cloud service. The offering utilizes the FlexPod private cloud solution for Microsoft environments, consisting of converged storage, compute and network infrastructure from Cisco Systems Inc. and NetApp along with the Windows Server software stack. Organizations can create internal private clouds with the Microsoft Cloud OS, which combines Windows Server, System Center and the Windows Azure Pack. Those that require scale can use the Azure cloud service and Microsoft's new Azure ExpressRoute offering, which, through partners such as AT&T, BT, Equinix, Level 3 Communications and Verizon Communications, provides high-speed, dedicated and secure links rather than relying upon the public Internet. NetApp Private Storage for Microsoft Azure offers single- or multiple-region disaster recovery and uses the public cloud service only when failover is necessary or for planned scenarios such as testing. When used with System Center and the NetApp PowerShell Toolkit, customers can manage data mobility between the private cloud and on-site storage connected to the Azure cloud.
Netwrix Adds Compliance to Auditor
Organizations faced with meeting regulatory compliance requirements need to ensure their IT audits are in line with various standards. The new Netwrix Auditor for 5 Compliance Standards addresses five key standards: the Payment Card Industry Data Security Standard (PCI DSS), Health Insurance Portability and Accountability Act (HIPAA), Sarbanes-Oxley Act (SOX), Federal Information Security Management Act (FISMA) and Gramm-Leach-Bliley Act (GLBA). Netwrix Corp. released the new compliance tool last month. It's designed to help ensure IT passes compliance audits by accessing the mandated reports from the Netwrix Auditor AuditArchive, which the company describes as two-tiered storage that can hold data for upward of 10 years. It tracks unauthorized changes to configurations and can issue real-time alerts. It also comes with 200 preconfigured reports. In addition, it monitors user activities and audits access control changes, which helps discover and preempt theft of confidential information. It builds on the built-in auditing capabilities of widely used platforms including Active Directory, Exchange Server, file servers, SharePoint, SQL Server, VMware and Windows Server, though the company said it covers others, as well.
Advanced Systems Concepts Ties to Configuration Manager
Advanced Systems Concepts Inc. is now offering an extension for Microsoft System Center 2012 R2 Configuration Manager. Along with its other System Center extensions, the new release provides common management across on-premises, service provider and Azure environments. By integrating ActiveBatch with Configuration Manager, the company said IT organizations can reduce administration requirements and lower the overall cost of PC ownership by automating such functions as OS deployment, patch management, client device management and other systems administration tasks. It's designed to eliminate the need for scripting. Among the Configuration Manager functions it offers are production-ready job steps, which include commonly used Configuration Manager objects and functions such as creating job steps to build packages, deployment programs and folders. Other job steps let administrators modify and delete packages, as well as assign them to distribution. ActiveBatch already supports most other key components of System Center, with extensions for System Center Operations Manager, System Center Service Manager, System Center Orchestrator and System Center Virtual Machine Manager, as well as Active Directory, SQL Server, Exchange, Azure and Windows Server with the latest rev of Hyper-V. It also works with other key platforms including Amazon Web Services EC2 and VMware ESX.
Posted by Jeffrey Schwartz on 08/22/2014 at 11:43 AM
Microsoft will release the third version of its Virtual Machine Converter (MVMC) this fall, and the key new feature will be support for physical-to-virtual (P2V) conversions. MVMC is Microsoft's free tool for migrating VMware virtual machines to Hyper-V; the 2.0 release earlier this year allowed only for V2V conversions.
That "disappointed" some customers, said Matt McSpirit, a senior technical product marketing manager in Microsoft's cloud and enterprise marketing group. McSpirit joined me in a panel discussion on choosing a hypervisor at last week's TechMentor conference, held on the Microsoft campus in Redmond. While the panel covered many interesting implications of converting to Hyper-V, which I will break out in future posts, one key area of discussion was converting existing VMware VMs to Hyper-V.
Released in late March, Microsoft Virtual Machine Converter 2.0 upped the ante over the inaugural release by incorporating support for vCenter and ESX 5.5, VMware virtual hardware version 4 through 10 support and Linux guest OS migration support including CentOS, Debian, Oracle, Red Hat Enterprise, SuSE Enterprise and Ubuntu, as I reported at the time. It also added an on-premises VM to Azure VM conversion tool for migrating VMware VMs directly to Azure.
Another key feature is its native PowerShell interface, enabling customers and partners to script key MVMC 2.0 tasks. These scripts can also be integrated with other automation and workflow tools such as System Center Orchestrator, among others. In addition, Microsoft has released the Migration Automation Toolkit for MVMC 2.0, which is a series of prebuilt PowerShell scripts to drive scaled MVMC 2.0 conversions without any dependency on automation and workflow tools.
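The toolkit itself is PowerShell-based, but the batching pattern it embodies can be sketched in a few lines. In this illustration the cmdlet name `ConvertTo-HyperVVM`, its parameters and the file paths are all hypothetical placeholders, not MVMC's actual interface; consult the MVMC documentation for the real cmdlets:

```python
# Sketch of a bulk-conversion driver in the spirit of the Migration
# Automation Toolkit. The cmdlet name and parameters are PLACEHOLDERS;
# check the MVMC PowerShell documentation for the real interface.
import subprocess

def build_conversion_commands(vmx_paths, destination):
    """Produce one PowerShell command line per source VMware VM."""
    return [
        "ConvertTo-HyperVVM -SourceVmx '{0}' -DestinationPath '{1}'".format(
            vmx, destination)  # hypothetical cmdlet
        for vmx in vmx_paths
    ]

def run_conversions(commands, dry_run=True):
    """Run each command through powershell.exe, or just print the plan."""
    for cmd in commands:
        if dry_run:
            print(cmd)
        else:
            subprocess.run(["powershell.exe", "-Command", cmd], check=True)

cmds = build_conversion_commands(
    [r"\\esx01\vms\web01.vmx", r"\\esx01\vms\db01.vmx"], r"D:\HyperV")
run_conversions(cmds)  # dry run: prints the planned commands
```

The same list of commands could instead be handed to System Center Orchestrator or another workflow tool, which is the integration path the paragraph above describes.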
The reason for leaving P2V out of that release was apparently to emphasize that MVMC is a VM-to-VM conversion tool. During the panel discussion, McSpirit described P2V as, by definition, less about virtual-to-virtual conversion and more about physical to virtual.
"A while back, we had within System Center a P2V capability," he explained. "So for customers who were just getting started with virtualization or have some workloads they need to convert from the physical world, we have P2V built into VMM [System Center Virtual Machine Manager]. So as an admin, you've got your Hyper-V host being managed, [you] select a physical server that will actually convert either online or offline, and VMM will handle [the conversion to a virtual machine] and bring it [into its management tool]. And that functionality was deprecated in 2012 R2 and removed. And thus, for a period of time, no P2V tool from Microsoft. Yes there's disk to VHD and some other tools but no [fully] supported, production ready tool [from Microsoft, though there are third-party tools]."
In a follow-up e-mail to clarify that point, McSpirit said that "P2V stands for physical to virtual, thus by definition, it's less focused on virtual conversion and more focused on physical to virtual, but that's not to say it can't be used for converting VMs," he noted. "The P2V wizard in the previous release of System Center Virtual Machine Manager (2012 SP1) could still be pointed at an existing VMware virtual machine, after all, regardless of whether it's converting a physical machine or not, it's the OS, and it's data that's being captured, not the hardware itself. Thus, you could use P2V to do a V2V."
Microsoft confirmed in April that P2V was planned for the fall release. "With MVMC 3, [which] comes in the fall, P2V is coming back into that, which pleases a lot of customers because it doesn't have that requirement for System Center, which for smaller environments is more applicable, and it enables you to perform P2Vs with a supported tool," McSpirit said.
Update: Just to clarify, MVMC never had P2V functionality. Rather it was offered in System Center 2012 SP1 Virtual Machine Manager (and earlier). When McSpirit said P2V was deprecated, he was referring to System Center 2012 R2 Virtual Machine Manager, which offered V2V only. With the MVMC 3.0 release this fall, P2V will once again be available.
The session confirmed what many already know: MVMC is a good tool for converting a handful of hypervisors but it still requires manual configuration and "lacks any sort of bulk conversion mechanism (unless you want to script the conversion process through Windows PowerShell)," wrote Brien Posey in a tutorial outlining how to use MVMC 2.0 earlier this year.
But if you plan to migrate numerous VMware VMs to Hyper-V, you may want to consider third-party tools from the likes of Vision Solutions, NetApp and 5nine Software.
Are you migrating your VMware VMs to Hyper-V? If so, are you using MVMC or one of the third-party tools?
Posted by Jeffrey Schwartz on 08/20/2014 at 9:22 AM
Over the weekend I downloaded the Chrome Web browser on my iPad after it showed up as a suggested download. Somehow I forgot Chrome is now available on iOS and I was thrilled to see I could not only run the browser, but instantly have access to my bookmarks and Web browsing history from other PCs on which I work. Then I wondered if Internet Explorer would ever find its way into the iTunes App Store. After all, Office is now available on iOS, as well as other popular Microsoft offerings. However, it doesn't appear that Internet Explorer on iOS is in the cards.
Apparently, Microsoft poured cold water on that idea last week during a Reddit Ask Me Anything (AMA) chat. As reported by "All About Microsoft" blogger Mary Jo Foley, Internet Explorer is one of the Redmond offerings the company won't deliver on iOS or Android. "Right now, we're focused on building a great mobile browser for Windows Phone and have made some great progress lately," said Charles Morris, a member of the Internet Explorer platform team, during the AMA chat. "So, no current plans for Android/iOS."
That's unfortunate, especially given Windows Phone's struggles: its market share actually dropped last quarter. It appears Microsoft is still fixated on the notion that its Internet Explorer Web browser is inextricably tied to Windows.
Also, as noted by Forbes, the Internet Explorer team acknowledged during the AMA chat that Microsoft has considered renaming the browser, presumably to make it more hip. That's a questionable idea considering Internet Explorer is still the most widely used browser, especially among enterprises.
When asked about the potential for a name change, Jonathan Sampson, also on the Internet Explorer platform team, responded: "It's been suggested internally; I remember a particularly long e-mail thread where numerous people were passionately debating it. Plenty of ideas get kicked around about how we can separate ourselves from negative perceptions that no longer reflect our product today."
Asked why the company didn't just change the name, Sampson responded: "The discussion I recall seeing was a very recent one (just a few weeks ago). Who knows what the future holds." Changing the name, of course, won't do much to help the browser's reputation if it's known for the same problems it has had in the past -- namely security and privacy flaws.
If something has a very bad reputation, sometimes rebranding it is the only course of action. But as the old saying goes, "a rose is still a rose ..." Though no longer as dominant as it once was, Internet Explorer still has the largest market share of any browser, according to Net Applications. Even if its share continues its incremental decline, Internet Explorer is a well-known browser and not merely for its flaws. People who don't use Microsoft products, particularly those who prefer Apple offerings or the LAMP stack, aren't going to be moved by a new name for anything coming out of Redmond. Produce a browser that has fewer flaws and advanced features and it'll be popular no matter what it's called.
Regardless of the name, would making it available on iOS and Android stem any declines on a platform that's still widely used? Or would it facilitate it? Perhaps the company sees Internet Explorer as critical to holding onto the Windows franchise. If that's the case, the company might wait. In the meantime, I look forward to using the Chrome browser on my iPad and will stick to using Internet Explorer only on Windows.
Posted by Jeffrey Schwartz on 08/18/2014 at 1:31 PM
Could there be a day when the desktop or mobile operating system you use, and that developers program to, doesn't matter? That age-old question may not be answered any time soon, and don't expect to see any of this in Microsoft's next client OS. But IT pros should prepare themselves for the possibility that Microsoft or other players may someday successfully commercialize a "library operating system," where developers rely primarily on APIs to run their applications rather than a client OS or even a virtual machine.
While researchers have bandied about the concept of a library OS for some time, Microsoft Research revealed its own work on one back in 2011 with a project called Drawbridge. Microsoft Research Partner Manager Galen Hunt joined others at the time to outline in detail their working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In this desktop prototype, they said the securely isolated library operating system instances worked via the reuse of networking protocols.
"Each instance has significantly lower overhead than a full VM bundled with an application: a typical application adds just 16MB of working set and 64MB of disk footprint," according to a paper published at the time by the Association for Computing Machinery (ACM). "We contribute a new ABI [application binary interface] below the library OS that enables application mobility. We also show that our library OS can address many of the current uses of hardware virtual machines at a fraction of the overheads."
Little has been said about Microsoft's library OS efforts since then. Yet three years later, Microsoft's hold on the desktop has been diminished by the rapid growth of BYOD. While Microsoft hopes to revive enthusiasm for Windows with a new offering next year, others believe it may be too little, too late. That remains to be seen, of course. Nevertheless, Microsoft could have a lot to gain by someday delivering a library OS.
Don Jones, a Microsoft MVP and a Redmond magazine columnist who recently joined IT training firm Pluralsight, revived the argument for why a library OS could be the right approach for Microsoft to take. And he did so right in Microsoft's house. Speaking in a keynote address at the TechMentor conference this week at the Microsoft Conference Center on the Redmond campus, Jones explained how a library OS could reduce existing operating systems to the firmware layer, putting the emphasis on APIs and application delivery rather than low-level services.
"We tend to say that developers develop for a specific operating system -- iOS, Android, Windows, Macintosh or Linux," Jones told the audience of IT pros. "It's not actually true. They don't. What they are writing for is an operating environment. What they are writing for is a set of APIs."
The APIs developers code against today are tied to a specific operating system, whether it's Apple's interfaces to iOS or Microsoft's .NET Framework. Efforts to make APIs that work across operating systems have remained unsuccessful, Jones noted.
"Even when we make efforts to duplicate those APIs for other operating systems with projects like Mono, attempting to duplicate .NET onto the Linux platform, they are rarely successful because the ties between the APIs and the operating system are so integrated," Jones said. "It's really tough to pull them apart and duplicate them elsewhere."
Jones called for a smarter BIOS so when users turn on their computers, it knows how to access the network, write to a display or disk and other basic low-level services, including CPU and memory access and storing user credentials in a secure area.
"This is going to sound crazy but if you take all of Windows and you refactor it into a bunch of APIs that talk to that low-level firmware and if a developer wants to code to those, he or she can," Jones said. "But at the same time, if someone is more comfortable programming against a Linux set of APIs, there's no reason those same APIs can't live on that same machine and run in parallel at the same time because it's not taking over the machine, it's the firmware that runs everything. These are just different ways of getting at that firmware that provides a different comfort level for developers."
In this type of scenario, Jones added: "Windows essentially becomes a set of DLLs, which essentially is not that far from what it is. So we move the operating system to a lower level and everything else is a set of APIs that run on top of that. Meaning you gain the ability to run multiple operating system personalities with APIs side by side because none of them owns the machine."
Can, or will, Microsoft bring this to market? Certainly Hunt believed it's feasible. "Our experience shows that the long-promised benefits of the library OS approach -- better protection of system integrity and rapid system evolution -- are readily obtainable," he wrote at the time.
Jones said not to expect it anytime soon, and warned it's possible it may never see the light of day. But he pointed to some meaningful reasons why Microsoft could be leaning in that direction, or at the very least could benefit by doing so sometime in the future. Perhaps most obvious is the company's key emphasis and investment is on building out the Microsoft Azure cloud and the Windows Server/System Center-based cloud OS that underpins it.
Jones also pointed to Microsoft's new Azure RemoteApp service, which the company announced at TechEd in May and is available in preview. Microsoft Azure RemoteApp, an application delivery service, will let IT deliver Windows applications to almost any device including Windows tablets, Macs, iPads and Android tablets via the cloud. It will work with 1,300 Windows apps, Microsoft said at the time, though the company has yet to announce its plans to offer Azure RemoteApp commercially.
"RemoteApp delivers an app, not a desktop, not an operating system," Jones said. "What the user gets is an app, because the user has been trained that's all they need." Jones also likes the idea of a library OS because it eliminates virtualization, he added. "We don't have to have an operating system and a virtual machine rolling it up. We just have these little API silos," he said. "Suddenly the idea of running applications in the cloud and delivering them becomes a lot clearer and a lot easier and you can get a lot better densities on your hosts."
The role of desktop administrators may not disappear overnight, but it would not be a bad idea for them to study and gain expertise on the datacenter side of things if this shift comes into play. "Look at what's happening in your environment, and whether you think it's a good idea or not, try to predict what you think Microsoft will or might do and then try to make a couple of side investments to line yourself up so you can ride the bus instead of being hit by it," Jones advised.
Would you like to see Microsoft deliver a library OS? Share your views on the potential benefits or pitfalls of such a computing model.
Posted by Jeffrey Schwartz on 08/15/2014 at 12:01 PM
Growing use of the Microsoft Azure cloud service, rapidly expanding features and a price war are putting the squeeze on cloud market leader Amazon Web Services, according to Microsoft Technical Fellow in the Cloud and Enterprise Division Mark Russinovich.
In the opening keynote at the TechMentor conference at the Microsoft Conference Center on the company's Redmond campus, Russinovich showcased the edge Microsoft sees that it has over its key rival. Russinovich argued that the company's squeeze contributed to Amazon's disappointing quarterly earnings report last month, which rattled investors. "Amazon Web Services is struggling in the face of pricing pressure by us and Google," Russinovich said. "People are starting to realize we're in it to win."
Indeed, market researchers last month pointed to Microsoft and IBM as those who are gaining the most ground on Amazon in the public cloud. When asked how many TechMentor attendees have logged into the Microsoft Azure Portal, only about one-quarter of the IT pros in the audience said they have. Disclosure: TechMentor, like Redmond magazine, is produced by 1105 Media.
Pointing to the growth of Azure, Russinovich showcased some recently released figures. The service hosts 300,000 active Web sites and 1 million active database services. The Azure Storage service has just surpassed 30 trillion objects with 3 million active requests per second at peak times. Russinovich also said that 57 percent of the Fortune 500 companies are using Azure. And Azure Active Directory, which organizations can federate with their on-premises versions of Active Directory, now has 300 million accounts and 13 billion authentication requests per week.
Russinovich emphasized Microsoft's advantage with Azure Active Directory and the cloud service's emphasis on identity. "Azure Active Directory is the center of our universe," Russinovich said. "When you take a look at all the places where you need identity, whether it's a [third-party] SaaS service or whether Microsoft's own, like Office 365, you look at custom line-of-business apps that enterprises are developing and they want to integrate their own identity services."
Throughout his talk, Russinovich emphasized Microsoft's focus on its hybrid cloud delivery model and the security layers that can extend to the public Azure cloud. To that point, he said Azure Active Directory's role in supporting hybrid clouds is that "it supports ways you can foist identities to the cloud so that they're accessible from all of these targets, line-of-business apps in the cloud and Microsoft's own cloud platforms."
The primary way to achieve that is to federate an Azure Active Directory tenant with an on-premises Active Directory service, he said. "What that means is when someone goes and authenticates, they get pointed at Azure Active Directory in the cloud. Their identities are already there and the authentication flow goes to an Active Directory federated server on-premises that verifies the password and produces the token." It uses OAuth 2.0 for authentication, he said.
Microsoft plans to integrate OAuth 2.0 into Windows Server and all Azure services, though Russinovich noted it supports other token services such as SAML and other identity services from the likes of Facebook, Google and others.
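To make the OAuth 2.0 flow Russinovich describes a bit more concrete: an unattended application authenticating directly against a directory tenant ultimately sends a form-encoded POST to that tenant's token endpoint. The sketch below only constructs such a request as an illustration; the tenant name, app ID and secret are placeholder values, no network call is made, and endpoint details vary across Azure AD API versions:

```python
from urllib.parse import urlencode, parse_qs

def build_token_request(tenant, client_id, client_secret, resource):
    """Build the endpoint URL and form-encoded body of an OAuth 2.0
    client-credentials token request, the grant type an unattended
    app uses to authenticate against a directory tenant."""
    endpoint = "https://login.microsoftonline.com/%s/oauth2/token" % tenant
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    })
    return endpoint, body

# Placeholder values for illustration only.
endpoint, body = build_token_request(
    "contoso.onmicrosoft.com", "my-app-id", "my-secret",
    "https://management.azure.com/")
print(endpoint)
print(parse_qs(body)["grant_type"][0])
```

A real client would POST that body over HTTPS and receive a JSON response containing a bearer token; federated user sign-ins add the redirect to the on-premises Active Directory federation server that Russinovich describes.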
One area Microsoft needs to deliver on for Azure is role-based access control, which is in preview now, Russinovich said. "This will be fleshed out so we have a consistent authorization story all pointing to Azure Active Directory, which lets you not just connect on-premises to Active Directory through [Active Directory Federation Services], but these third-party consumer services, as well. That's a key point, you want identity to be enabled in the cloud," he said.
Russinovich argued Microsoft has also shown cloud leadership in its refusal to give in to law enforcement agency efforts to subpoena user data, alluding to last week's order to turn over data in a Dublin datacenter, which Microsoft said it will appeal. "Brad Smith, our head of legal, is probably now the face of the tech industry going out and fighting the United States government's efforts to get access to data that they probably won't get access to," he said. "We're really trying to make sure we're a great partner for customers that care about data privacy."
When it comes to the public cloud, Russinovich discussed the significant demand for cloud services such as storage and disaster recovery. He demonstrated a few offerings, including the new Azure ExpressRoute, which, through third-party ISPs and colocation operators, can provide high-speed dedicated private connections between two datacenters using Azure as a bridge, or to connect directly to Azure. Other demos included point-to-site VPN and Web apps using Azure.
Russinovich also gave a nod to Azure's broad support for those who want to create Windows PowerShell scripts to add more automation. "Everything in Azure is PowerShell-enabled because Jeffrey Snover [its revered inventor and Microsoft distinguished engineer and lead architect] makes us enable everything from PowerShell," Russinovich quipped.
During the evening reception at TechMentor, I reconnected with some attendees who had told me the previous evening they wanted to learn more about Azure. Some saw opportunities, but others weren't sure how it fits into what they're doing. Nevertheless, most said they were surprised to learn how Azure has advanced. This is exactly what Russinovich aimed to do in his talk -- to make a case that Microsoft has more to offer than Amazon and other cloud providers.
To be sure, Amazon still has a growing cloud business and is seen as the leader for enterprise Infrastructure as a Service. As noted by investment researcher Zacks, Amazon reported a 90 percent usage growth for its AWS cloud business. While Amazon doesn't break out revenues and profits for AWS, it falls in the category of "Other," now accounting for 6 percent of revenues. Experts believe AWS generated the majority of revenues in that "Other" category.
"AWS continues to launch new services and enhance the security of its services," according to the Zacks report. "Amazon again reduced prices significantly, which was the reason for the revenue decline in the last quarter."
By all indications here at TechMentor and messaging from Microsoft at the recent Worldwide Partner Conference (and TechEd before that), Microsoft is not letting up on its Azure push. While many attendees saw the depth and progress Microsoft has made with Azure for the first time, Rome wasn't built in a day.
Posted by Jeffrey Schwartz on 08/13/2014 at 11:34 AM
Google wants Web sites to become more secure and said Wednesday it will do its part by motivating organizations to build stronger encryption for their sites. The company is offering a pretty significant incentive: it will reward those who do so by ranking them higher than sites lacking support for Transport Layer Security (TLS), the protocol behind HTTPS encryption. Another way to look at it is that Google will punish those who lack the extra encryption.
It's always troubling to hear reports that allege Google is playing with its search algorithm in a way that can unfairly benefit some to the detriment of others. Given its dominance in search, any action, real or perceived, places it under scrutiny and risk of regulators getting on the company's case.
Yet one could argue Google is now putting a stake in the ground in the interest of everyone who uses the Web. By pushing sites to implement stronger encryption via TLS, the company is using its clout to make the Web a safer place. This could have major consequences for the many businesses that live and die by how well they appear in Google search results, especially those that invest heavily in search engine optimization, or SEO. But Google is doing so by trying to force those with insecure sites to step up and implement TLS. While not a panacea, it's an improvement.
Google has talked up "HTTPS by default" for years, meaning Search, Gmail and Google Drive automatically use secure connections to Google sites. At its recent Google I/O developer conference, the company introduced its HTTPS Everywhere push. Webmaster trends analysts Zineb Ait Bahajji and Gary Illyes explained in a post Wednesday how the company plans to rank sites based on their HTTPS/TLS support.
"Over the past few months we've been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms," they wrote. "We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal -- affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content -- while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we'd like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the Web."
In the coming weeks Google said it will publish detailed best practices on how to make it easier to implement TLS at its help center. In the meantime, Google offered the following tips:
- Decide the kind of certificate you need: single, multi-domain or wildcard certificate.
- Use 2048-bit key certificates.
- Use relative URLs for resources that reside on the same secure domain.
- Use protocol relative URLs for all other domains.
- Check out our Site move article for more guidelines on how to change your Web site's address.
- Don't block your HTTPS site from crawling using robots.txt.
- Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
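The third and fourth tips can be sketched in a few lines. The helper below is an illustration of the idea, not Google tooling: resources on the site's own domain become relative paths (so they inherit the page's HTTPS scheme automatically), while resources on other domains become protocol-relative. Query strings are omitted for brevity.

```python
from urllib.parse import urlparse

def rewrite_url(url, own_domain):
    """Rewrite an absolute URL per the HTTPS-migration tips: own-domain
    resources become relative paths, other domains become
    protocol-relative. Query strings are dropped for brevity."""
    parts = urlparse(url)
    if parts.scheme not in ("http", "https"):
        return url  # already relative (or a non-HTTP scheme); leave as-is
    if parts.netloc == own_domain:
        return parts.path or "/"
    return "//" + parts.netloc + parts.path

print(rewrite_url("http://example.com/img/logo.png", "example.com"))
print(rewrite_url("http://cdn.example.net/lib.js", "example.com"))
```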
Google also recommends that those with sites already serving HTTPS test their security levels and configuration using the Qualys SSL Server Test tool.
What's your take on Google's effort to force the hand of organizations to make their sites more secure? Is it a heavy handed and unfair move by taking advantage of its search dominance or an altruistic use of its clout that could make the Web safer for everyone?
Posted by Jeffrey Schwartz on 08/08/2014 at 12:34 PM
Microsoft has extended its relationship with the NFL in a new deal for its Surface Pro 3 tablet-PCs to be used by coaches on the sidelines during games. Viewers of NFL games this season will prominently see team personnel and coaches using Surface Pro 3 devices to review images of plays on screen rather than on printed photos.
The NFL struck a deal last year with Microsoft for the Surface to be the league's official tablet. Now a new $400 million deal calls for Surface Pro 3s to be used on the sidelines during games. The arrangement calls for coaches to use the Surface Pro 3s for five years, according to a Wall Street Journal blog post.
Microsoft said the Surface devices used by teams are equipped with the Sideline Viewing System (SVS). The app, developed by Microsoft, picks up the same digital feed previously sent to the printers. By replacing the printouts with digital images sent to the tablets, coaches will get them faster (four to five seconds versus 30 seconds) and have the ability to use the high resolution touch display to zoom in on a portion of the image and make annotations, Microsoft said.
While coaches will be able to receive images, the tablets won't have access to the public Internet and team personnel won't be permitted to use the devices to take photos or videos during games. It remains to be seen whether teams in future seasons will use the Surface Pro 3s for making play-by-play decisions.
Perhaps more telling will be whether the promotional value of this deal will boost sales of the Surface line of tablets. Certainly with the vast viewership of NFL games it's a nice boost. But as Computerworld reported Monday, Microsoft's losses on Surface have swelled to $1.7 billion since their 2012 launch, according to a July 22 8-K statement. The publication also calculated that Microsoft saw a loss of $363 million for the last fiscal quarter. That makes it the largest quarterly loss for the tablets since the company began reporting revenues for them. Hence, Microsoft needs to score many more deals like this to keep the Surface in the game.
What's your take on this deal? Did Microsoft score big or was this just a Hail Mary pass?
Posted by Jeffrey Schwartz on 08/06/2014 at 11:00 AM
In a setback for U.S. cloud providers looking to ensure the privacy of data stored in foreign countries, a search warrant ordering Microsoft to turn over e-mail stored in its Dublin, Ireland datacenter was upheld. Judge Loretta Preska of the U.S. District Court for the Southern District of New York upheld a magistrate judge's earlier ruling authorizing the search warrant. The identity and location of the suspect, who is being investigated in a drug-related matter, are not known. Microsoft has said it will appeal last week's ruling.
Several tech companies had filed briefs in support of Microsoft's appeal of the warrant during a two-hour hearing held Tuesday, including Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, according to published reports. The outcome of this case can potentially set a precedent for all U.S.-based cloud providers storing data abroad.
"Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private," wrote Microsoft Chief Counsel Brad Smith in April. "We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed."
In an op-ed piece published in The Wall Street Journal last week prior to the hearing, Smith furthered that view. "Microsoft believes you own e-mails stored in the cloud, and they have the same privacy protection as paper letters sent by mail," he said. "This means, in our view, that the U.S. government can obtain e-mails only subject to the full legal protections of the Constitution's Fourth Amendment. It means, in this case, that the U.S. Government must have a warrant. But under well-established case law, a search warrant cannot reach beyond U.S. shores."
In upholding the warrant, Judge Preska argued that's not the point. "It is a question of control, not a question of the location of that information," Preska said, according to a report by The Guardian. Hanni Fakhoury, staff attorney for the Electronic Frontier Foundation, told Computerworld he wasn't surprised by the ruling, saying it ultimately will be decided by the Second Circuit Court of Appeals. "I hope the Second Circuit looks closely at the magistrate's reasoning and realizes that its decision radically rewrote the Stored Communications Act when it interpreted 'warrant' to not capture all of the limitations inherent in a warrant, including extraterritoriality," he said.
The outcome of this case will have significant ramifications on the privacy of data stored in foreign datacenters. But that's not all Microsoft and its supporters have at stake. Should the warrant ultimately be upheld, The Guardian noted, U.S. companies are concerned they could lose billions of dollars in revenues to foreign competitors not subject to seizure by U.S. law enforcement agencies.
Posted by Jeffrey Schwartz on 08/04/2014 at 12:52 PM
Microsoft has substantial plans for its flagship Office suite, as noted by Mary Jo Foley in her August Redmond magazine column. But for now its version for the iPad is getting all the love. Just four months after the long-awaited release of Office for iPad, Microsoft has upgraded it with some noteworthy new features. The 1.1 versions of Word, Excel and PowerPoint are now available in Apple's iTunes App Store. Microsoft also updated OneNote for the iPad with its version 2.3 release.
While Microsoft added new features specific to the three key components of Office, the most noteworthy addition across the board is the ability to let users save files in the Adobe PDF format (even for those who don't have Office 365 subscriptions). That was one of the three top feature requests, according to a post in Microsoft's Office Blog announcing the upgrade. Second is the ability to edit photos on the iPad, which users can now do in Word. Though limited, the photo editing feature enables cropping and resetting. The third key feature is support for third-party fonts.
In Excel, Microsoft has added a new sorting capability which lets users filter, extend, collapse, display details and refresh PivotTables in an Excel workbook as well as changing the way they're displayed. Another new feature called "flick gesture" aims to simplify the use of workbooks including the selection of data fields with large ranges. "Simply grab the selection handle, flick it in any direction and Excel will automatically select from where you started to the next blank cell," read the blog post, adding if "you're at the top of a column of data and want to select all the way to the bottom, just flick down and the column is selected automatically."
The new PowerPoint app supports the use of multimedia capabilities on the iPad, specifically audio and video embedded into slides. It also lets users insert videos and photos from their iPad camera rolls. When projecting a presentation from the iPad, users can now enable Presenter View to see their own notes on the device.
Microsoft pointed out this is part of its new effort to offer continuous updates to Office for iPad. When you download the update, don't do it when you're in a rush -- give yourself about 15 to 30 minutes, depending on your network. I found Word and Excel in the Update section, though I had to search the store for the PowerPoint version.
Share your thoughts on the new update.
Posted by Jeffrey Schwartz on 08/01/2014 at 11:35 AM
If you've been holding off on buying Microsoft's Surface Pro 3, awaiting either a less expensive version or the higher-end model with an Intel Core i7 processor, they're now available. Microsoft announced the release of the new Surface Pro 3 models today.
The first units started shipping in June and featured mid-range models with i5 processors, priced between $999 and $1,299 (not including the $130 removable keyboard). Now the other units are available, as previously indicated. The unit with an i3 processor is priced at $799 for the device, while the i7 models, targeted at those with high-performance requirements such as those working in Adobe Photoshop or computer aided design-type applications, will run you quite a bit more. A Surface Pro 3 with a 256GB solid-state drive and 8GB of RAM costs $1,559. If you want to go whole-hog, one with a 512GB SSD will set you back $1,949.
Microsoft also said the $200 docking station for the Surface Pro 3 is on pace to ship in two weeks. The dock will have five USB ports (three USB 3.0 and two USB 2.0), an Ethernet port and a Mini DisplayPort that supports resolutions up to 3840 x 2600.
If you missed it, here's my first take on the mid-range Surface Pro 3. Have you looked at the Surface Pro 3? What's your take? Is it a device you're considering or are you sticking with a device from one of Microsoft's OEM partners?
Posted by Jeffrey Schwartz on 08/01/2014 at 11:39 AM
In what appears to be its latest effort to rein in large U.S. tech companies expanding their presence in China, government investigators in China this week raided Microsoft offices throughout the country, according to several reports.
The raids by China's State Administration for Industry and Commerce included offices in Beijing, Shanghai, Guangzhou and Chengdu, according to a report Monday in The New York Times, citing a spokeswoman who declined to elaborate due to the sensitivity of the issue. Another Microsoft spokesman told The Wall Street Journal that the company's business practices are designed to comply with Chinese law.
The raids, which accuse Microsoft of monopolistic practices and other, less clearly specified violations, are the latest salvo in tensions between the two countries over the past few years, recently escalated by spying, malware attacks and hacking allegations. The move could also be retaliation by the Chinese government following indictments by the U.S. government in May that charged five of China's Army officers with cyber attacks, The New York Times added.
The raids follow a visit to China last week by Qualcomm CEO Steven Mollenkopf, according to the report, which said he held talks with government officials and announced a $150 million "strategic venture fund" to invest in Chinese technology start-up companies, though it's unclear whether the visit sparked the escalation against Microsoft.
Microsoft, which, like many U.S. corporations, has identified China as one of its largest growth markets, is not the only company in the country's crosshairs these days. Government officials have also started to examine Chinese banks' reliance on IBM mainframes and servers, though Big Blue has agreed to sell its x86 System x server group to Lenovo for $2.3 billion. Apple, Cisco and Google have also faced heightened scrutiny, according to reports.
Approximately 100 investigators raided Microsoft's offices in China, according to The Wall Street Journal, which reported Tuesday that China's State Administration for Industry and Commerce vaguely accused Microsoft of not disclosing certain security features and how the company integrates its various products. China's government, through state-controlled news outlet CCTV, has raised concerns about the security of Windows 8 and also has deemed Apple's iPhone as a danger to the country's national security.
Disclosures by Edward Snowden of the surveillance efforts have also escalated concerns by China's government, several reports noted. Yet some wonder if the moves are more about protectionism of companies based in China than concerns about security.
Posted by Jeffrey Schwartz on 07/30/2014 at 12:37 PM
Hedge fund holding company Elliott Management has taken a $1 billion stake in storage giant EMC, according to a report in The Wall Street Journal last week. The activist investor, with an estimated $25 billion under management, is reportedly looking for the company to spin off VMware and Pivotal.
EMC has long opposed spinning off VMware, a move that is often floated. It most recently came up in late May, when two Wells Fargo analysts advocated the spinoff. Elliott is among those impatient with EMC's stock performance, which has risen only 163 percent over the past decade, compared with the Dow Jones U.S. Computer Hardware Index's 309 percent gain.
There's no reason to believe EMC CEO Joe Tucci would now welcome such a move, according to the report, which noted his retirement is planned for next year. Having VMware and Pivotal operate as separate companies in which EMC holds a majority stake lets EMC pursue its "federation strategy," under which EMC, VMware, Pivotal and RSA can work together or individually with fewer competitive conflicts. An EMC spokesman told The Journal the company always welcomes discussions with shareholders but had no further comment. Elliott Management did not respond to an inquiry.
Freeing EMC of those companies could make it a potential takeover target for the likes of Cisco, Hewlett-Packard or Oracle, the report noted, given the 31 percent decline in EMC's share of the external storage market. EMC's stake in VMware is approximately 80 percent.
Elliott Management has taken stakes in many companies, including BMC, Juniper and EMC rival NetApp. It also made a $2 billion bid to acquire Novell in 2010, which Novell rejected; Novell was later acquired by Attachmate.
Posted by Jeffrey Schwartz on 07/28/2014 at 12:29 PM
The release of the Amazon Fire Phone this week adds another wildcard to the smartphone race largely dominated by Apple's iPhone and devices based on Google's Android OS. With Windows Phone in a distant but solid third place in market share, it remains to be seen whether it's too late for anyone to make a serious dent in the market.
Microsoft must believe the new Amazon Fire Phone has a chance of gaining at least some share, and as a result it is offering its Skype and OneNote apps in the Amazon App Store for Android right out of the gate. Even if the Windows Phone team doesn't see the Amazon Fire Phone as a threat, Microsoft also seems to realize that the new phone's success or failure won't rest on adding apps such as Skype and OneNote. And if the Amazon Fire Phone should prove a wild success, Microsoft, which has already acknowledged that we're no longer in an all-Windows world, likely doesn't want its apps and services to be supplanted by others. This move is not without precedent: Microsoft has offered Amazon Kindle apps in the Windows Store since the launch of Windows 7 back in 2009, and Amazon offers Windows Server and SQL Server instances in its Amazon Web Services public cloud.
If any new entrant can gain share in the mobile phone market, it's Amazon. Even Facebook, which reportedly floated the idea of offering its own phone, appears to have realized that a seemingly single-purpose device was a risky bet. Amazon has more to gain -- its phone is inherently designed to let people buy goods from its retail store. It even lets consumers capture an image of an item in a store and see if they can get it cheaper, which, more often than not, they can. And customers who have an Amazon Prime account, which all new phone buyers get for a full year, can purchase goods with the phone.
Amazon already has experience in the device market with its Kindle Fire tablets, which hold respectable but not dominant share, and it recently launched its own TV set-top box. The Amazon Fire Phone, like the Kindle Fire tablets, runs a forked version of Android that doesn't support apps from the Google Play store. The new phone introduces some interesting features, including the ability to flip pages in a calendar and other apps by moving your hand rather than touching the screen. It comes standard with 32GB of storage and has the ability to recognize products.
Critics argue that while some of these "whiz bang" features, along with appealing touches like unlimited photo storage in the cloud, are nice, the phone has a relatively basic design and form factor. While I haven't seen the phone myself, it appears it will appeal most to those who use the Amazon Prime service extensively. As an Amazon Prime subscriber myself, I'll settle for using whatever apps Amazon offers for iOS or Windows Phone when selecting my next phone.
Amazon has shown a willingness to break even or even sell products at a loss if it will further its overall business goals. On the other hand, with yesterday's disappointing earnings report, in which its net loss of 27 cents per share was substantially higher than the 16-cent loss analysts expected, it remains to be seen how long Amazon can sustain losses, especially in new segments. Investor pressure aside, CEO Jeff Bezos shows no sign of backing off on his company's investments in distribution centers for its retail operation, datacenter capacity for the Amazon Web Services public cloud and now its entry into the low-margin phone business.
What's your take on the Amazon Fire Phone? Will it be a hot commodity or will it vindicate Facebook for choosing not to enter the market?
Posted by Jeffrey Schwartz on 07/25/2014 at 10:18 AM
Veeam this week released a new management tool to provide visibility into Hyper-V as well as VMware environments. While previous versions of the Veeam Management Pack supported only VMware, the new v7 release provides common visibility into and management of Hyper-V, Microsoft's hypervisor, as well. Administrators can use the Veeam Management Pack from within System Center Operations Manager.
Veeam demonstrated a near-final release at the TechEd conference in Houston back in May, alongside other System Center Operations Manager management packs. Within System Center Operations Manager, the new Veeam Management Pack version 7 for System Center offers a common dashboard that provides monitoring, capacity planning and reporting for organizations using Veeam Backup & Replication.
With the new management pack, System Center Operations Manager administrators can manage both their vSphere and Hyper-V environments together with complete visibility to physical and virtual components and their dependencies. In addition to offering deeper visibility into both hypervisors within a given infrastructure, the new Veeam Management Pack provides contextual views using color-coded heat maps for viewing various metrics and it provides real-time data feeds.
It also lets administrators manage the Veeam Backup & Replication for Hyper-V platform to determine if and when a host or virtual machine (VM) is at risk of running out of storage capacity, Doug Hazelman, the company's vice president of product strategy, said during a meeting at TechEd. "We provide views on networking, storage, heat maps -- the smart analysis monitors, as we call them," Hazelman said. "This is something you don't see in general in System Center."
If memory pressure is too high on a specific VM, the Veeam Management Pack can analyze the environment -- host metrics, the properties of the VM, whether it's configured with too little memory, or whether the host has exhausted its resources -- and provide a dynamic recommendation. While administrators typically default to the Windows Task Manager to gauge utilization of CPU, memory and other common resources on a physical server, Hazelman pointed out that the common utility isn't designed to do so for VMs. The Veeam Task Manager addresses that.
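The triage Hazelman describes -- check the VM's own configuration first, then the host's capacity, before recommending a fix -- can be sketched in a few lines. This is purely illustrative: the thresholds, field names and recommendation strings below are assumptions for the example, not Veeam's actual logic or API.

```python
# Illustrative sketch of memory-pressure triage for a VM, loosely modeled on
# the analysis described above. All thresholds and names are assumed.

def recommend(vm_assigned_mb, vm_demand_mb, host_total_mb, host_used_mb):
    """Return a dynamic recommendation for a VM under memory pressure."""
    vm_pressure = vm_demand_mb / vm_assigned_mb   # >1.0: VM wants more than it has
    host_free_mb = host_total_mb - host_used_mb

    if vm_pressure <= 1.0:
        return "no action: VM memory demand is within its assignment"
    if host_free_mb >= (vm_demand_mb - vm_assigned_mb):
        return "increase the VM's assigned memory: the host has spare capacity"
    return "migrate the VM or add host memory: the host is exhausted"

# A VM demanding 6GB against a 4GB assignment, on a host with room to spare:
print(recommend(vm_assigned_mb=4096, vm_demand_mb=6144,
                host_total_mb=65536, host_used_mb=40000))
```

The point of the sketch is the ordering: a per-VM configuration fix is cheaper than a host-level one, so the host-exhausted recommendation is only reached when the simpler remedy is ruled out.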
The new release will be available with two licensing options. The Enterprise Plus Edition provides complete real-time forecasting, monitoring and management of the entire infrastructure including clusters and the complete virtual environment. It's available as a free update to existing Veeam Management Pack 6.5 customers.
A new lower-end Enterprise Edition is a scaled-down release that provides management and monitoring but not the full level of reporting of the Enterprise Plus version. The company is offering 100 cores of the new Enterprise Edition free of charge, including one year of maintenance. The offer is available through the end of this year.
Posted by Jeffrey Schwartz on 07/25/2014 at 3:20 PM
Longtime Redmond magazine columnists Don Jones and Greg Shields have joined online IT training firm Pluralsight, where they will provide courses for IT administrators.
The two will continue to write their respective Decision Maker and Windows Insider columns for Redmond magazine and contribute other content to the publication's Web site. They will also continue to present at the TechMentor and Live! 360 conferences, which, like Redmond magazine, are produced by 1105 Media.
Pluralsight announced the hiring of Jones and Shields on Wednesday. "As thought leaders, Don and Greg are the cream of the crop in their field, bringing the kind of experience and expertise that will add immense richness to Pluralsight's IT offering," said Pluralsight CEO Aaron Skonnard in a statement. Both Jones and Shields are Microsoft MVPs and VMware vExperts, and Shields is also a Citrix Technology Professional (CTP).
The move means they are leaving the boutique consulting firm they created, Concentrated Technology. Jason Helmick will continue to provide PowerShell training for Concentrated. Helmick, Jones and Shields will be presenting at next month's TechMentor conference, to be held on the Microsoft campus in Redmond, Wash.
Jones' and Shields' Redmond columns cover the gamut of issues that relate to Windows IT professionals. Some columns offer hands-on tips, like a recent one on how to troubleshoot with Microsoft's RDS Quality Indicator and another on what's new in Group Policy settings. Other columns have led to heated debates, such as last summer's "14 Reasons to Fire Your IT Staff." Jones' recent Decision Maker columns have explained how organizations can create a culture of security.
Posted by Jeffrey Schwartz on 07/23/2014 at 2:11 PM
Despite seeing its profits shrink thanks to its acquisition of Nokia, Microsoft on Tuesday reported a nice uptick in its core business lines -- notably its datacenter offerings -- and strong growth for its cloud services including Office 365 and Azure.
CEO Satya Nadella appeared in his second quarterly call with analysts to discuss Microsoft's fourth quarter earnings for fiscal year 2014. The company exceeded its forecasts for Office 365 subscription growth and saw double-digit gains across its enterprise server lines.
One of the key questions among analysts is what the future holds for Windows and for the company's struggling phone business, a challenge the Nokia acquisition has only exacerbated. Nadella underscored that bringing a common Windows core across all device types, including phones, tablets, PCs, Xbox and embedded systems, will strengthen Microsoft's push into mobility, as well as the cloud. This is the notion of what Microsoft described to partners last week as the next wave of Windows, which will come in different SKUs but will be built on a common platform -- what Nadella described as "one" Windows that supports "universal" apps.
"The reality is we actually did not have one Windows," Nadella said on Tuesday's call. "We had multiple Windows operating systems inside of Microsoft. We had one for phone, one for tablets and PCs, one for Xbox, one for even embedded. Now we have one team with a layered architecture that enables us to, in fact, for developers, bring [those] collective opportunities with one store, one commerce system, one discoverability mechanism. It also allows us to scale the UI across all screen sizes. It allows us to create this notion of universal Windows apps."
Responding to an analyst question about what it will take to incent developers to build not just for Apple's iOS and Google's Android but also for Windows Phone and Windows-based tablets, Nadella said he believes this concept of "dual use" -- in which people use their devices for work and their personal lives -- will make it attractive for reluctant developers.
Now that Microsoft has brought all of the disparate Windows engineering teams together into one organization, Nadella said that when the next version of Windows comes out next year, it will allow customers to use even their core desktop apps on any device. He's betting that application portability will make it easier and more economical for developers to build more apps for Windows.
"The fact that even an app that runs with a mouse and desktop can be in the store and have the same app in a touch-first, in a mobile-first way, gives developers the entire volume of Windows, which you see on a plethora of units as opposed to just our 4 percent share of mobile in the U.S. or 10 percent in some countries," Nadella said. "That is the reason why we are actually making sure that universal Windows apps are available and developers are taking advantage of it. We have great tooling. That's the way we are going to be able to create the broadest opportunity to your very point about developers getting an ROI for building for Windows."
Yet between the lines, the fact that Nadella two weeks ago called Microsoft a "productivity and platforms" company rather than using the previous "devices and services" descriptor suggests that the emphasis of Windows is a common platform tied to Office and OneDrive. Microsoft's goal is that this will allow users to do their work more easily, while making it easy for them to use their devices for their personal activities without the two crossing paths. And the most likely way to succeed is to ensure developers who have always built for the Windows platform continue to do so.
Posted by Jeffrey Schwartz on 07/23/2014 at 11:26 AM
After a protracted decline in PC sales, Intel last week said that enterprises of all sizes are refreshing their portable and desktop computers. In its second quarter earnings report, Intel said PC shipments rose for the third consecutive quarter. While the company acknowledged that the end of life of Windows XP has helped fuel the revival, the company appears optimistic the trend will continue.
Though company officials didn't give specific guidance for future quarters, the company is optimistic that the pending delivery of new systems based on its 14nm Broadwell processor will propel demand in the coming quarters. The smaller CPU is expected to enable systems that are lighter and offer better battery life.
Intel said its PC group's revenues of $8.7 billion represented a 6 percent increase over the same period last year and a jump over the prior quarter. The second quarter Intel reported last week covers the period when Microsoft officially stopped releasing regular patches for its Windows XP operating system.
"The installed base of PCs that are at least four years old is now roughly 600 million units and we are seeing clear signs of a refresh in the enterprise in small and medium businesses," said Intel CEO Brian Krzanich, during the company's earnings call. "While there are some signs of renewed consumer interest and activity, the consumer segment remains challenging, primarily in the emerging markets."
Krzanich was particularly optimistic about the newest ultramobile systems that will arrive from the 14nm Llama Mountain reference design, which he said will result in fanless, detachable two-in-one systems that are 7.2mm thin and weigh 24 ounces. OEMs demonstrated some of these new systems at the recent Computex show in Taipei. Microsoft showcased many new Windows PCs in the pipeline at last week's Worldwide Partner Conference in Washington, D.C.
Posted by Jeffrey Schwartz on 07/21/2014 at 1:41 PM
Microsoft moved quickly after last week's acquisition of InMage Systems to say that the InMage Scout software appliances for Windows and Linux physical and virtual instances will be included in its Azure Site Recovery subscription licenses.
Azure Site Recovery, a service announced at Microsoft's TechEd conference in Houston in May, is the rebranded Hyper-V Recovery Manager. Azure Site Recovery, unlike its predecessor, allows customers to use the Microsoft Azure public cloud as a backup target rather than requiring a second datacenter. InMage Scout is an on-premises appliance that continuously captures data changes in real time as they occur, then simultaneously performs local backups or remote replication via a single data stream.
With the addition of InMage Scout subscription licenses to the Azure Site Recovery service, Microsoft said customers will be able to purchase annual protection for instances with the InMage Scout offering. Microsoft is licensing Azure Site Recovery on a per virtual or physical instance basis. The service will be available for customers with Enterprise Agreements on Aug. 1. For now, that's the only way Microsoft is letting customers purchase InMage Scout, though it's not required for Azure Site Recovery. The InMage Scout software is available for use on a trial basis through Aug. 1 via the Azure portal.
The company also quietly removed the InMage-4000 appliance, a converged system with compute, storage and network interfaces, from the portfolio. The InMage-4000 was available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity. It supports 10GigE storage networking and built-in GigE Ethernet connectivity. Though it was on the InMage Web site last Friday, by Sunday it was removed.
A Microsoft spokeswoman confirmed the company is no longer offering the turnkey appliance but hasn't ruled out offering a similar type of system in the future.
Posted by Jeffrey Schwartz on 07/18/2014 at 11:55 AM
Microsoft this morning announced what was widely rumored -- it will kick off the largest round of layoffs in the company's history. The company will reduce its workforce by 18,000 employees -- much greater than analysts had anticipated. More than two thirds of them -- 12,500 -- will affect workers in its Nokia factories with the rest impacting other parts of the company. The layoffs are aimed at creating a flatter and more responsive organization.
CEO Satya Nadella announced the job cuts just one week after indicating that major changes were in the works. Just yesterday, in his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C., Nadella reiterated that the company must change its culture and be easier to do business with. Since Microsoft announced its intent to acquire Nokia last year for $7.2 billion, critics were concerned it would be a drag on the company's earnings. Nadella clearly signaled he is moving to minimize its impact.
The larger-than-expected reductions from Nokia are the result of a plan to integrate that company's operations into Microsoft, Nadella said in an e-mail to employees announcing the job cuts, 13,000 of which will take place over the next six months. "We will realize the synergies to which we committed when we announced the acquisition last September," Nadella said. "The first-party phone portfolio will align to Microsoft's strategic direction. To win in the higher price tiers, we will focus on breakthrough innovation that expresses and enlivens Microsoft's digital work and digital life experiences. In addition, we plan to shift select Nokia X product designs to become Lumia products running Windows. This builds on our success in the affordable smartphone space and aligns with our focus on Windows Universal Apps."
Nadella also said that Microsoft is looking to simplify the way employees work by creating a more agile structure that can move faster than it has and make workers more accountable. "As part of modernizing our engineering processes the expectations we have from each of our disciplines will change," he said, noting there will be fewer layers of management to accelerate decision making.
"This includes flattening organizations and increasing the span of control of people managers," he added. "In addition, our business processes and support models will be more lean and efficient with greater trust between teams. The overall result of these changes will be more productive, impactful teams across Microsoft. These changes will affect both the Microsoft workforce and our vendor staff. Each organization is starting at different points and moving at different paces."
The layoffs don't mean Microsoft won't continue to hire in areas where the company needs further investment. Nadella said he would share more specific information on the technology investments Microsoft will make during its earnings call scheduled for July 22.
Posted by Jeffrey Schwartz on 07/17/2014 at 7:14 AM
In what Apple and IBM describe as a "landmark" partnership, the two companies have forged a deal to bring 100 industry-specific, enterprise-grade iOS apps to market and provide cloud services such as security, analytics and mobile integration for iPads and iPhones. The pact also calls for the two companies to offer AppleCare support for enterprises, with IBM handling device activation, supply and management.
This broad partnership is a significant arrangement for both companies in that it will help IBM advance its cloud and mobility management ambitions, while Apple will gain its largest foothold to date in the enterprise. It bears noting that Apple rarely forms such partnerships, preferring to go it alone. At the same time, the buzz generated by this partnership, though noteworthy, may overstate the impact it will have. The harm to Android and Windows also appears marginal.
To date, Apple has benefited from the BYOD movement of the past few years, and that's predominantly why so many iPads and Android-based tablets and smartphones are used by employees. While there's no shortage of enterprise mobile device management platforms to administer the proliferation of user-owned devices, Apple is hoping IBM's foothold in the enterprise, its strong bench of developer tools and its growing cloud infrastructure will lead to more native apps and software-as-a-service offerings.
"This alliance with Apple will build on our momentum in bringing these innovations to our clients globally," said Ginni Rometty, IBM chairman, president and CEO, in a statement. "For the first time ever we're putting IBM's renowned big data analytics at iOS users' fingertips, which opens up a large market opportunity for Apple," added Apple CEO Tim Cook. "This is a radical step for enterprise and something that only Apple and IBM can deliver."
While the pact certainly will give IBM more credibility with its customers, its benefit to Apple appears marginal, which is why the company's stock barely budged on the news last night. "We do not expect the partnership to have a measurable impact on the model given that Apple has already achieved 98 percent iOS penetration with Fortune 500 companies and 92 percent penetration with Global 500 companies," said Piper Jaffray analyst and known Apple bull Gene Munster in a research note. "While we believe that the partnership could strengthen these existing relationships, we believe continued success with the consumer is the most important factor to Apple's model."
The Apple-IBM partnership certainly won't help Microsoft's efforts to keep its Windows foothold intact, which is already under siege. On the other hand, it's a larger threat to Android than to Windows. The obvious reason is that Android has more to lose with a much larger installed base of user-owned tablets. Even if the number of combined tablets and PCs running Windows drops to 30 percent by 2017, as Forrester Research is forecasting, enterprises still plan to use Windows for business functions because of its ability to join Active Directory domains and its ties to Windows Server, SharePoint, Office and the cloud (including OneDrive and Azure).
"It makes it more challenging for Windows Phone to gain ground in the enterprise, because IBM bolsters Apple's hardware in the enterprise, for both sales/support and enterprise apps," said Forrester analyst Frank Gillett. "And that indirectly makes it harder for Windows PCs to stay strong also, but that's incremental."
Pund-IT analyst Charles King sees this deal having a grimmer effect on Microsoft. "Microsoft is in the most dangerous position since the company is clearly focusing its nascent mobile efforts on the same organizations and users as IBM and Apple," he said in a research note. The partnership was announced at an inopportune time for Microsoft: the company is rallying its partners around Windows, among other things, at its Worldwide Partner Conference in Washington, D.C., where Microsoft has talked up its commitment to advance Windows into a common platform for devices of all sizes, from phones to large-screen TVs. "The goal for us is to have them take our digital work-life experiences and have them shine," Microsoft CEO Satya Nadella said in the keynote address at WPC today.
While Apple and IBM described the partnership as exclusive, terms were not disclosed. Therefore it's not clear what exclusive means. Does that mean Apple can't work with other IT players? Can IBM work with Google and/or Microsoft in a similar way? At its Pulse conference in Las Vegas back in February, IBM said it would offer device management for Windows Phone devices through its recently acquired MaaS360 mobile device management platform.
Also, while Apple may have broken new ground with its IBM partnership, Microsoft has made a number of arrangements with providers of enterprise and vertical applications to advance the modern Windows platform, among them Citrix, Epic, SAP, Autodesk and Salesforce.com (with Salesforce1 available for Windows apps this fall).
Munster predicted that if half the Fortune 500 companies bought 2,000 iPhones and 1,000 iPads beyond what they were already planning to purchase as a result of this deal, it would translate to half of one percent of revenue in the 2015 calendar year. In addition, he believes IBM will offer similar solutions for Android. Even if Munster is underestimating the impact this deal will have on Apple, there's little reason to believe this pact will move the needle significantly, if at all, for Windows. The fate of Windows is in Microsoft's hands.
Posted by Jeffrey Schwartz on 07/16/2014 at 12:35 PM
It's no secret that big changes are coming to Microsoft. CEO Satya Nadella made that clear in his 3,100-word memo to employees late last week. The key takeaways of that message were that Microsoft is now a platforms and productivity company and it intends to become leaner in a way that it can bring products to market faster.
While the new platforms-and-productivity mantra doesn't mean Microsoft is doing away with its old devices-and-services model, Nadella is trying to refocus on what Microsoft is all about, and he's sticking with a strong cloud and mobile emphasis.
The latter sounds like Nadella wants to accelerate the One Microsoft strategy introduced last year and, reading between the lines, break down the company's silos and fiefdoms. In his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C. today, Nadella said the company intends to change its culture.
"We change the core of who we are in terms of our organization and how we work and our value to our customers," Nadella said. "That's the hardest part really. The technology stuff is the simpler thing. We all know that but we need to move forward with the boldness that we can change our culture. It's not even this onetime change, it's this process of continuous renewal that [will] succeed with our customers."
Nadella's push comes amid unease in the Microsoft ranks. Rumors persist that Microsoft is planning layoffs. The move wouldn't be unexpected, given that Microsoft inherited 25,000 new employees from its $7.2 billion acquisition of Nokia. According to a Reuters report, Microsoft is planning to lay off 1,000 of those employees based in Finland. The layoffs are expected to be the most extensive in Microsoft's history, according to a Bloomberg report.
To date, Nadella has not indicated any planned cutbacks, but he appears well aware that Microsoft needs to rid itself of those who aren't on board with the company's new way of doing business.
Nadella said Microsoft needs to "enable the employees to bring their A game, do their best work, find deeper meaning in what they do. And that's the journey ahead for us. It's a continuous journey and not an episodic journey. The right way to think about it is showing that courage in the face of opportunity."
Posted by Jeffrey Schwartz on 07/16/2014 at 12:40 PM
In his annual address to partners, Microsoft COO Kevin Turner said the company will not provide any government access to customer data. Microsoft will fight any requests by a government to turn over data, Turner told 16,000 attendees at the company's annual Worldwide Partner Conference, which kicked off today in Washington, D.C.
"We will not provide any government with direct unfettered access to customers' data. In fact we will take them to court if necessary," said Turner. "We will not provide any government with encryption keys or assist their efforts to break our encryption. We will not engineer backdoors in the products. We have never provided business or government data in response to a national security order. Never. And we will contest any attempt by the U.S. government or any government to disclose customer content stored exclusively in another place. That's our commitment."
Microsoft will notify business and government customers when it does receive legal orders, Turner added. "Microsoft will provide governments the ability to review our source code, to reassure themselves of its integrity and confirm no backdoors," he said.
The remarks were perhaps the best received of his one-hour speech, which also covered the progress Microsoft has made for its customers in numerous areas, including the success of Office 365 and Azure, virtualization gains over VMware, and business intelligence, including last year's boost to SQL Server and the release of Power BI with its new push into machine learning. While Microsoft General Counsel Brad Smith has issued a variety of blog posts providing updates and assurances that the company will protect customer data, Turner's public remarks stepped up the tenor of Microsoft's position on the matter.
While Turner did not address former NSA contractor Edward Snowden by name, his comments were a firm and public rebuke to accusations last year that Microsoft provided backdoors to the government. Turner acknowledged that despite its 12-year-old Trustworthy Computing initiative, its Security Development Lifecycle and a slew of other security efforts, Microsoft needs to (and intends to) emphasize security further. "When you think about the cyber security issues, there's never been an issue like this past year," Turner said. "It is a CEO-level decision and issue."
Turner talked up Microsoft's existing efforts, including its ISO standard certifications, operational security assurance, Windows Defender, Trusted Platform Module, BitLocker and various point products. He also played up the company's higher-level offerings such as assessments, threat detection and response services and its digital crimes unit.
Microsoft has other security offerings and/or efforts in the pipeline, Turner hinted. "We will continue to strengthen the encryption of customer data across our network and services," he said. "We will use world-class cryptography and best-in-class cryptography to do so."
Posted by Jeffrey Schwartz on 07/14/2014 at 2:47 PM
If you thought Microsoft was looking to disrupt the storage landscape earlier this week when it launched its Azure StorSimple appliances, the company has just upped the ante. Microsoft is adding to its growing storage portfolio with the acquisition of InMage, a San Jose, Calif.-based provider of converged disaster recovery and business continuity infrastructure that offers continuous data protection (CDP). Terms weren't disclosed.
InMage is known for its high-end Scout line of disaster recovery appliances. The converged systems are available in various configurations with compute, storage and network interfaces. Its InMage-4000 is available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity, and it supports 10GigE storage networking along with built-in Gigabit Ethernet connectivity.
Over time InMage will be rolled into the Microsoft Azure Site Recovery service to add scale to the company's newly added disaster recovery and business continuity offering. Microsoft had earlier announced plans to enable data migration to Azure with Scout, InMage's flagship appliance.
"InMage Scout continuously captures data changes in real time as they occur and performs local backup or remote replication simultaneously with a single data stream," a description on the company's Web site explained. "It offers instantaneous and granular recovery of data locally and enables push-button application level failovers to remote sites to meet local backup and/or remote DR requirements, thus going above and beyond the protection offered by conventional replication backup and failover automation products alone."
It also collects data from production servers in real time, capturing changes in memory before they are written to disk and moving the data to the InMage Scout Server. This eliminates any added I/O load from the backup or replication process. It also has built-in encryption, compression and WAN acceleration, and it supports backups of Hyper-V, VMware ESX and Xen virtual machines.
The Scout portfolio also protects Linux and various Unix environments, and the company offers specialized appliances for Exchange Server, SAP, Oracle, SQL Server, SharePoint, virtualization and data migration.
"Our customers tell us that business continuity -- the ability to backup, replicate and quickly recover data and applications in case of a system failure -- is incredibly important," said Takeshi Numoto, Microsoft's corporate VP for cloud and enterprise marketing, in a blog post announcing the acquisition.
These products don't overlap. StorSimple offers primary storage with Azure as a tier, while InMage offers disaster recovery using the cloud or a secondary site as a target.
Posted by Jeffrey Schwartz on 07/11/2014 at 12:23 PM
When it comes to enterprise storage, companies such as EMC, Hewlett Packard, Hitachi, IBM and NetApp may come to mind first. But there are a lot of new players out there taking a piece of the pie. Among them are Fusion-io, GridStore, Nimble Storage, Nutanix, Pure Storage, SolidFire and Violin Memory, just to name a few high fliers. Another less obvious but potentially emerging player is Microsoft, which acquired storage appliance maker StorSimple in 2012.
As I noted a few weeks ago, Microsoft is aiming at commoditizing hardware with software-defined storage. In recent months Microsoft has also indicated it has big plans for making StorSimple a key component of its software-defined storage strategy, which of course includes Storage Spaces in Windows Server. Microsoft this week announced it is launching the Azure StorSimple 8000 Series, which consists of two arrays that offer tighter integration with the Microsoft Azure public cloud.
While Microsoft's StorSimple appliances always offered links to the public cloud, the new Azure StorSimple boxes with disks and flash-based solid-state drives use Azure Storage as an added tier of the storage architecture, enabling administrators to create virtual SANs in the cloud just as they do on premises. Using the cloud architecture, customers can allocate more capacity as needs require.
"The thing that's very unique about Microsoft Azure StorSimple is the integration of cloud services with on-premises storage," said Marc Farley, Microsoft's senior product marketing manager for StorSimple, during a press briefing this week to outline the new offering. "The union of the two delivers a great deal of economic and agility benefits to customers."
Making the new offering unique, Farley explained, are two new integrated services: the Microsoft Azure StorSimple Manager in the Azure portal and the Azure StorSimple Virtual Appliance. "It's the implementation of StorSimple technology as a service in the cloud that allows applications in the cloud to access the data that has been uploaded from the enterprise datacenters by StorSimple arrays," Farley explained.
The StorSimple 8000 Series lets customers run applications in Azure that access snapshot virtual volumes matching the VMs on the on-premises arrays. It supports Windows Server and Hyper-V as well as Linux and VMware-based virtual machines. However, unlike earlier StorSimple appliances, the new offering connects only to Microsoft Azure -- not to other cloud service providers such as Amazon Web Services. Farley didn't rule out future releases enabling virtual appliances in other clouds.
The aforementioned new StorSimple Manager consolidates the management and views of the entire storage infrastructure consisting of the new arrays and the Azure Virtual Appliances. Administrators can also generate reports from the console's dashboard, letting them reallocate storage infrastructure as conditions require.
Farley emphasized that the new offering is suited for disaster recovery, noting it offers "thin recoveries." Data stored on the arrays in the datacenter can be recovered from copies of the data stored in the Azure Virtual Appliances.
The arrays support iSCSI connectivity as well as 10Gb/s Ethernet and inline deduplication. When using the Virtual Appliance, administrators can see file servers and create a virtual SAN in the Azure cloud. "If you can administer a SAN on-premises, you can administer the virtual SAN in Azure," Farley said.
Microsoft is releasing two new arrays: the StorSimple 8100, which has 15TB to 40TB of capacity (depending on the level of compression and deduplication implemented) and the StorSimple 8600, which ranges from 40TB to 100TB with a total capacity of 500TB when using Azure Virtual Appliances.
The StorSimple appliances are scheduled for release next month. Microsoft has not disclosed pricing, but said the per-GB price will be higher than that of the Microsoft Azure Blob storage offering once bandwidth and transaction costs are taken into account.
Posted by Jeffrey Schwartz on 07/09/2014 at 1:50 PM
A study commissioned by VMware finds enterprise users "overwhelmingly" prefer Macs over Windows PCs. According to the survey of 376 IT professionals conducted by Dimensional Research, 71 percent of enterprises now support Macs and 66 percent have employees who use them in the workplace.
VMware, which of course has a vested interest in the demise of traditional Windows PCs in the enterprise, didn't ask to what extent Macs are deployed within respondents' organizations. While the share of Macs in use overall has increased over the years, IDC reports that it dropped slightly to 10.1 percent last quarter from 11 percent a year earlier. That may, however, reflect the overall decline in PC hardware sales over the past year. Nevertheless, with more employees using their personal Macs at work and execs often preferring them over PCs, their presence in the workplace continues to rise.
Consequently, VMware is asserting that the findings show the dominance of Windows is coming to an end. "For companies, the choice is very clear -- they need to respond to end-user demand for Macs in the enterprise or they will find it difficult to recruit and retain the best talent on the market," said Erik Frieberg, VMware's VP of marketing for End-User Computing in a blog post last week. "They also need to provide IT administrators the tools to support a heterogeneous desktop environment. Otherwise there will be disruption to the business."
Despite user preference, the VMware study shows that 39 percent of the IT pros surveyed believe Macs are more difficult to support and 75 percent don't believe they are any more secure. "While employees clearly prefer Macs, there are challenges from an IT perspective that Macs must overcome before they can replace Windows PCs in the enterprise," Frieberg noted.
Exacerbating the challenge, 47 percent said only some applications that employees need to do their jobs run on Macs and 17 percent report none of their apps can run on Macs.
That trend is good news for Parallels, whose popular Parallels Desktop for Mac allows Windows to run as a virtual machine on Macs. I happened to catch up with Parallels at TechEd in Houston in May, where the company also announced a management pack for Microsoft's System Center Configuration Manager (SCCM). The new tool gives admins full visibility of Macs on an enterprise network, Parallels claims. In addition to network discovery, it includes a self-service application portal and an OS X configuration profile editor, and it can enable FileVault 2 encryption. The management pack can also deploy packages and prebuilt OS X images as well as configuration files.
VMware naturally sees these findings as lending credibility to its desktop virtualization push, including its Fusion Professional offering, which lets IT create virtual desktops for PCs and Macs, as well as its Horizon desktop-as-a-service offerings.
The survey also found that less than half (49 percent) of respondents unofficially support user-owned PCs, while 27 percent have official policies in place. The remaining 24 percent don't support user-owned PCs at all.
Are you seeing the rise of Macs in your organization? If so, would you say an invasion of Macs in the enterprise is coming? How are you managing Macs within your shop?
Posted by Jeffrey Schwartz on 07/07/2014 at 9:30 AM
CA Technologies on Monday said it is selling off its Arcserve data protection software business to Marlin Equity Partners, whose holdings include Changepoint, Critical Path, Openwave, Tellabs and VantagePoint.
The new company will take on the Arcserve name. Mike Crest, the current general manager of CA's Arcserve business, will become CEO of the new company. Terms of the deal, expected to close at the end of this calendar quarter, were not disclosed.
Developed more than two decades ago by Cheyenne Software, which CA acquired in 1996, Arcserve has a following of enterprise customers who use it to protect mission-critical systems.
CA just released the latest version of Arcserve Unified Data Protection (UDP), which is available as a single offering for Linux, Unix and Windows systems. It includes extended agentless protection for Hyper-V and VMware virtual environments. However, the backup and recovery market has become competitive, and CA has been divesting a number of businesses since its new CEO, Mike Gregoire, took over last year.
Marlin has $3 billion in capital under management. "We are committed to providing the strategic and operational support necessary to create long-term value for Arcserve and look forward to working closely with CA Technologies through the transition," said Marlin VP Michael Anderson in a statement.
In a letter to partners, Chris Ross, VP for worldwide sales for CA's data-protection business, said the move will benefit Arcserve stakeholders. "Greater focus on company business functions, R&D and support will mean higher levels of service and customer satisfaction," Ross said. "Simply put, the new company will be purpose-built end-to-end for Arcserve's unique target customer segment, partner model and overall strategy."
Posted by Jeffrey Schwartz on 07/07/2014 at 9:32 AM
The preview of the next version of Windows could appear in the next few months and will have improvements for those who primarily use the traditional desktop environment for Win32-based applications, according to the latest rumors reported Monday.
"Threshold," which could be branded as Windows 9 (though that's by no means certain), will target the large audience of Windows 7 users who want nothing to do with the Windows 8.x Modern user interface, according to a report by Mary Jo Foley in her All About Microsoft blog. At the same time, Microsoft will continue to enhance the Modern UI for tablet and hybrid laptop-tablet devices.
To accomplish this, the Threshold release will have multiple SKUs. For those who prefer the classic desktop and want to run Win32 apps, one SKU will put that front and center, according to Foley. Hybrid devices will continue to support switching between the Modern UI (also referred to as "Metro") and the more traditional desktop interface. And another SKU, aimed at phones and tablets only, will not have a desktop component, which may prove disappointing to some (myself included) who use tablets such as the Dell Venue 8 Pro. At the same time, it appears that SKU will be used for some Nokia tablets and one might presume a future Surface 3 model (and perhaps for a "mini" form factor).
As previously reported, Threshold will get a new Start menu. Microsoft in April released Windows 8.1 Update, which made various improvements for keyboard and mouse users but did not include a Start menu. Foley pointed out that the mini Start menu that was demonstrated at Microsoft's Build conference in April is expected to be customizable.
The Threshold release is expected to arrive in the spring of 2015. Meanwhile, Foley also noted a second and final Windows 8.1 Update is expected to arrive next month for Patch Tuesday, though users will have the option of opting out.
Though details of how Microsoft will improve the classic Windows desktop remain to be seen, this should be welcome news to Windows 7 shops (and perhaps some Windows XP holdouts) making long-term migration and system upgrade plans. Our research has suggested all along that shops that plan to pass on Windows 8.x will consider Windows Threshold.
Microsoft said it had no comment on the report.
Posted by Jeffrey Schwartz on 07/02/2014 at 8:09 AM
Microsoft's Surface Pro 3 could benefit all types of workers looking for a laptop that they can also use as a tablet. Among them are SharePoint administrators.
As soon as the new Surface Pro 3s went on sale at Best Buy 10 days ago, Tamir Orbach, Metalogix's director of product management for its SharePoint migration products, went out and bought one. Having seen my first-look write-up last week, he reached out, wanting to share his observations on the device in general and why he believes every SharePoint administrator would benefit from having one.
Many of his customers who are SharePoint administrators tend to have a small, low-end Windows tablet or iPad alongside a heavy laptop or desktop on their desks. Orbach believes the Surface Pro 3's high resolution, light weight and the coming availability of a unit with an Intel Core i7 processor and 8GB of RAM will make the device suitable as a SharePoint administrator's only PC and tablet.
"Pretty much all of us professionals want or need both a laptop or desktop and a slate," Orbach said. "It's so light that you can carry it anywhere you want and you would barely even feel it. And the screen is big enough, the resolution is good, the functionality is powerful enough to be used as our day-to-day computer."
We chatted about various aspects of the device:
- New keyboard: The new keyboard is bigger, and we both agreed that the ability to lock it at an angle is a significant improvement over previous versions (which could only be used flat). Orbach said one downside to the new angle is that you can feel the keyboard bounce, which is true, but in my opinion it's not that bad. "I'd definitely take it over the flat one though," he said.
- Cost and configuration: Orbach bought the unit configured with a 128GB SSD and 4GB of RAM, which cost $999 plus $129 for the keyboard. A SharePoint administrator would be better off with at least the 256GB drive and 8GB of RAM, but that carries a $300 premium. For one with an i7 processor, you're up to $1,549 without the keyboard.
- Docking station: If the Surface Pro 3 becomes your only computer it would be worth adding the docking station if you have a primary work area.
If you're a SharePoint administrator or any type of IT pro, do you think the Surface Pro 3 would help you do your job better?
Posted by Jeffrey Schwartz on 06/30/2014 at 12:28 PM
I think it's great that Microsoft is now offering 1TB of capacity in its OneDrive service for Office 365, but that only makes the proverbial haystack larger, given the lack of a suitable way to find files in the new Windows 8.x modern UI.
The major drawback of using OneDrive is that it doesn't sort files by the date they were last modified -- fair to say the preferred order for most people looking for the files they're working on. Instead, the modern UI sorts them alphabetically. If you have hundreds or thousands of files in a folder, good luck finding a specific one the way you're accustomed to.
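As a trivial illustration of the ordering users expect (and not, of course, OneDrive's actual implementation), here's a Python sketch that lists files most-recently-modified first; the file names and timestamps are made up for the demo:

```python
import os
import tempfile
import time

def most_recent_first(paths):
    """Sort file paths so the most recently modified come first --
    the ordering users expect, rather than alphabetical."""
    return sorted(paths, key=os.path.getmtime, reverse=True)

# Demo: three files whose names are alphabetical but whose
# modification times are deliberately out of that order.
demo_dir = tempfile.mkdtemp()
paths = []
for name, age_seconds in [("a.docx", 30), ("b.docx", 10), ("c.docx", 20)]:
    p = os.path.join(demo_dir, name)
    open(p, "w").close()
    mtime = time.time() - age_seconds
    os.utime(p, (mtime, mtime))  # backdate the modification time
    paths.append(p)

ordered = [os.path.basename(p) for p in most_recent_first(paths)]
print(ordered)  # most recently touched first: ['b.docx', 'c.docx', 'a.docx']
```

An alphabetical sort would put a.docx first even though it's the stalest file -- exactly the problem described above.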
Sure, you can use the search interface or go to the traditional desktop (and that's what I'm forced to do). But if Microsoft wants people to use its Windows 8.x UI, wouldn't it make sense to make it easier to use?
Now if you use OneDrive for Business with SharePoint Online, it's my understanding you do have the ability to do custom sorts when using the modern UI. So why not offer the same capabilities to the core OneDrive offering? Surely if Microsoft wants to stave off cloud file service providers such as Dropbox, this would be an easy way to accomplish that.
Do you agree?
Posted by Jeffrey Schwartz on 06/27/2014 at 11:07 AM
Infrastructure systems management provider SolarWinds is extending into the Web performance monitoring market with the acquisition of Pingdom. Terms of the deal, announced last week, weren't disclosed. Pingdom's cloud-based service monitors the performance and uptime of Web servers and sites.
Web performance monitoring is a natural extension of its business and is a key requirement of those managing their infrastructure, said SolarWinds Executive VP Suaad Sait.
"Our product strategy has always been around delivering IT infrastructure and application performance management products to the market," Sait said. "We had a hole in our portfolio where we wanted to extend the capabilities for cloud-based applications and Web sites. We heard this request from our customers as well from the macro market. Instead of building it organically, we looked for a really good partner and that led to us acquiring Pingdom."
Sait said SolarWinds proactively looked for a company to acquire for two reasons. First, Web sites are becoming a critical component not only of a company's presence but of its lead generation. Second, ensuring the availability of Web-based applications from remote sites requires that they be monitored. "Pingdom rose to the top of the kind of company that fits into the market we serve but also our business model," he said.
Pingdom is a cloud-based offering and the company claims it has 500,000 users. The service monitors Web sites, DNS, e-mail servers and other infrastructure. Setup only requires a URL. The deal has already closed but Sait said the company hasn't determined its integration roadmap for Pingdom.
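Pingdom's service is of course far more elaborate (distributed probe locations, alerting, reporting), but the core of uptime monitoring can be sketched in a few lines of Python. This is an illustrative assumption about how such a probe works, not Pingdom's API:

```python
import time
import urllib.request

def is_healthy(status):
    """Treat 2xx/3xx responses as 'up'; anything else, or no response, as down."""
    return status is not None and 200 <= status < 400

def probe(url, timeout=10):
    """One availability check: returns (up, status_code, elapsed_ms).

    Note urllib raises HTTPError for 4xx/5xx responses, so those land in
    the except branch with status left as None -- still counted as down.
    """
    start = time.monotonic()
    status = None
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except Exception:
        pass
    elapsed_ms = (time.monotonic() - start) * 1000
    return is_healthy(status), status, elapsed_ms
```

A real service would run probes like this on a schedule from multiple locations and alert when several in a row fail, which matches the "setup only requires a URL" simplicity described above.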
Posted by Jeffrey Schwartz on 06/27/2014 at 11:11 AM
While Microsoft has extended the storage features in Windows Server and its Azure cloud service for years, the company is stepping up its ability to deliver software-defined storage (SDS). Experts and vendors have various opinions on what SDS is, but in effect it pools all hardware and cloud services and automates processes such as tiering, snapshotting and replication.
During a webinar yesterday presented by Gigaom Research, analyst Anil Vasudeva, president and founder of IMEX Research, compared SDS to server virtualization. "Software-defined storage is a hypervisor for storage," Vasudeva said. "What a hypervisor is to virtualization for servers, SDS is going to do it for storage. All the benefits of virtualization, the reason why it took off was basically to create the volume-driven economics of the different parts of storage, servers and networks under the control of the hypervisor."
Prominent storage expert Marc Staimer, president and chief dragon slayer of Dragon Slayer Consulting, disagreed with Vasudeva's assessment. "In general, server virtualization was a way to get higher utilization out of x86 hardware," he said. "The concept of a hypervisor, which originally came about with storage virtualization, didn't take off because what happened with storage virtualization [was] the wonderful storage systems that were being commoditized underneath a storage virtualization layer. What you're seeing today is you're commoditizing the hardware with software-defined storage."
Organizations are in the early stages when it comes to SDS. A snap poll during the webinar found that 18 percent have on-premises SDS deployed, while 11 percent have a hybrid cloud/on-premises SDS in place and 32 percent said they are using it indirectly via a cloud provider. Gigaom research director Andrew Brust, who moderated the panel discussion, cautioned that the numbers are not scientific, but participants agreed the findings are not out of line with what they're seeing.
Siddhartha Roy, principal group program manager for Microsoft (which sponsored the webinar), agreed it is the early days for SDS, especially among enterprises. "Enterprises will be a lot more cautious for the right reasons, for geopolitical or compliance reasons. It's a journey," Roy said. "For service providers who are looking at cutting costs, they will be more assertive and aggressive in adopting SDS. You'll see patterns vary in terms of percentages but the rough pattern kind of sticks."
SDS deployments may be in their early stages today but analyst Vasudeva said it's going to define how organizations evolve their storage infrastructure. "Software defined storage is a key turning point," he said. "It may not appear today but it's going to become a very massive change in our IT and datacenters and in embracing the cloud."
Both analysts agree that the earliest adopters of SDS in cloud environments, besides service providers, will be small and midsize businesses. For Microsoft, its Storage Spaces technology in Windows Server is a core component of its SDS architecture. Storage Spaces lets administrators virtualize storage by grouping commodity drives into pools, carving virtual disks from them and exposing those disks to application clusters over standard Server Message Block (SMB) 3.0.
"That end to end gives you a complete software-defined stack, which really gives you the benefit of a SAN array," Roy said. "We were very intentional about the software-defined storage stack when we started designing this from the ground up."
Meanwhile, as reported last week, Microsoft released the Azure Site Recovery preview, which lets organizations use the public cloud as an alternative to a secondary datacenter or hot site, and it has introduced Azure Files for testing. Azure Files exposes file shares using SMB 2.1, making it possible for apps running in Azure to share files between virtual machines using standard APIs such as ReadFile and WriteFile; the same shares can also be accessed via a REST interface to enable hybrid implementations.
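The two access paths can be sketched as follows. This is an illustration only: the account and share names are hypothetical, and authentication for the REST call is omitted. On a VM that has mounted the share, ordinary file I/O (which maps to ReadFile/WriteFile under the hood) is all that's needed; the REST endpoint addresses the same file by URL:

```python
import pathlib

def smb_read(mounted_share, filename):
    """On a VM with the Azure Files share mounted, plain file I/O works;
    the OS translates it to SMB operations on the share."""
    return (pathlib.Path(mounted_share) / filename).read_bytes()

def rest_url(account, share, filename):
    """The same file addressed through the Azure Files REST endpoint.
    Authentication headers or tokens are omitted from this sketch."""
    return "https://{0}.file.core.windows.net/{1}/{2}".format(
        account, share, filename)

# Hypothetical storage account "contoso" with a share named "logs":
print(rest_url("contoso", "logs", "app.log"))
# https://contoso.file.core.windows.net/logs/app.log
```

The appeal for hybrid scenarios is that on-premises or non-Windows clients can reach the share over HTTP while Azure VMs treat it as a local drive.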
Is SDS in your organization's future?
Posted by Jeffrey Schwartz on 06/25/2014 at 12:49 PM
Equinix, which operates the largest global network of colocation facilities with over 100 datacenters, plans to join the Open Compute Project and implement some of its specs by early next year. Open Compute is a consortium initiated by Facebook with a large roster of members that includes AMD, Dell, Hewlett-Packard, Intel, Microsoft, Rackspace and VMware.
Ihab Tarazi, CTO of Equinix, said the company hasn't officially announced its plan to join OpenCompute, but it probably won't come as a major surprise to observers since Facebook is one of its largest customers. Tarazi said the decision to participate goes beyond Facebook. "Our model is to support the needs of our customers," he said. "There's a whole community on OpenCompute we're going to support."
Among them is Microsoft, which also has a major partnership with Equinix, among several interconnection partners it announced at its TechEd conference last month. With Microsoft's new ExpressRoute service, the company will provide dedicated high-speed links to Equinix datacenters. Microsoft joined OpenCompute earlier this year, saying it plans to share some of its Windows Azure datacenter designs as part of that effort.
I sat down with Tarazi, who was in New York for a company investor conference. Despite jumping on the standards bandwagon, Tarazi said he agrees with comments by Microsoft Azure General Manager Steven Martin, who in a speech earlier this month said, "you have to innovate then commoditize and then you standardize."
In his speech Martin added: "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."
Tarazi concurred. "Innovation takes off faster if you are not waiting on a standard, which is what Steve was saying," he said. "As long as you are able to still deliver the service that is the best way to go. You have to sometimes go for a standard where it's impossible to deliver the service without connectivity or standard."
There's good reason for Tarazi to agree. Equinix is stitching together its own Cloud Exchange, announced in late April, with the goal of providing interconnection between multiple cloud providers. In addition to Microsoft, which has started rolling out interconnectivity to some Azure datacenters (with all planned by year's end), Cloud Exchange also connects to Amazon Web Services through its Direct Connect dedicated links.
Others announced include telecommunications providers Orange, Level 3 Communications and TW Telecom (which Level 3 agreed to acquire last week). Tarazi said the company is in discussions with all of the players that have operations in its datacenters. "We have 970-plus networks inside our datacenters," he said. "All of those connect to Microsoft in one way or another."
Though he agreed with Martin that there's a time for standards, Tarazi apparently believes that in addition to Open Compute, the time has come to support the OpenStack platform. "If you want to move workloads between them, we're going to make that very simple," Tarazi said. "Or if you want to have a single connection and get to all of them, that's really doable as well."
Tarazi said Equinix also plans to support the IETF's Network Configuration Protocol (NETCONF) and the YANG modeling language to ease device and network configuration.
Posted by Jeffrey Schwartz on 06/25/2014 at 3:06 PM
I don't make a habit of attending grand opening ceremonies but when Microsoft opened its second retail store in my backyard Saturday, I decided to accept the company's invitation to check it out. Microsoft opened one of the largest stores to date at the Roosevelt Field mall in Garden City, N.Y. on Long Island (right outside New York City). It's the fifth store in New York and arrives less than two years after opening area locations in Huntington, N.Y. (also on Long Island) and in White Plains, N.Y. in Westchester County. Roosevelt Field is the largest shopping mall in the New York metro area and the ninth largest in the U.S., according to Wikipedia.
The store that opened this weekend is one of the company's largest at 3,775 square feet and 41 employees. It coincidentally opened a day after Friday's Surface Pro 3 launch. "It just worked out that way," said Fazal Din, the store manager, when asked if the opening was timed in coordination with the launch. "But it's a great way to open the store."
While Microsoft's retail stores are primarily intended to draw consumers and are often strategically located near Apple Stores (as this one is), the stores are also targeting business customers, Din said. "We want this store to be the IT department for small businesses," Din said. The store is also reaching out to local partners, he added.
Microsoft corporate VP Panos Panay was on hand for the ribbon cutting ceremony Saturday, where a number of customers later asked him to autograph their new Surface Pro 3s. "This is not only the 98th store, but it's also the 12th store in the tri-state area [New York, New Jersey and Connecticut]. It's kind of a big deal," Panay said. "This is a great area for Microsoft to show its technologies."
Hundreds, if not a few thousand, teenagers camped outside the store in the enclosed mall to score tickets for a free Demi Lovato concert Microsoft arranged in the outside parking lot. The company also gave $1 million in donations to the local divisions of several charities including Autism Speaks, United Way, Variety Child Learning Center and the Girl Scouts of Nassau County.
Nine additional stores are in the pipeline, one of which opens this Thursday in The Woodlands, Texas. Most of the stores are in the U.S., with a few in Canada and Puerto Rico. By comparison, Apple, which started opening stores years before Microsoft, has an estimated 424 stores worldwide, 225 of them in the U.S. With retail sales of over $20 billion, Apple's stores represent 12 percent of the company's revenues. Like Apple and Samsung, Microsoft also has its own specialty departments in Best Buy stores.
Though Microsoft is touting the 98th store, by my count only 59 are full retail stores; the rest are smaller specialty stores. It appears Microsoft is largely opening retail stores in the suburbs of large cities rather than in urban locations. For example, the only location in New York City is a specialty store in the Time Warner Center in Columbus Circle.
Posted by Jeffrey Schwartz on 06/23/2014 at 1:28 PM
The preview of Microsoft's Azure Site Recovery is now available, the company said on Thursday. Among numerous offerings announced at last month's TechEd conference in Houston, Azure Site Recovery is the company's renamed Hyper-V Recovery Manager for disaster recovery.
But as I reported, Azure Site Recovery is more than just a name change. It represents Microsoft's effort to make Azure a hot site for data recovery. While Hyper-V Recovery Manager, released in January, provides point-to-point replication, Microsoft says Azure Site Recovery aims to eliminate the need to have a secondary datacenter or hot site just for backup and recovery.
"What if you don't have a secondary location?" Matt McSpirit, a Microsoft technical product manager, asked during the TechEd opening keynote. "Microsoft Azure Site Recovery [provides] replication and recovery of your on-premises private clouds into the Microsoft Azure datacenters."
The original Hyper-V Recovery Manager required a secondary datacenter. "When first released, the service provided for replication and orchestrated recovery between two of your sites, or from your site to a supporting hoster's site," the company said in a blog post Thursday. "But now you can avoid the expense and complexity of building and managing your own secondary site for DR. You can replicate running virtual machines to Azure and recover there when needed."
Microsoft says both offer automated protection, continuous health monitoring and orchestrated recovery of applications. It also protects Microsoft's System Center Virtual Machine Manager clouds by setting up automated replication of the VMs, which can be performed based on policies. It integrates with Microsoft's Hyper-V Replica and the new SQL Server AlwaysOn feature.
The service monitors SCVMM clouds remotely and continuously, according to Microsoft. All links with Azure are encrypted in transit, with the option of encrypting replicated data at rest. Microsoft also said administrators can recover VMs in an orchestrated manner to enable quick recoveries, even for multi-tier workloads.
Customers can test it in the Azure Preview Portal.
Posted by Jeffrey Schwartz on 06/20/2014 at 12:08 PM
A month after introducing the new Surface Pro 3 -- which Microsoft advertises as the tablet designed to replace your laptop -- the device is now available for purchase at select retail locations. But the first batch of units will require a quick firmware update to address an issue where Surface Pro 3 would occasionally fail to boot up even when fully charged.
After spending a month with the Surface Pro 3, I can say the device is a genuinely impressive improvement over the first two versions. It's bigger yet still portable, weighing 1.76 pounds with a much thinner form factor. And it has a much more usable keyboard. See my take, which appears in the forthcoming July issue of Redmond magazine.
I didn't mention the boot problem because I hadn't experienced it when my review went to press. In recent weeks, I have experienced the bug quite regularly. When the problem occurs, typically after Windows goes into sleep mode, I have eventually managed to boot the device, though it has taken anywhere from 15 to 30 minutes. Microsoft last week shared a tip on how to do it faster: a strange combination of holding the volume button in the up position and the power button for 10 seconds with the power adapter plugged in. I initially thought I had a flawed unit, but Microsoft said it was a common problem that the firmware upgrade now available aims to fix.
The firmware update is meant to fix the power problem, and it's easy enough to install. Just go to the Settings charm, touch or click Update and Recovery, and then check for a Windows Update. I attempted to run it last night but received an error message saying to make sure the system is fully charged and try again. I did so this morning without incident, though it's too early to say whether the patch worked for me.
Microsoft also issued an update that lets Surface Pen users double-click to capture and save screen grabs, which should be welcome since there's no Print Screen button on the keyboard. This requires installing the June 10 Windows and OneNote updates. Users can also use the included Surface Pen to wake the machine right into OneNote to start taking notes.
As I noted in my first look article, I also experienced occasional problems with the system failing to find a network connection; Device Manager would sometimes indicate there was no network adapter at all. It wasn't clear whether this problem was unique to my test unit, but it turns out others have experienced the issue as well, Microsoft confirmed. The workaround is to reboot, and a spokeswoman for Microsoft said a patch for the problem is forthcoming.
Units with Intel Core i5 processors are available at Best Buy stores and the Microsoft Store (both in retail locations and online). Versions with i3 and i7 processors will ship in August, with preorders open now. The i7 model is a good fit if you'll be using the Adobe Creative Cloud suite, part of which the company this week optimized for photographers using the Surface Pro 3. The i5 will appeal to most mainstream workers who don't want or need complex photo or video editing or computer-aided design (CAD) work.
If you get to a Best Buy or Microsoft Store near you, check it out and share your thoughts.
Posted by Jeffrey Schwartz on 06/20/2014 at 10:32 AM
When Microsoft said it was targeting MacBook users with its new Surface Pro 3 last month, the company demonstrated how much lighter its latest device is by putting the two on a balancing scale. But to really tip the scales for the new tablet PC, Microsoft also talked up its new partnership with Adobe to enhance Photoshop and the rest of the Creative Cloud suite for the new Surface.
Adobe today delivered on that promise with the launch of its new Creative Cloud Photography plan, which starts at $9.99 per month. The plan includes the new Photoshop CC 2014 release, now optimized for Windows 8.1 for those who want to use an electronic stylus or touch.
"We're offering 200 percent DPI support, as well as touch gestures," said Zorana Gee, Adobe's senior product manager for Photoshop during a media briefing. "All of the touch gestures you were able to experience [on traditional PCs and Macs] -- pan, zoom, scale, etc., will now be supported on the new Windows 8 platform."
The optimization for the Surface Pro 3 is available in Photoshop CC 2014, though the company indicated it is looking to optimize other apps in the suite over time as well.
Adobe last year took the bold step of saying it would offer its entire Creative Suite of apps, which includes Photoshop, Dreamweaver, Illustrator and InDesign, only as cloud-based software as a service. At the Surface launch last month, Adobe was among a number of software vendors, including Autodesk and SAP, that said they're optimizing their apps for the touch and gesture features in Windows 8.x.
"It's really, really easy to interact with the screen," said Michael Gough, Adobe's VP of Experience Design, when demonstrating the new Windows 8.1-enabled Photoshop at the Surface Pro 3 launch. "The pen input is natural. The performance is great -- both the performance of the software and the hardware working together."
While Photoshop is bundled with the new plan and is optimized for Surface, the subscription also includes Lightroom and Lightroom mobile, which Adobe has designed for use with Apple's iPhone and iPad.
The new Photoshop release also has a number of other new features including content-aware color adaption improvements, Blur gallery motion effects, Perspective Warp and Focus Mask. The latter automatically selects areas in an image that are in focus. It's suited for headshots and other images with shallow depth of field.
Posted by Jeffrey Schwartz on 06/18/2014 at 8:34 AM
If you're wondering where Microsoft stands with cloud standardization efforts such as OpenStack and Cloud Foundry, the general manager for Microsoft Azure gave his take, saying providers should innovate first. In the keynote address at this week's Cloud Computing Expo Conference in New York, Microsoft's Steven Martin questioned providers that have emphasized the development of cloud standards.
"I think we can agree, you have to innovate, then commoditize and then you standardize," Martin said. "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."
The remarks are clearly a veiled reference to major IT providers offering public cloud services, such as IBM, Hewlett-Packard and Rackspace, along with VMware, which is pushing its Cloud Foundry effort. Amazon, Microsoft and Google have the largest cloud infrastructures. According to a report in The New York Times Thursday, Amazon and Google each have 10 million servers in their public clouds, while Microsoft Azure has 1 million across 12 datacenters today, with 16 planned to be operational by year's end. Despite those large footprints, none of the big three has pushed running a standard cloud platform stack.
Martin's statements about standards were rather telling, considering Microsoft has always had little to say publicly about OpenStack and Cloud Foundry. While Microsoft has participated in OpenStack working groups and has made Hyper-V compatible with OpenStack clouds, the company has never indicated either way whether it sees its Azure cloud ever gaining OpenStack compatibility, either natively or through some sort of interface.
"The best thing about cloud technology is in addition to the data, in addition to the access, is the market gets to decide," he said. "The market will pick winners and losers in this space, and we will continue to innovate." Asked after his session if he sees Microsoft ever supporting OpenStack, Martin reiterated that "we'll let the market decide."
Not long ago, one might have construed that as Microsoft talking up its proprietary platforms. However, Martin was quick to point out that Microsoft Azure treats Linux like a first-class citizen. "Microsoft will use the technologies that make sense and contribute them back to the public," he said. "What will matter is going to be the value that people generate, and how strong and robust the systems are, and the service level agreements you can get."
Posted by Jeffrey Schwartz on 06/13/2014 at 11:14 AM
As I reported the other day, Microsoft is getting tougher on surveillance reforms. Later that day, Microsoft stepped up its battle against government overreach by releasing a court filing seeking to overturn an order to turn over e-mail stored in its Dublin datacenter. In its appeal, released Monday, Microsoft argues the search warrant violates international law.
It's believed Microsoft's move is the first time a major company has challenged a domestic search warrant for digital information stored overseas, The New York Times reported today. Privacy groups and other IT providers are watching the outcome of the case closely, according to the report, which noted it has international repercussions. Foreign governments are already concerned that their citizens' data is not adequately protected.
Microsoft filed its objection in the U.S. District Court for the Southern District of New York last Friday, saying that if the warrant to turn over the e-mail stored abroad is upheld, it "would violate international law and treaties, and reduce the privacy protection of everyone on the planet." The case involves a criminal inquiry in which a federal judge in New York granted a search warrant back in December. The customer's identity and country of origin are not known.
Search warrants seeking data overseas are rare, according to The Times report, but granting one could pave the way for further cases and international conflicts at a time when foreign governments are already unnerved by U.S. surveillance activities. In its latest filing, Microsoft is seeking a reversal of the warrant. The report said the case could go on for some time, with oral arguments scheduled for July 31.
The case could also add pressure for revisions to the Electronic Communications Privacy Act of 1986, which was written before electronic communication over the Internet was common.
Posted by Jeffrey Schwartz on 06/11/2014 at 11:54 AM
A year after Edward Snowden stunned the world with revelations that the National Security Agency (NSA) had a widespread digital surveillance effort, including the covert PRISM eavesdropping and data mining program, Microsoft marked the anniversary last week by saying it has unfinished business in its quest for government reforms.
While most cynics presumed intelligence agencies intercepted some communications, Snowden exposed what he and many others believe was broad overreach by the government. Many of the revelations, which kicked off a year ago last Thursday, even put into question the legality of the NSA's activities and the restrictions imposed on the telecommunications and IT industry by the Foreign Intelligence Surveillance Act (FISA).
The leaked documents implicated the leading telcos, along with Microsoft, Google, Facebook, Yahoo and many others, saying they were giving the feds broader access to the e-mail and communications of suspected terrorists than previously thought. While the government insisted it acted only when it believed it had probable cause and on court orders, the NSA's broad activities and the compliance of Microsoft and others called into question how private our data is when shared over the Internet, even when stored in cloud services.
Whether you think Snowden is a hero for risking his life and liberty to expose what he believed defied core American freedoms, or you feel he committed treason, as Netscape cofounder and Silicon Valley heavyweight Marc Andreessen believes, the world's view of surveillance, and how individuals treat their data and communications, is forever changed.
The revelations were a setback for Microsoft's efforts to advance its "cloud-first" transformation because the leaked NSA documents showed the company was among those that often had to comply with court orders without the knowledge of those under suspicion. To his credit, Microsoft General Counsel Brad Smith used the revelations to push for a stop to the objectionable activities.
Both Microsoft and Google marked the anniversary last week by showing the progress they have made. Google used the occasion to announce a new end-to-end encryption plugin for the Chrome browser and a new section in its Transparency Report that tracks e-mail encryption by service providers. Google said it is using the Transport Layer Security (TLS) protocol to encrypt e-mail across its Gmail service. Its reason for issuing the report was to point out that a chain is only as strong as its weakest link, hoping to pressure all e-mail providers to follow suit. The report last week showed Hotmail and Outlook.com implementing TLS only 50 percent of the time.
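Transport-layer encryption of this sort is negotiated per connection: the receiving mail server advertises a STARTTLS extension in its EHLO response, and the sender upgrades the connection if it's offered, which is what Google's Transparency Report measures. A minimal sketch of probing for that support with Python's standard library follows; the hostname in the commented example is illustrative, not from the article.

```python
# Probe whether an SMTP server advertises STARTTLS, the TLS upgrade
# mechanism tracked by Google's Transparency Report.
import smtplib

def supports_starttls(host: str, port: int = 25, timeout: float = 10.0) -> bool:
    """Return True if the SMTP server advertises the STARTTLS extension."""
    with smtplib.SMTP(host, port, timeout=timeout) as smtp:
        smtp.ehlo()                       # ask the server for its extensions
        return smtp.has_extn("starttls")  # is STARTTLS among them?

# Example (requires outbound port 25, which many networks block):
# print(supports_starttls("gmail-smtp-in.l.google.com"))
```

In practice a sender would follow a positive check with `smtp.starttls()` before transmitting mail, so the message travels encrypted in transit.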
Microsoft has lately emphasized it is stepping up its encryption efforts this year. Encryption for Office 365 is coming, Microsoft said last month. The company will offer 2048-bit encryption with Perfect Forward Secrecy as the default for Office 365, Azure, Outlook.com and OneDrive. Next month Microsoft will also roll out new technology for SharePoint Online and OneDrive for Business that moves from a single encryption key per disk to a unique encryption key for each file.
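The per-file key scheme described above is a key-management change more than a new cipher: every file gets its own random key, so compromising one key exposes only one file. The following is a conceptual, stdlib-only sketch of that idea, not Microsoft's implementation; a real system would use a vetted AES library and wrap each per-file key with a master key held separately.

```python
# Conceptual sketch of per-file encryption keys (not Microsoft's design).
# Keystream is derived in counter mode over HMAC-SHA256; a production
# system would use AES-GCM from a vetted crypto library instead.
import hashlib, hmac, secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream of the given length from a key."""
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

file_keys = {}  # in practice these would themselves be wrapped by a master key

def encrypt_file(name: str, plaintext: bytes) -> bytes:
    file_keys[name] = secrets.token_bytes(32)  # a fresh key for every file
    return xor(plaintext, keystream(file_keys[name], len(plaintext)))

def decrypt_file(name: str, ciphertext: bytes) -> bytes:
    return xor(ciphertext, keystream(file_keys[name], len(ciphertext)))

doc = b"quarterly-report contents"
ct = encrypt_file("report.docx", doc)
print(decrypt_file("report.docx", ct) == doc)  # -> True
```

The security win is in the blast radius: with one key per disk, a leaked key exposes everything on it; here, each entry in `file_keys` unlocks exactly one file.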
Shortly after the Snowden revelations, Microsoft, Google and others filed a lawsuit challenging the Foreign Intelligence Surveillance Act's stipulation that made it illegal for the companies to be more transparent. In exchange for dropping that lawsuit, Microsoft and others were able to make some limited disclosures. But in his blog post last week, Smith said providers should be permitted to provide more details, arguing doing so wouldn't compromise national security.
The unfinished business Smith would like to see resolved includes, in summary:
- Recognize that U.S. search warrants end at U.S. borders: The U.S. government should stop trying to force tech companies to circumvent treaties by turning over data in other countries. Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private. We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. That's why we recently went to court to challenge a search warrant seeking content held in our data center in Ireland. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed.
- End bulk collection: While Microsoft has never received an order related to bulk collection of Internet data, we believe the USA Freedom Act should be strengthened to prohibit more clearly any such orders in the future.
- Reform the FISA Court: We need to increase the transparency of the FISA Court's proceedings and rulings, and introduce the adversarial process that is the hallmark of a fair judicial system.
- Commit not to hack data centers or cables: We believe our efforts to expand encryption across our services make it much harder for any government to successfully hack data in transit or at rest. Yet more than seven months after the Washington Post first reported that the National Security Agency hacked systems outside the U.S. to access data held by Yahoo and Google, the Executive Branch remains silent about its views of this practice.
- Continue to increase transparency: Earlier this year, we won the right to publish important data on the number of national security related demands that we receive. This helped to provide a broader understanding of the overall volume of government orders. It was a good step, but we believe even more detail can be provided without undermining national security.
President Obama has put forth some recommendations, though some believe they don't go far enough, and they have yet to effect any major changes. If you saw the interview NBC's Brian Williams conducted with Snowden in Moscow, it's clear that, regardless of the legality of the leaks, this debate is far from over. And if you're concerned about your privacy, encrypting your data at rest and in transit is an important step moving forward.
Posted by Jeffrey Schwartz on 06/09/2014 at 1:06 PM
Asus, Dell and Hewlett-Packard are among the PC suppliers extending the boundaries of Microsoft's new Windows 8.1 operating system with several new enterprise-grade hybrid PC-tablets revealed at this week's annual Computex trade show in Taipei.
Some of the devices could even offer an alternative to Microsoft's new Surface Pro 3, a device the company believes is finally suited to combining all the functions of a commercial-grade laptop and a tablet. If the new PC-tablets challenge the Surface Pro 3, that's good for Microsoft and the advancement of Windows. "Surface is a reference design for Microsoft's OEM partners," said David Willis, Gartner's chief of research for mobility and communications, when I caught up with him yesterday at the Good Technology Xchange user conference in New York.
For example, the new HP Pro x2 612, launched today, has a 12.5-inch full high-definition (FHD) display that's just slightly larger than the Surface Pro 3's. HP's detachable tablet is available with either an Intel Core i3 or i5 processor with vPro hardware-based security, solid-state drives and two USB 3.0 ports. It is also available with HP's UltraSlim dock. While the Surface Pro 3 is also available with a Core i7 processor, the i3 and i5 CPUs should serve the needs of most mainstream business users. And there's nothing to say HP won't offer an i7-equipped model down the road.
The HP Pro x2 612 will get 8.25 hours of battery life, though an optional power keyboard extends that to 14 hours, the company said. While the Surface Pro 2 is available with a power keyboard, Microsoft hasn't yet announced one for the new Surface Pro 3. Beyond the hardware-based security of vPro, HP added other security features to the new device, including HP BIOS, HP Client Security, a smart card reader, HP TPM and an optional fingerprint scanner for authentication.
HP also announced a smaller version, the HP Pro x2 410, with an 11.6-inch display and a starting price of $849 for a unit with an i3 processor, 128GB of storage and 4GB of RAM. HP didn't announce pricing for the larger HP Pro x2 612, which ships in September.
Meanwhile, Asus rolled out several new Windows devices, including the new Zenbook NX500, available with a quad-core Core i7 processor. It supports an optional NVIDIA GeForce GTX 850M graphics adapter with 2GB of GDDR5 video memory. The new system also includes Broadcom three-stream 802.11ac Wi-Fi and SATA 3 RAID 0 or PCIe x4 SSD storage.
Asus said the NX500 is the first laptop it has offered with a 4K/UHD display and VisualMaster technology. The 15.6-inch device offers 3840x2160 resolution and an IPS display. The company did not disclose pricing or availability.
And complementing its Venue Pro 8 tablets, Dell also launched several Inspiron models including the 7000 Series 2-in-1. Due to ship in September, it also is powered by Intel's latest Core processors and comes with a 13.3-inch capacitive touchscreen display. A lower-end 11.6-inch model, the 3000 Series, is also available with a starting price of $449.
In all, Microsoft showcased 40 new Windows PCs, tablets and phones at Computex, according to OEM Corporate VP Nick Parker, who gave a keynote address earlier today. "We're delivering on our vision today with rapid delivery of enhancements in Windows, new licensing and programs for an expanded set of partners," Parker said, in a blog post.
Of course, it wasn't all Windows at Computex. Intel said more than a dozen Android and Windows tablets debuted at the conference, with 130 on its radar for this year overall. And Dell revealed it will offer the Ubuntu 14.04 LTS version of Linux as an option on its new Inspiron 2-in-1 laptop-tablets.
Posted by Jeffrey Schwartz on 06/04/2014 at 2:33 PM
While Amazon Web Services (AWS) remains by far the cloud provider most widely used by enterprises, Microsoft's Azure cloud service appears to have gained significant ground in the year since the release of its Infrastructure as a Service (IaaS) offering.
Azure was the No. 2 cloud service behind Amazon last year, according to a Redmond magazine reader survey, and that finding held this year as well. But given that Redmond readers are predisposed to using Microsoft technology, it has remained a mystery which cloud provider is the strongest alternative to Amazon in the broader IT universe.
Every major IT vendor -- including Google, IBM, Hewlett-Packard, Oracle and VMware -- and the telecommunication service providers offer enterprise public cloud services and want to expand their footprints. Many of them, notably Rackspace, AT&T, IBM and HP, are betting on OpenStack infrastructures, which, besides Amazon, are the most formidable alternative to Azure.
In the latest sign Azure is gaining ground, Gartner last week released its Magic Quadrant for IaaS providers, where only Amazon and Microsoft made the cut as leaders (a first for Microsoft in that category). Gartner published a measured assessment of Azure IaaS and all of the major cloud service providers.
"Microsoft has a vision of infrastructure and platform services that are not only leading stand-alone offerings, but that also seamlessly extend and interoperate with on-premises Microsoft infrastructure (rooted in Hyper-V, Windows Server, Active Directory and System Center) and applications, as well as Microsoft's SaaS offerings," according to Gartner's report.
"Its vision is global, and it is aggressively expanding into multiple international markets. It is second in terms of cloud IaaS market share -- albeit a distant second -- but far ahead of its smaller competitors. Microsoft has pledged to maintain AWS-comparable pricing for the general public, and Microsoft customers who sign a contract can receive their enterprise discount on the service, making it highly cost-competitive. Microsoft is also extending special pricing to Microsoft Developer Network (MSDN) subscribers."
Azure's wide variety of Platform as a Service (PaaS) features also gives it significant complementary offerings. Microsoft was likewise one of two vendors described as leaders in Gartner's application PaaS (which it calls aPaaS) Magic Quadrant back in January, bested only by Salesforce.com, now a Microsoft partner.
"The IaaS and PaaS components within Microsoft Azure feel and operate like part of a unified whole, and Microsoft is making an effort to integrate them with Visual Studio, Team Foundation Server, Active Directory, System Center and PowerShell. Conversely, Windows Azure Pack offers an Azure-like user experience for on-premises infrastructure," according to Gartner. "Microsoft has built an attractive, modern, easy-to-use UI that will appeal to Windows administrators and developers. The integration with existing Microsoft tools is particularly attractive to customers who want hybrid cloud solutions."
That's a pretty glowing assessment of Azure, but Gartner also issued some warnings to customers considering Microsoft's cloud service. Notably, Gartner cautioned that Microsoft's infrastructure services are still relatively new -- just over a year old -- while Amazon has offered IaaS since 2006.
"Customers who intend to adopt Azure strategically and migrate applications over a period of two years or more (finishing in 2016 or later) can begin to deploy some workloads now, but those with a broad range of immediate enterprise needs are likely to encounter challenges," according to the Gartner report.
Gartner also warned that Microsoft faces the challenge of operating and managing Azure at cloud scale and enabling enterprises to automate their infrastructures. In addition, Microsoft is still in the early stages of building out its partner ecosystem and doesn't yet offer a software licensing marketplace, the firm pointed out. And despite offering some Linux services, Gartner believes Azure is still "Microsoft-centric," appealing primarily to .NET developers. That's an image Microsoft has begun working in earnest to shake. For example, Microsoft has open-sourced some of its own .NET offerings while making Java a first-class citizen on Azure.
Microsoft has 12 datacenters worldwide supporting Azure and that number will reach at least 16 by year's end, the company said. Azure is a key component of Microsoft's hybrid cloud strategy, called Cloud OS, which is based on running multitenant instances using Windows Server, System Center, the Azure Pack (for running Azure-like operations in a private datacenter) and the public cloud.
Azure took center stage at last month's TechEd conference in Houston. It was evident in the keynote, but also in talking with folks on the show floor. "I'm seeing more rapid adoption of Azure overall," said Randy DeMeno, CommVault's chief technologist for Windows.
And speaking during a TechEd session, BlueStripe CTO Vic Nyman noted the benefits of using Azure to scale on demand. "Using Azure, and particularly Platform as a Service and Infrastructure as a Service, is a simple, elegant solution whose presentation layers, turning up and down, is an interesting trend we see."
Are you looking at Azure to scale your infrastructure?
Posted by Jeffrey Schwartz on 06/02/2014 at 12:10 PM
As Google targets everything from serving ads on your thermostat to driverless cars, machine learning and now broadband communications via its reportedly planned $1 billion investment in satellite technology, the search giant is also stepping up its less glamorous effort: developing an alternative to the everyday enterprise services offered by Microsoft.
Google has won its share of big conversions from Lotus Notes and Microsoft Exchange, but experts say the majority of enterprises moving their messaging and collaboration to the cloud are going with Office 365. Now Google is looking to make the switch easier. Last week, Google said enterprises can migrate from Exchange Server to Google Apps with its cloud-based data migration service, run directly from the Admin console to the Gmail servers.
The direct migration offering replaces the Google Apps Migration for Microsoft Exchange tool, which customers had to install on their local mail servers. The new service also lets administrators monitor the progress of a migration. It currently works only for e-mail, with calendar migration under development. Google is rolling the new e-mail migration service out to its Gmail servers over the next two weeks.
Google said the migration service currently is suitable for the following:
- Microsoft Exchange servers that support Exchange Web Services (EWS), specifically Office 365 and Exchange Server 2007 SP1 or higher.
- IMAP servers, including Gmail, Exchange 2003 or lower, and ISPs like GoDaddy.
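For the older servers in the second category, IMAP is the lowest common denominator: any server that speaks it can be drained message by message. The sketch below uses Python's standard imaplib to show the read side of such a migration; the host, credentials and the hand-off step are placeholders, and a real migration would also preserve folders, flags and timestamps.

```python
# Read side of an IMAP-based mail migration: pull raw messages from the
# source server so they can be handed to the target system.
import imaplib

def fetch_all_messages(host: str, user: str, password: str, folder: str = "INBOX"):
    """Yield raw RFC 822 messages from an IMAP mailbox."""
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select(folder, readonly=True)   # don't mark anything as read
        status, (ids,) = imap.search(None, "ALL")
        for msg_id in ids.split():
            status, data = imap.fetch(msg_id, "(RFC822)")
            yield data[0][1]                 # the raw message bytes

# Example (placeholder host and credentials):
# for raw in fetch_all_messages("imap.example.com", "user", "app-password"):
#     hand_off_to_target(raw)  # hypothetical upload step on the Gmail side
```

EWS-capable servers (the first category) expose richer APIs, which is why newer Exchange versions get the direct migration path while IMAP remains the fallback.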
Google last month also made it easier to manage retention of mail and documents on Google Apps via its Google Vault service. The options for setting or modifying a retention period, "the length of time your company's messages are archived in Google Vault," have been expanded, and Google said in a blog post last month that it has "added safeguards when setting a retention period for a specified number of days."
Organizations using Microsoft Outlook with Google Apps can now add, manage and join Hangout video calls by downloading a plug-in to Outlook.
Posted by Jeffrey Schwartz on 06/02/2014 at 8:51 AM
Would VMware and its parent EMC be better off as one company? A report last week by two Wells Fargo analysts suggesting the two should combine was rejected by VMware CEO Pat Gelsinger. The analysts argued that a combination would make more business sense and offer more shareholder value than the planned federated solutions among the companies EMC controls, which, in addition to VMware, include RSA and the recently spun-out Pivotal.
At its annual EMC World conference earlier this month, the company launched what it calls EMC II, an effort to federate the four companies to offer software-defined datacenter solutions. Despite this new federated business model, EMC said it remains committed to letting customers choose best-of-breed solutions. Wells Fargo analysts Maynard Um and Jason Maynard issued a note suggesting that could be better accomplished by combining EMC and VMware into one company. EMC spun VMware off in 2007.
"What EMC and VMware call federated solutions is, to us, taking the next step in addressing a key trend in the market today of converged solutions," they wrote, as reported by Barron's Tiernan Ray. "Over the past few years, large OEMs such as IBM, HP, Oracle and Dell have built up or acquired a broader capability across the stack and are offering complete converged solutions rather than point products. Cooperation turned into coopetition and will likely become full-on competition -- to us, the friction is fairly evident and we expect this to continue to grow."
Pressed on the matter in an interview on CNBC's Fast Money program Tuesday, during the grand opening of VMware's expanded campus in Palo Alto, Calif., Gelsinger said there are no plans to combine the two organizations.
"Simple answer, no," Gelsinger said. "It is working so well. We have this federated model where each company has their own strategic role. We're independent, we're loosely coupled and we're executing like crazy. And it's working for shareholders, our ecosystems, our employees on this beautiful campus here. This has worked and we're going to stay on this model because it's been completely successful."
Speaking at the Sanford Bernstein conference yesterday, EMC chairman and CEO Joe Tucci reiterated the strategy. "In each of these companies the missions are aligned," Tucci said, according to the Seeking Alpha transcript of his remarks. "One depends on the other, built on the other. But, again, you can take these and you can use them as a card giving customers choice, which I think is going to help to find a winner in the third platform. We're not forcing you to use our technologies. You can use Pivotal without using VMware. You can use VMware without using EMC, but when they all work together you get a special layer of magic."
Even though they're separately traded companies, EMC holds an 80 percent stake in VMware and has 97 percent control of voting. Longtime storage industry analyst John Webster wrote the companies will have to deliver the so-called "third platform" it evangelizes more