Like many cloud service providers, Microsoft has identified disaster recovery as a key driver for its hybrid infrastructure-as-a-service (IaaS) offering. Microsoft this year delivered a critical component of its disaster-recovery-as-a-service (DRaaS) strategy with Azure Site Recovery.
If you saw Brien Posey's First Look at Azure Site Recovery, you may have quickly lost interest if you're not a Microsoft System Center user. That's because Azure Site Recovery required System Center Virtual Machine Manager (SCVMM). But with last week's Microsoft Azure release upgrade, the company lifted that limitation.
The new Azure Site Recovery release allows customers to replicate and recover virtual machines using Microsoft Azure without SCVMM. "If you're protecting fewer VMs or using other management tools, you now have the option of protecting your Hyper-V VMs in Azure without using System Center Virtual Machine Manager," wrote Vibhor Kapoor, director of marketing for Microsoft Azure, in a blog post outlining the company's cloud service upgrades.
Making Azure Site Recovery available without SCVMM brings DRaaS to branch offices and smaller organizations that can't afford Microsoft's systems management platform or simply prefer other tools, explained Scott Guthrie, executive vice president of Microsoft's enterprise and cloud business, in a blog post. "Today's new support enables consistent replication, protection and recovery of Virtual Machines directly in Microsoft Azure. With this new support we have extended the Azure Site Recovery service to become a simple, reliable and cost effective DR Solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure without having to deploy a System Center Virtual Machine Manager on your primary site."
Guthrie pointed out that Azure Site Recovery builds upon Microsoft's Hyper-V Replica technology built into Windows Server 2012 R2 and Microsoft Azure "to provide remote health monitoring, no-impact recovery plan testing and single click orchestrated recovery -- all of this backed by an SLA that is enterprise-grade." Since organizations may have different uses for Azure Site Recovery, Guthrie underscored the One-Click Orchestration using Recovery Plans option, which provides various Recovery Time Objectives (RTOs) depending on the use case. For example, test and planned failovers typically require different RTOs than the unplanned failovers that occur in an actual disaster.
In addition to Hyper-V Replica in Windows Server 2012 R2, Azure Site Recovery can use Microsoft's SQL Server AlwaysOn feature. Azure Site Recovery also integrates with SAN replication infrastructure from NetApp, Hewlett Packard and EMC. And according to a comment by Microsoft's Roan Daley in our First Look, Azure Site Recovery also protects VMware workloads across VMware hosts using its new InMage option. Acquired back in July, InMage Scout is an on-premises appliance that offers real-time data capture on a continuous basis, simultaneously performing local backups or remote replication via a single data stream. Microsoft is licensing Azure Site Recovery with the Scout technology on a per-virtual or per-physical instance basis.
Are you using Microsoft's Azure Site Recovery, planning to do so or are you looking at the various third party alternatives as cloud-based DRaaS becomes a more viable data protection alternative?
Posted by Jeffrey Schwartz on 12/17/2014 at 12:08 PM
Salesforce.com today launched a connector that aims to bridge its cloud-based CRM portfolio of services with enterprise file repositories. The new Salesforce Files Connect will let organizations centralize their customer relationship management content with file stores including SharePoint and OneDrive, with a connector to Google Drive coming in a few months.
The release of the connector to SharePoint and OneDrive was promised back in late May when both Salesforce.com and Microsoft announced a partnership to integrate their respective offerings. While the two companies have a longstanding rivalry, they also share significant overlapping customer bases. The companies at the time said they would enable OneDrive for Business and SharePoint Online as integrated storage options for the Salesforce platform.
In today's announcement, Salesforce claims it's the first to create a repository that natively integrates CRM content and files among popular enterprise file stores. Salesforce.com said it provides a simple method of browsing, searching and sharing files located in various repositories.
Salesforce.com described two simple use cases. One would enable a sales rep to attach a presentation on OneDrive for Business to a sales lead in the Salesforce CRM app. The other would allow a service representative to pull FAQ content from OneDrive for Business running in the Salesforce Service Cloud app.
The connector supports federated search to query repositories simultaneously from any device and lets users attach files to social feeds, groups or records, enabling them to find contextually relevant information in discussions running in Salesforce Chatter. The tool is also designed to enforce existing file permissions.
For customers and third-party software providers wanting to embed file sharing into their applications, Salesforce.com also is offering the Salesforce Files Connect API.
Posted by Jeffrey Schwartz on 12/17/2014 at 12:26 PM
Microsoft last month entered the wearables market with the Microsoft Band, which, paired with the new Microsoft Health Web site and app for the wrist band, is designed to track your physical activities and bring some productivity features to your wrist.
The Microsoft Band, in my opinion, does a lot of interesting things, though it doesn't really excel at any of them at this point. Among the productivity features included are alerts that let you glance at the first sentence or two of e-mail messages, texts, Facebook posts and Facebook Messenger chats, along with phone calls, voicemails, schedules, stock prices and an alarm clock that vibrates gradually for deep sleepers who don't like to jump out of bed.
Then there's the health component that has a pedometer to track your steps, a monitor for runners as well as one to track general workouts. It also tracks your sleep including your average heart rate and how often you supposedly woke up. You can synchronize whatever physical activity it monitors with Microsoft Health, an app that runs on any iOS, Android and Windows Phone device. If you use it with Windows Phone, you get the added benefit of using Microsoft's Cortana, the digital assistant that responds to spoken commands. You can also look at reports on the Microsoft Health Web site.
My personal favorite: the Starbucks app, which displays your account's barcode so the barista can scan it when you make a purchase. Most baristas seeing it for the first time responded with awe, with one saying that "this is the wave of the future."
Though I've been skeptical about wearables like this and others like it from Fitbit, Samsung, Garmin, Nike, Sony and dozens of other providers, it's clearly a growing market and it may very well be the wave of the future -- or at least a wave of the future. Market researcher Statista forecasts that the market for these wearables will be close to $5.2 billion this year, more than double last year's figure. In 2015, sales of these gadgets are forecast to hit $7.1 billion, reaching $12.6 billion by 2018.
Gartner last month reported that while smart wristbands are poised for growth, its latest survey shows at least half of those surveyed are considering smartwatches. It sees the smartwatch market growing from 18 million units this year to 21 million in 2015, while purchases of wristbands will drop from 20 million to 17 million. Certainly the release of the Apple Watch, despite its hefty starting price of $349, will likely fuel that market, though I've already questioned how much demand we'll see for it.
I haven't tested other devices so it's hard to say how the Microsoft Band rates compared to them. But I find the notion of having information on my wrist more compelling than I had expected. However, my Microsoft Band's performance is flaky. I've encountered synchronization problems that on a number of occasions have required me to uninstall and reinstall the Microsoft Health app on my iPhone. It has presented realistic heart rates at the gym, only to suddenly report implausible numbers. When I click the e-mail button it often says I have nothing new, and even when I can read messages, they're cryptic and don't always indicate the sender.
I like that the Microsoft Band does synchronize with some other health apps, such as MyFitnessPal, which I use to track my meals these days. By importing that data, it provides more relevant info that I'd otherwise have to figure out and enter manually. The problem is, I don't believe I could have possibly burned 2,609 calories from a recent visit to the gym, though it would be nice if that was indeed the case.
That's why after spending several weeks with it, I can say I like the concept but it's not worth its $199 price tag unless money is no object to you. While I agree with my colleague Brien Posey that the Microsoft Band has some nice features, I think I'd wait for an improved version of the Microsoft Band and a richer Microsoft Health site before buying one of these (unless they become remarkably less expensive).
That stated, I hope Microsoft continues to enhance the Microsoft Band by adding more capacity and battery life to make it a more usable and comfortable device. If we all had accurate readings of our physical activity, maybe it would lead to healthier lifestyles.
Posted by Jeffrey Schwartz on 12/15/2014 at 7:28 AM
Once hailed as the future of in-vehicle communications and entertainment, a partnership between Ford and Microsoft has all but unraveled. Ford this week said it's replacing Microsoft Sync with BlackBerry's QNX software.
Ford yesterday announced its Sync 3 platform, which ushers in significant new features and will show up in 2016 vehicles sometime next year. Though Ford didn't officially announce it was walking away from Microsoft Sync in favor of BlackBerry QNX, The Seattle Times reported in February that the automaker was on the verge of making the switch. Raj Nair, Ford's CTO of global product development, said in numerous reports yesterday that QNX is now the new platform. Some 7 million Ford vehicles are reportedly equipped with Microsoft Sync, but the systems have continuously scored poorly in consumer satisfaction reports due to frequent malfunctions.
Swapping out Microsoft Sync for QNX would also result in cost savings, according to The Seattle Times, which noted that QNX is also used in the in-vehicle navigation systems of Audis and BMWs. Apple and Google also have alliances with various car manufacturers. While BlackBerry smartphones may be rapidly disappearing, QNX has gained significant ground in the in-vehicle systems market. Microsoft Sync, based on Windows Embedded, is said to also run the vehicle entertainment systems of some BMW, Kia, Fiat and Nissan models; Ford and Microsoft announced with great fanfare in 2007 their plans to roll out models with the entertainment system as an option.
Microsoft Sync was initially designed to link iPods and Zune music players to entertainment systems, debuting just at the dawn of the smartphone age. At the time, Microsoft founder Bill Gates saw Microsoft Sync as another element of the company's "Windows Everywhere" effort. As we all know, much has changed since then.
If Microsoft has new plans for Sync, the next logical time to announce them would be at next month's annual Detroit Auto Show.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:28 AM
While IP address conflicts are as old as networks themselves, the growing number of employee-owned devices in the workplace is making them a more frequent problem for system administrators. Because PCs and devices now routinely move among networks, it's not uncommon for a device to still think it's linked to one network, causing an IP address conflict when it tries to connect to another.
SolarWinds is addressing that with its new IP Control Bundle, which identifies and resolves IP address conflicts. The bundle consists of the new SolarWinds IP Address Manager (IPAM) and the SolarWinds User Device Tracker (UDT). There are two parts to the IP address resolution process.
First, IPAM identifies IP conflicts by subnet, provides a history of where the user and machine connected to the network, and identifies the switch and port on which that system connected. Next, UDT uses that information to disable the switch port; a new IP address is then assigned and any DNS entries are updated as necessary for the device to work before it's reconnected.
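At its core, the identification step amounts to spotting more than one device claiming the same address within a subnet. Here's a minimal Python sketch of that logic -- not SolarWinds code; the inventory records and names are invented for illustration:

```python
from collections import defaultdict
from ipaddress import ip_address, ip_network

# Hypothetical inventory of (ip, mac, switch, port) records, the kind
# of data a scanner might gather from switches and routers via SNMP.
inventory = [
    ("10.0.1.20", "aa:bb:cc:00:00:01", "sw-east-1", "Gi0/4"),
    ("10.0.1.20", "aa:bb:cc:00:00:02", "sw-east-1", "Gi0/9"),  # same IP, second MAC
    ("10.0.2.31", "aa:bb:cc:00:00:03", "sw-west-1", "Gi0/2"),
]

def find_conflicts(records, subnet):
    """Return IPs within `subnet` claimed by more than one device."""
    net = ip_network(subnet)
    seen = defaultdict(list)
    for ip, mac, switch, port in records:
        if ip_address(ip) in net:
            seen[ip].append((mac, switch, port))
    return {ip: claimants for ip, claimants in seen.items() if len(claimants) > 1}

conflicts = find_conflicts(inventory, "10.0.1.0/24")
for ip, claimants in conflicts.items():
    for mac, switch, port in claimants:
        print(f"{ip} claimed by {mac} on {switch} port {port}")
```

The real products add the remediation side -- shutting the offending switch port and updating DNS -- which this sketch leaves out.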
IPAM and UDT are typically installed on a separate server, and when a problem arises an administrator can use the software to scan the network and IP address ranges. It also interrogates routers, switches and other network infrastructure to gather relevant troubleshooting information. Rather than using agents, it relies on standard protocols, notably SNMP.
In addition to troubleshooting and remediating client-based devices, the SolarWinds package can handle IP address conflicts occurring on servers and virtual machines, says Chris LaPoint, vice president of product management at SolarWinds.
"If I'm the owner of that critical app trying to figure out what's going on, I can go to this tool and see that Joe over in another part of the datacenter has spun up a new VM and that's what's creating issues with my application," LaPoint explains. "So now I can probably notify Joe and tell him I'm kicking him off the network because it's actually affecting the availability of a customer-facing application that we need to have running."
Pricing for IPAM starts at $1,995 and UDT begins at $1,795.
Separately, SolarWinds this week said its SolarWinds Web Help Desk now works with DameWare Remote Support. SolarWinds acquired DameWare in 2011, but it operates as a separate business unit. The products are collectively used by 25,000 customers, and the combined solution will allow help desk technicians to connect with remote devices or servers, collect support data including chat transcripts and screen shots, and generate reports.
The SolarWinds Web Help Desk offering provides automated ticketing, SLA alerts, asset management and reporting, while DameWare Remote Support provides remote access to client devices and servers, allowing administrators to take control of those systems, manage multiple Active Directory domains and reset passwords.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:27 AM
During the Thanksgiving break, I had a number of encounters with PCs in public places still sporting the Windows XP logo, and it got under my skin. Among them was a computer near the checkout area at Home Depot. And within an hour I spotted another on a counter right next to the teller stations at my local Bank of America branch.
Given that we know Windows XP systems are no longer patched by Microsoft, the sight of them is becoming as uncomfortable as being near someone who has a nasty cold and coughs without covering his or her mouth. Speaking of spreading viruses, I've even been to two different doctors' offices in recent months that were running Windows XP-based PCs -- one of which is used to gather patient information, the other to schedule appointments. In both cases, when I asked if they planned to upgrade those systems, I got the equivalent of a blank stare. I don't think they had any idea what I was talking about.
Nevertheless, seeing a Windows XP PC just after I used the self-checkout terminal at Home Depot was especially unsightly given the retailer's massive breach last month in which e-mail addresses were stolen. When I asked about it, Home Depot spokeswoman Meghan Basinger said: "Thanks for reaching out, but this isn't detail we'd discuss."
Now the Bank of America situation is a bit different. The day after the Thanksgiving holiday weekend, InformationWeek announced its IT chief of the year: Cathy Bessant, who heads Bank of America's 100,000-person Global Technology & Operations organization. That's a lot of IT pros and developers.
Bank of America appeared to have a strong IT organization just by the nature of the way the company is often first to market with new e-banking features and mobile apps. The bank's systems tend to be reliable and they haven't had any major breaches that I can recall. Also, having worked in the past for InformationWeek Editor-in-Chief Rob Preston, who interviewed Bessant and reported on the bank's ambitious IT efforts, I have no doubt the choice was a well vetted one.
So when he noted among the bank's many milestones this year that its IT team completed the largest Windows 7 migration to date (300,000 PCs), I felt compelled to check in with Bank of America spokesman Mark Pipitone. Perhaps my inquiry sounded petty after the bank had updated so many systems, but I was curious how it was dealing with these stray Windows XP machines. Was the bank paying $200 per system for premium support, or was the PC just front-ending an embedded system? (Microsoft does still support Windows XP Embedded.) As such, I sent a picture of the system to Pipitone.
"Not knowing exactly what device you took a picture of, the best the team can tell is that it's an excepted device (there are some across our footprint), or it's a device that's powered on but not being used on a regular basis," Pipitone responded.
I made a trip to the branch and asked what the XP machine was used for. A rep there told me that it was used for those needing to access their safe deposit boxes. I informed Pipitone of that, though he declined to comment further. Maybe the lone PC I saw isn't connected to the Internet or it is otherwise protected. But the mere public display of Windows XP machines in so many different places for many tech-aware people is still disconcerting.
I laud Bank of America and others who have undertaken the painful move of modernizing their PC environments. At the same time, I look forward to a day when I don't have to see that Windows XP logo when I walk into a place of business, whether it's a doctor's office, a local restaurant, a major retailer or a bank. Windows XP was a great operating system when it came out, and I know some defenders of the legacy OS will be outraged by my stance -- many of them angered by Microsoft's decision to stop supporting it. But a Windows XP machine is likely unprotected unless it is not, and never will be, connected to a network.
There is some encouraging news. Waiting in my inbox on December 1 right after the holiday weekend was a press release from StatCounter reporting that there are more Windows 8.1 PCs out there than those with Windows XP. According to the November report, 10.95 percent of systems are running Windows 8.1. Windows XP still accounts for 10.67 percent. This marks the first time that there are more Windows 8.1-based systems than Windows XP PCs, according to its analysis. Back in August, the combination of Windows 8 and Windows 8.1 systems achieved that milestone, so it could be argued the latest report is a minor feat.
Nevertheless, the stragglers will remain for some time, according to Sergio Galindo, general manager of GFI Software, a provider of Web monitoring and patch management software. "I'm aware of several companies that continue running large XP installations -- and even larger IT budgets -- that may have custom XP agreements," Galindo said. "Windows XP will continue to survive as long as it meets people's needs. To keep a network secure, IT admins and computer consultants can 'lock down' the accounts on the XP machines. I strongly advise that machines running XP be allowed only minimal capabilities and have no admin access. I also favor using more secure browsers such as Chrome versus Internet Explorer in these cases. Also, IT admins may want to shut off some of the more common attack vectors such as Adobe Flash. In the case of XP, less (software) is more (secure)."
By the way, just a friendly reminder: there are just over 200 days left before Microsoft will no longer support Windows Server 2003. You'll be hearing a lot about that from us, and Redmond magazine's Greg Shields primed the pump last month.
Posted by Jeffrey Schwartz on 12/10/2014 at 12:54 PM
At Microsoft's annual shareholder meeting Wednesday in Bellevue, Wash., CEO Satya Nadella cashed in big. Shareholders approved his proposed $84 million pay package, a reward for a job well done. The pay package, which includes $59.2 million in stock options and $13.5 million in retention pay, according to Bloomberg, has come under attack as excessive by Institutional Shareholder Services, an investor advisory organization.
Indeed, Nadella ranks among the most highly paid CEOs. According to this year's Wall Street Journal/Hay Group CEO Compensation report, which ranks the 300 largest companies by revenue, the median pay package was $11.4 million, with Oracle CEO Larry Ellison taking the top spot in 2013 at $76.9 million.
By that measure, Nadella isn't breaking any records. Oracle's share price rose nearly 29 percent, while Microsoft's share price has jumped 32 percent since Nadella took over in early February. Nevertheless, investor advocates have scrutinized CEO compensation in the wake of the financial crisis.
While Microsoft's prospects look better than they have in a long time, the package for some may look excessive. Others would argue Nadella has plenty of incentive to continue Microsoft's turnaround, which is still in its early stages and certainly not yet a sure thing, given rapid changes in fortune that can take place in the IT industry.
Do you believe Nadella's compensation is excessive or is it fair?
Posted by Jeffrey Schwartz on 12/05/2014 at 12:09 PM
It was hard to ignore the hype over the Thanksgiving weekend's traditional Black Friday and Cyber Monday barrage of cut-rate deals, including this year's decision by quite a few retailers to open their doors earlier than ever. Many, including the Microsoft Store, opened as early as 6 p.m. on Thanksgiving Day, hoping to lure people away from their turkey dinners to get a jump on their holiday shopping.
Content with spending Thanksgiving Day with my family and not a big fan of crowds anyway, I decided to stop by my local Staples at a normal hour on Friday morning. To my surprise, there were just a handful of people in the store. When I asked an employee why the store was so empty on Black Friday, she said the crowds were all there Thanksgiving night.
When I asked her how many people bolted early from their turkey dinners, she said there was a line of about 100 people outside the store prior to its 6 p.m. opening Thursday evening. Apparently a good chunk of them were waiting for the $99 Asus EeeBook X205TA, which normally sells for at least double that price. Truth be told, that's why I popped in, though I had anticipated the allotment would be sold out. I had already looked at the 11.6-inch Windows 8.1 notebook, which can also function as a tablet with its removable keyboard. It's powered by an Intel Atom processor, 2GB of RAM and a 32GB SSD.
I asked her how many people in line were waiting for that device and she replied that more than half were. While many Windows 8.1 notebooks and tablets were on sale during the holiday rush, the two prominent $99 deals were the aforementioned Asus device and the HP Stream 7. The latter is a 7-inch Windows 8.1 tablet and it comes with a one-year Office 365 Personal subscription good for the tablet and one other PC. The discounted HP Stream 7 is only available today at the Microsoft Store, which is also offering up to $150 off on the most expensive Surface Pros with Intel Core i7 processors.
The HP Stream 7 is also powered by an Intel Atom processor and a 32GB SSD, but has only 1GB of RAM. While you shouldn't plan on doing much multitasking with this device, it's certainly a viable option if you want an ultra-portable tablet that can quickly access information and serve as an alternative to a Kindle Fire (the Kindle app is among the many apps now available in the Windows Store).
Given I already have a Dell Venue 8 Pro with similar specs and 2GB of RAM, the HP Stream 7 was of little interest to me, though it would make a good gift for someone at that price. Back at Staples, I asked the employee if there were any of the Asus convertibles left at the $99 price and to my surprise she said they were all out but I could order one with free delivery from the store's kiosk. It's slated to arrive today. Apparently you can still order one on this Cyber Monday on Staples' Web site (you can probably get a competitor to match the price).
Today the National Retail Federation released a report forecasting that sales over the Thanksgiving weekend were down 11 percent overall, and there are a number of theories as to why. The drop in sales does suggest that all of those retailers who felt compelled to open their doors on Thanksgiving Day may want to rethink that strategy for next year.
Posted by Jeffrey Schwartz on 12/01/2014 at 12:54 PM
Microsoft this week gave developers and IT pros a deep dive on major new features coming to Office 365, which the company has described as the fastest growing new product in its history. The demos, which included several APIs and SDKs aimed at driving existing SharePoint users to Office 365, gave a close look at how building and administering collaboration applications is going to change dramatically for IT pros, developers and end users alike.
Because Microsoft has made clear that organizations running applications developed in native code for SharePoint won't be able to migrate them to Office 365, the company is trying to convince customers to plan for the eventual move using the company's new app model. Microsoft is betting by offering compelling new capabilities, which it describes as its "Office Everywhere" effort, that organizations will find making the move worthwhile.
The APIs and new Office 365 features demonstrated include the new My Apps user interface, which the company also calls the App Launcher, due out for preview imminently after what the company described as a brief delay. My Apps gives users a customizable interface to the applications they use, such as Word, Excel, PowerPoint, contacts, mail and files. They can also add other Microsoft services, as well as, ultimately, those of third parties.
Jeremy Thake, a senior Microsoft product manager, demonstrated the new Office 365 platform and underlying API model Thursday at the Live! 360/SharePoint Live! conference in Orlando. Thake said the Microsoft Graph demo was the first given in the United States since the company unveiled it two weeks ago at TechEd Europe, where Microsoft also released the preview of the new Office 365 APIs.
"The Microsoft Graph is essentially allowing me to authenticate once and then go to every single endpoint across Microsoft. And not just Office but Dynamics to Azure and anything I've got running Windows, such as Live, Outlook.com and whatnot," Thake said during the demo, noting the plan is to tie it to third-party services that have registered to Microsoft Graph. "It's an access to those things from that one endpoint. This is a really powerful thing that isn't out yet. It's in preview; it will be coming next year."
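The "authenticate once, one endpoint" idea Thake describes can be sketched in a few lines of Python. This is purely illustrative -- the base URL and resource paths below are placeholders, not the service's actual routes -- but it shows the pattern of reusing a single bearer token against many resources behind one endpoint:

```python
# Placeholder endpoint standing in for a unified API gateway;
# not the real Microsoft Graph address.
BASE = "https://graph.example.com/v1"

def build_request(token, resource):
    """Build (url, headers) for any resource, reusing one bearer token."""
    url = "%s/%s" % (BASE, resource.lstrip("/"))
    return url, {"Authorization": "Bearer " + token}

token = "token-obtained-once-via-oauth"  # hypothetical single sign-on result
for resource in ("me/messages", "me/files", "me/contacts"):
    url, headers = build_request(token, resource)
    # A real client would now issue the call, e.g. requests.get(url, headers=headers)
    print(url)
```

The point of the design is that mail, files and contacts no longer require separate authentications against separate services.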
Consultant Andrew Connell, organizer of the SharePoint Live! track at Live! 360, said the release of the APIs and the Microsoft Graph bodes well for the future of Office 365 and SharePoint. "It opens so much more of the company data, not just so much more of our data that we're using in Microsoft services from a uniform endpoint for other companies to interact with and provide additional value on it," he said during the closing conference wrap-up panel. "That's going to be tremendous. That [Microsoft Graph] API is being pushed by the 365 group but it's a Microsoft thing -- it touches everything we do."
Thake demonstrated numerous other APIs, including a discovery service and the new Android and iOS SDKs. Major changes are coming to Office 365 in 2015, and they will have a significant impact on the IT pros and developers who build for and manage it. It was a big topic at SharePoint Live!, and I'll be sharing the implications of what's in the pipeline in the coming weeks.
Posted by Jeffrey Schwartz on 11/21/2014 at 9:02 AM
While Microsoft this year has rolled out extensive additions to its data management portfolio as well as business intelligence and analytics tools, SQL Server is still its core database platform. Nevertheless, Microsoft has unleashed quite a few new offerings that DBAs, developers and IT decision makers need to get their arms around.
"I think Microsoft needs to have the full stack to compete in the big data world," said Andrew Brust, research director at Gigaom Research. Brust on Tuesday gave the keynote address at SQL Server Live!, part of the Live! 360 conference taking place in Orlando, Fla., which, like Redmond, is produced by 1105 Media. Microsoft CEO Satya Nadella has talked of the data culture that's emerging, as noted in the Redmond magazine October cover story.
Brust pointed out that Microsoft has delivered some significant new tools over the past year, including Azure HDInsight, its Apache Hadoop-based cloud service for processing unstructured and semi-structured Big Data. Microsoft recently marked the one-year anniversary of Azure HDInsight with the preview of a new offering, Azure Machine Learning, which adds predictive analytics to the platform.
"Since the summer, they've added half a dozen new data products, mostly in the cloud but they're significant nonetheless," Brust said in an interview, pointing to a variety of offerings ranging from Stream Analytics, the company's real-time event processing engine, to Azure Data Factory, which lets customers provision, orchestrate and process on-premises data such as SQL Server with cloud sources including Azure SQL Database, blobs and tables. It also offers ETL as a service. Brust also pointed to the new Microsoft DocumentDB, the company's NoSQL entry, which natively supports JSON documents.
Microsoft's release of SQL Server 2014, which adds in-memory processing to its flagship database, takes aim at SAP's HANA. "Microsoft is going after it from the point of view that you can have in-memory and just stay in SQL Server instead of having to move to a specialized database," Brust said. "It's a version one, so I don't expect adoption to be huge, but it will be better in the next version. They are definitely still working on it. It's not just a one-off that they threw out there -- it's very strategic for them."
Posted by Jeffrey Schwartz on 11/19/2014 at 1:38 PM
Microsoft today said it has merged Windows code into Docker, allowing administrators to manage Docker containers running on Linux hosts from a Windows client. It's the latest move by Microsoft to jump on the Docker bandwagon, which began earlier this year with its support for Linux containers in the Azure public cloud and continued with last month's pact by the two companies to develop native Docker clients for Windows Server.
The company also published reference documentation for the contribution, a command-line interface (CLI) that can be compiled as a Docker client on Windows. "Up 'til today you could only use Linux-based client CLI to manage your Docker container deployments or use boot2docker to set up a virtualized development environment in a Windows client machine," wrote Khalid Mouss, a senior program manager for the Azure runtime, in a blog post.
"Today, with a Windows CLI you can manage your Docker hosts wherever they are directly from your Windows clients," Mouss added. The Docker client is in the official Docker GitHub repository. Those interested can follow its development under Pull Request #9113.
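The remote-management model Mouss describes boils down to pointing the client at a daemon's address. A minimal sketch of how that command line is assembled appears below; the host name is invented for illustration, and the unencrypted port 2375 assumes a daemon configured to listen on plain TCP:

```python
import shlex

# Sketch of "managing Docker hosts wherever they are": the client
# simply targets a remote daemon via -H (or the DOCKER_HOST
# environment variable). The host below is a made-up example.
def docker_command(host, *args):
    """Build a docker CLI invocation against a remote daemon."""
    return ["docker", "-H", f"tcp://{host}:2375", *args]

cmd = docker_command("linuxhost.example.com", "ps", "-a")
print(" ".join(shlex.quote(c) for c in cmd))
# docker -H tcp://linuxhost.example.com:2375 ps -a
```

The same command works identically whether the client binary runs on Linux or, with the newly merged code, on Windows.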
While noteworthy, this is not the announcement of a Windows Docker client -- it's just a move to enable management of Linux-based Docker hosts from a Windows client, said Andrew Brust, research director at Gigaom Research, who happened to be sitting next to me at our Live! 360 conference, taking place this week in Orlando, Fla., when I saw the news on my phone. "This is simply a client that lets you run a client to manage Linux-based Docker containers," Brust said. "It's interesting but it's not a huge deal."
Separately, on the heels of Microsoft open sourcing the .NET Framework last week, Mouss said the company this week also released a Docker image for ASP.NET on Docker Hub, enabling developers to create ASP.NET-ready containers from the base image.
See this month's Redmond magazine cover story on Microsoft's move toward containers as potentially the next wave of infrastructure and application virtualization.
Posted by Jeffrey Schwartz on 11/18/2014 at 11:52 AM
When Amazon Web Services announced Aurora as its latest database offering last week, the company put the IT industry on notice that it once again believes it can disrupt a key component of application infrastructures.
Amazon debuted Aurora at its annual AWS re:Invent customer and partner conference in Las Vegas. Amazon said the traditional SQL database for transaction-oriented applications, built to run on monolithic software and hardware, has reached its outer limits. Amazon Web Services' Andy Jassy said in the opening keynote address that the company has spent several years developing Aurora in secrecy.
Built on the premise that AWS' self-managed flagship services -- EC2, S3 and its Virtual Private Cloud (VPC) -- are designed for scale-out, service-oriented and multi-tenant architectures, Aurora moves half of the database out of the application tier, said Anurag Gupta, general manager of Amazon Aurora, during the keynote.
"There's a brand new log structured storage system that's scale out, multi-tenant and optimized for database workloads, and it's integrated with a bunch of AWS services like S3," said Gupta, explaining that Aurora is MySQL-compatible. Moreover, he added, those with MySQL-based apps can migrate them to Aurora with just a few mouse clicks and ultimately see a fivefold performance gain.
"With Aurora you can run 6 million inserts per minute, or 30 million selects," Gupta said. "That's a lot faster than stock MySQL running on the largest instances from AWS, whether you're doing network IOs, local IOs or no IOs at all. But Aurora is also super durable. We replicate your data six ways across three availability zones and your data is automatically, incrementally, continuously backed up to S3, which as you know is designed for eleven nines durability."
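The durability claim rests on the six-way, three-zone layout Gupta describes. The toy model below illustrates why losing an entire availability zone still leaves writes durable; the 4-of-6 write quorum is Amazon's published design rather than something stated here, and the code is purely illustrative, not the service's implementation:

```python
# Toy quorum model: six storage copies spread across three
# availability zones (two copies per zone). A write is acknowledged
# once a quorum of copies accepts it, so losing one full zone
# (two copies) does not lose data.
REPLICAS_PER_ZONE = 2
ZONES = ["zone-a", "zone-b", "zone-c"]   # names are illustrative
WRITE_QUORUM = 4                          # of the 6 total copies

def write_is_durable(failed_zones):
    """True if a write can still reach quorum when the given
    zones are completely unavailable."""
    surviving = sum(REPLICAS_PER_ZONE for z in ZONES if z not in failed_zones)
    return surviving >= WRITE_QUORUM

print(write_is_durable({"zone-b"}))             # True: 4 of 6 copies remain
print(write_is_durable({"zone-a", "zone-b"}))   # False: only 2 copies left
```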
Clearly, Amazon is trying to grab workloads that organizations have built for MySQL, but the company also appears to be targeting those that run on the other SQL engines it hosts via its Relational Database Service (RDS) portfolio, including Oracle, MySQL and Microsoft's SQL Server.
Aurora automatically repairs failures in the background, recovering from crashes within seconds, Gupta added. It replicates six copies of data across three Availability Zones and continuously backs up data to S3. Customers can scale an Aurora database instance up to 32 virtual CPUs and 244GB of memory. Aurora replicas can span up to three Availability Zones, with storage starting at 10GB and growing as high as 64TB.
Gupta said the company is looking to price Aurora for wide adoption, with pricing starting at 29 cents per hour for a two-virtual-CPU, 15.25GB instance.
The preview is now available. Do you think Amazon Aurora will offer a viable alternative to SQL databases?
Posted by Jeffrey Schwartz on 11/17/2014 at 12:32 PM
Facebook is secretly developing a social network aimed at enterprise users, according to a report published in today's Financial Times. The report said Facebook at Work could threaten Microsoft's Yammer enterprise social network as well as LinkedIn and Google Drive.
At first glance, it's hard to see how Facebook at Work would challenge both Yammer and LinkedIn. Though both are social networks, they are used in different ways; granted, there's some overlap, but Yammer is a social network for a closed group of users. Facebook employees have apparently used Facebook at Work over the past year for internal communications, and the company has let outsiders test it as well.
The report was otherwise quite vague and I wonder if the author even understands the difference between Yammer, LinkedIn and Google Drive. It's not unreasonable to think Facebook would want to offer a business social network similar to Yammer or Salesforce.com's Chatter. But as PC Magazine points out, many businesses might have issues with a service provided by Facebook.
That said, I reached out to SharePoint and Yammer expert Christian Buckley, who recently formed GTConsult, to get his take. Buckley said there's been buzz about Facebook's ambitions for some time but he's skeptical that Facebook could make a serious dent in the enterprise social networking market despite its dominance on the consumer side.
"Honestly I think they're a couple years behind in making any serious move in this space," Buckley said. "They will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."
Buckley also noted that Google, LinkedIn and Yammer have very different value propositions to enterprises. "Each have their own struggles," Buckley said. "LinkedIn may be displacing Yahoo Groups and other public chat forums, but my understanding is that they are having a difficult time translating that moderate growth into additional revenue beyond job postings. Yammer's difficulties may be a closer comparison and highlight Facebook's uphill battle to win over the enterprise by aligning ad hoc social collaboration capabilities with business processes. Microsoft has Yammer at the core of its inline social strategy, and like SharePoint, the individual Yammer brand will fade (in my view) as the core features are spread across the Office 365 platform. Instead of going to a defined Yammer location, the Yammer-like features will happen in association with your content, your e-mail, your CRM activities, and so forth."
What's your take on this latest rumor?
Posted by Jeffrey Schwartz on 11/17/2014 at 11:56 AM
When Microsoft CEO Satya Nadella last month said, "Microsoft loves Linux" and pointed to the fact that 20 percent of its Azure cloud is already running the popular open source platform, he apparently was getting ready to put his money where his mouth is.
At its Connect developer conference this week, Microsoft said it will open source its entire .NET Framework core and bring it to both Linux and the Apple Macintosh platform. It is the latest move by Microsoft to open up its proprietary .NET platform. Earlier this year, the company made ASP.NET and the C# compiler open source. This week the company released the .NET Core development stack and in the coming months, Microsoft will make the rest of .NET Core Runtime and .NET Core Framework open source.
Citing more than 1.8 billion .NET installations and over 7 million downloads of Visual Studio 2013 during the past year, Microsoft Developer Division Corporate Vice President S. Somasegar said in a blog post, "we are taking the next big step for the Microsoft developer platform, opening up access to .NET and Visual Studio to an even broader set of developers by beginning the process of open sourcing the full .NET server core stack and introducing a new free and fully-featured edition of Visual Studio." These were all once unthinkable moves.
Just how big a deal is this? Consider the reaction of Linux Foundation Executive Director Jim Zemlin: "These are huge moves for the company," he said in a blog post. "Microsoft is redefining itself in response to a world driven by open source software and collaborative development and is demonstrating its commitment to the developer in a variety of ways that include today's .NET news."
Zemlin lauded a number of Microsoft's open source overtures including its participation in the OpenDaylight SDN project, the AllSeen Alliance Internet of Things initiative and the Core Infrastructure Initiative.
For IT pros, the move is Microsoft's latest affirmation of the company's embrace of open source and Linux in particular. At the same time, while some believe Microsoft is also doing so to deemphasize Windows, the company's plans to provide Docker containers in Windows Server suggests the company has a dual-pronged strategy for datacenter and applications infrastructure: bolster the Windows platform to bring core new capabilities to its collaboration offerings while ensuring it can tie to open source platforms and applications as well.
At the same time, it appears that Microsoft is seeking to ensure that its development environment and ecosystem remains relevant in the age of modern apps. Zemlin believes Microsoft has, in effect, seen the light. "We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products," he said. "However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source. Microsoft understands that today's computing markets have changed and companies cannot go it alone the way they once did."
Posted by Jeffrey Schwartz on 11/14/2014 at 11:11 AM
When Microsoft last month announced it has 100-plus partners adopting its burgeoning Cloud OS Network, which aims to provide Azure-compatible third-party cloud services, it left out perhaps one of the biggest fish it has landed: Rackspace.
The two companies are longtime partners, and as I recently reported, Rackspace has extended its Hyper-V-compatible offerings and dedicated Exchange, SharePoint and Lync services. But Rackspace also has a formidable cloud infrastructure as a service that competes with the Azure network. The news that Rackspace now will provide Azure-compatible cloud service, announced on Monday with Rackspace's third-quarter earnings report, signals a boost for both companies.
For Microsoft, it brings one of the world's largest public cloud and dedicated hosting providers into the Azure fold. Even if Azure is not all-in or the core of Rackspace's business -- that is still reserved for its own OpenStack-based infrastructure, a healthy VMware offering and the newly launched Google Apps practice -- Rackspace has a lot of Exchange and SharePoint hosting customers who may want to move to an Azure-like model with the service levels that the San Antonio, Texas-based company emphasizes.
"Those who are down in the managed 'colo' world, they don't want to be managing the infrastructure. They want us to do that," said Jeff DeVerter, general manager of Microsoft's Private Cloud business at Rackspace. "They're happy to let that go and get back into the business of running the applications that run that business."
Customers will be able to provision Azure private cloud instances in the Rackspace cloud and use the Windows Azure Pack to manage and view workloads. This is not a multitenant offering like Azure or similar infrastructure-as-a-service clouds, DeVerter pointed out. "These are truly private clouds from storage to compute to the networking layer and then the private cloud that gets deployed inside of their environment is dedicated to theirs. We deploy a private cloud into all of our datacenters [and] it puts the customers' cloud dual homing some of their management and reporting back to us so that we can manage hundreds and then thousands of our customers' clouds through one management cloud."
Microsoft first launched the Cloud OS Network nearly a year ago with just 25 partners. Now, with more than 100, Marco Limena, Microsoft's vice president of Hosting Service Providers, claimed in a blog post late last month that there are more than 600 Cloud OS local datacenters in 100 companies serving 3.7 million customers. The company believes this network model will address the barriers facing customers who have data sovereignty and other compliance requirements.
Among the members of the Cloud OS Network listed in an online directory are Bell Canada, CapGemini, Datapipe, Dimension Data and SherWeb. "Microsoft works closely with network members to enable best-practice solutions for hybrid cloud deployments including connections to the Microsoft Azure global cloud," Limena said.
Asked if it's in the works for Rackspace to enable Cloud OS private cloud customers to burst workloads to the Microsoft Azure service, DeVerter said: "Those are active conversations today that we're having internally and having with Microsoft. But right now our focus is around making that private cloud run the best it can at Rackspace."
Posted by Jeffrey Schwartz on 11/12/2014 at 1:01 PM
If the Microsoft Azure public cloud is going to be the centerpiece of its infrastructure offering, the company needs to bring third-party applications and tools along with it. That's where the newly opened Microsoft Azure Marketplace comes in. The company announced the Microsoft Azure Marketplace at a press and analyst briefing in San Francisco late last month led by CEO Satya Nadella and Scott Guthrie, executive VP of cloud and enterprise. As the name implies, it's a central marketplace through which providers can deliver software for customers to run as virtual images in Azure.
A variety of providers have already ported these virtual images to the marketplace -- some are pure software vendors, while others are providers of vertical industry solutions -- and a number of notable offerings have started appearing. Many providers announced their offerings at last month's TechEd conference in Barcelona.
One that Microsoft gave special attention to at the launch of the Azure Marketplace was Cloudera, the popular supplier of an Apache Hadoop distribution. Cloudera has agreed to port its Cloudera Enterprise distribution, on which many Big Data apps are developed, to Microsoft Azure. That's noteworthy because Microsoft's own Azure HDInsight Hadoop-as-a-service offering is based on the Hortonworks Apache Hadoop distribution. While it could cannibalize Azure HDInsight, customers already committed to Cloudera would be far less likely to come to Azure if Cloudera weren't there.
"To date, most of our customers have built large infrastructures on premises to run those systems, but there's increasing interest in public cloud deployment and in hybrid cloud deployment, because infrastructure running in the datacenter needs to connect to infrastructure in the public cloud," said Cloudera Founder and Chief Strategy Officer Mike Olson, speaking at the Microsoft cloud briefing in San Francisco. "This we believe is, for our customers, a major step forward in making the platform more consumable still."
Also up and running in the Azure Marketplace is Kemp Technologies, a popular provider of Windows Server load balancers and application delivery controllers. The Kemp Virtual LoadMaster for Azure lets customers create a virtual machine (VM) optimized to run natively in the Microsoft cloud, said Maurice McMullin, a Kemp product manager.
"Even though Azure itself does have a load balancer, it's a pretty rudimentary one," McMullin said. "Having the Kemp load balancer in there totally integrated into the Azure environment allows you to script some of those environments and application scenarios. The impact of that is, for an organization that's looking toward the cloud, one of the big challenges is trying to maintain the consistency by having a consistent load balancer from on premises, meaning you get a single management interface and consistent management of apps and policies on premises or in the cloud."
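McMullin's consistency argument is that the same balancing policy object can front servers on premises or in the cloud. The sketch below shows only the basic round-robin idea behind such a policy; it is not Kemp's implementation, and the server names are invented:

```python
from itertools import cycle

# Minimal round-robin balancer: one policy definition that behaves
# identically wherever it runs, which is the consistency point made
# about using the same load balancer on premises and in the cloud.
class RoundRobinBalancer:
    def __init__(self, servers):
        self._ring = cycle(servers)

    def next_server(self):
        """Return the next back-end server in rotation."""
        return next(self._ring)

lb = RoundRobinBalancer(["web-1", "web-2", "web-3"])
print([lb.next_server() for _ in range(4)])
# ['web-1', 'web-2', 'web-3', 'web-1']
```

Real application delivery controllers layer health checks, persistence and scripting on top of this core rotation, but the single-policy idea is the same.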
Lieberman Software has made available as a virtual image in the marketplace its Enterprise Random Password Manager (ERPM), which the company said provides enterprise-level access controls over privileged accounts throughout the IT stack, both on premises and now in Azure.
The company says ERPM removes persistent access to sensitive systems by automatically discovering, securing and auditing privileged accounts across all systems and apps within an enterprise. Authorized administrators can delegate to users quick access to specific business applications, as well as corporate social media sites in a secure environment. And those activities are automatically recorded and audited. It also ensures access to such identities is temporary and able to ensure unauthorized or anonymous access to sensitive data is avoided.
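The core idea behind removing persistent access is that every privileged account periodically gets a fresh random credential, so no standing password survives long enough to be abused. The sketch below illustrates only that rotation concept; the function and account names are invented and have nothing to do with ERPM's actual, proprietary mechanics:

```python
import secrets
import string

# Illustrative privileged-account rotation: each account receives a
# fresh, cryptographically random password, eliminating persistent
# credentials. Not Lieberman's API -- just the underlying idea.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def rotate(accounts, length=24):
    """Return a new random password for each privileged account."""
    return {name: "".join(secrets.choice(ALPHABET) for _ in range(length))
            for name in accounts}

creds = rotate(["administrator", "svc-backup"])
print(sorted(creds))                              # ['administrator', 'svc-backup']
print(all(len(p) == 24 for p in creds.values()))  # True
```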
Another security tool is available from Waratek Ltd., a supplier of a Java Virtual Machine (JVM) container, which lets enterprises bring their own security to the cloud. Called Runtime Application Self-Protection (RASP), it monitors for key security issues and provides policy enforcement and attack blocking from the JVM.
In the JVM, the company offers a secure container where administrators can remotely control their own security at the application level, said Waratek CEO Brian Maccaba. "This is over and beyond anything the cloud provider can do for you and it's in your control," Maccaba says. "You're not handing it to Microsoft or Amazon -- you're regaining the reins, even though it's on the cloud."
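Runtime application self-protection works by interposing a policy check between the application and a sensitive operation, blocking calls that violate policy. Waratek does this inside the JVM; the Python sketch below shows only the enforcement pattern, with a deliberately crude blocklist that stands in for a real policy engine:

```python
# Conceptual analogue of RASP: a wrapper intercepts each call and
# blocks those that violate a runtime policy. The substring check
# here is a toy stand-in for real policy evaluation.
BLOCKED_SUBSTRINGS = ("DROP TABLE", "--")

def protected_query(execute, sql):
    """Run the query only if it passes the runtime policy check."""
    if any(bad in sql.upper() for bad in BLOCKED_SUBSTRINGS):
        raise PermissionError(f"blocked by runtime policy: {sql!r}")
    return execute(sql)

fake_db = lambda sql: f"ok: {sql}"      # stands in for a real driver
print(protected_query(fake_db, "SELECT * FROM users"))
# ok: SELECT * FROM users
```

The point Maccaba makes is that this enforcement layer travels with the application into the cloud, under the customer's control rather than the provider's.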
The number of offerings in the Azure Marketplace is still relatively small -- it stands at close to 1,000 based on a search via the portal -- though it is growing.
Posted on 11/10/2014 at 12:44 PM
Microsoft got some positive ink yesterday when it announced that Office 365 users on iPhones and iPads can now edit their documents for free and that the same capability was coming to Android tablets. Indeed it is good news for anyone who uses one or more of those devices (which is almost everyone these days).
But before you get too excited, you should read the fine print. As Directions on Microsoft Analyst Wes Miller noted on his blog, "Office is free for you to use on your smartphone or tablet if, and only if you are not using it for commercial purposes [and] you are not performing advanced editing."
If you do fit into the above-mentioned buckets, or you want the unlimited storage and new Dropbox integration, you'll need either an Office 365 Personal or Home subscription, or a commercial Office 365 subscription that comes with the Office 365 ProPlus desktop suite, Miller noted. As Computerworld's Gregg Keizer put it: "What Microsoft did Thursday was move the boundary between free and paid, shifting the line."
In Microsoft's blog post announcing the latest free offering, it does subtly note that this offer may not be entirely free. "Starting today, people can create and edit Office content on iPhones, iPads, and soon, Android tablets using Office apps without an Office 365 subscription," wrote Microsoft Corporate VP for Microsoft Office John Case, though that fine print was at the end of his post. "Of course Office 365 subscribers will continue to benefit from the full Office experience across devices with advanced editing and collaboration capabilities, unlimited OneDrive storage, Dropbox integration and a number of other benefits." Microsoft offers similar wording on the bottom of its press release issued yesterday.
Still, while noting this is great news for consumers, it's going to be problematic for IT organizations, Miller warned, especially those that have loose BYOD policies. "For commercial organizations, I'm concerned about how they can prevent this becoming a large license compliance issue when employees bring their own iPads in to work."
Are you concerned about this as well?
Posted by Jeffrey Schwartz on 11/07/2014 at 11:04 AM
BlueStripe Embeds App Monitor into System Center, Windows Azure Pack
BlueStripe Software is now offering its Performance Center tool as a management pack for Microsoft System Center 2012 R2 Operations Manager. The company earlier this year released the dashboard component of FactFinder, which monitors distributed applications across numerous modern and legacy platforms.
With the addition of Performance Center, the company has embedded its core FactFinder tool into System Center. FactFinder can monitor everything from mainframe infrastructure, including CICS and SAP R/3 transactions, to applications running on Unix, Linux and Windows infrastructures. BlueStripe said it provides visibility into the root causes of performance issues for application components in physical, virtual and cloud environments. It works with third-party public cloud services as well.
FactFinder integrates with Operations Manager workflows, providing data such as response times, failed connections, application loads and server conditions, the company said. It also maps business transactions by measuring performance across each hop of a given chain and is designed to drill into the server stack to determine the cause of a slow or failing transaction.
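The per-hop measurement idea can be reduced to a small sketch: time each hop in a transaction's chain, then point at the slowest one as the likely culprit. This is illustrative only, not FactFinder's data model, and the tier names are invented:

```python
# Sketch of hop-by-hop transaction mapping: given measured response
# times per tier, identify the hop most responsible for a slow
# transaction.
def slowest_hop(hops):
    """hops: list of (tier_name, response_ms). Return the worst offender."""
    return max(hops, key=lambda h: h[1])

chain = [("web", 12.0), ("app", 48.0), ("mq", 5.0), ("db", 310.0)]
print(slowest_hop(chain))  # ('db', 310.0)
```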
In addition to the new System Center Management Pack, BlueStripe launched Performance Center for the Windows Azure Pack, which is designed to provide administrators common visibility of their Windows Server and Microsoft Azure environments. This lets administrators and application owners monitor the performance via the Windows Azure Pack.
BlueStripe Marketing Manager Dave Mountain attended last week's TechEd Conference in Barcelona and said he was surprised at the amount of uptake for the Windows Azure Pack. "There's a recognition of the need for IT to operate in a hybrid cloud world," Mountain said. "IT's reason for existing is to ensure the delivery of business services. Tools that allow them to focus on app performance will be valuable and that's what we are doing with FactFinder Performance Center for Windows Azure Pack."
Netwrix Tackles Insider Threats with Auditor Upgrade
Netwrix Corp. has upgraded its auditing software to offer improved visibility into insider threats and to warn of data leaks more quickly. The new Netwrix Auditor 6.5 offers deeper monitoring of log files and privileged accounts, which in turn provides improved visibility into changes made across a network, including file servers and file shares.
The new release converts audit logs into more human-readable formats, according to the company. It also lets IT managers and systems analysts audit configurations from any point in time, while providing archives of historical data against which to compare. Netwrix said this helps ensure compliance with security policies and thwarts rogue employees from making unauthorized changes.
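Converting a raw change event into a readable audit line is conceptually a formatting step over structured fields. The sketch below shows the general idea; the field names and event shape are invented for illustration and are not Netwrix's schema:

```python
# Illustrative translation of a raw change event into a
# human-readable audit line. Field names here are made up.
def humanize(event):
    """Render a structured change record as one readable sentence."""
    return (f"{event['when']}: {event['who']} {event['action']} "
            f"'{event['object']}' on {event['server']}")

event = {"when": "2014-11-06 09:14", "who": "CORP\\jdoe",
         "action": "modified permissions on",
         "object": "\\\\fs01\\finance", "server": "FS01"}
print(humanize(event))
```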
In all, Netwrix said it has added more than 30 improvements to the new release of Auditor, resulting in higher scalability and performance.
Riverbed Extends Visibility and Control
Riverbed Technology this week launched the latest version of its SteelHead WAN optimization platform, including a new release of its SteelCentral AppResponse management tool to monitor hybrid environments, including Software-as-a-Service (SaaS) apps.
Core to the new SteelHead 9.0 is its tight integration with SteelCentral AppResponse, which Riverbed said simplifies the ability to troubleshoot applications using the app's analytics engine, making it easier to manage such processes as policy configuration, patch management, reporting and troubleshooting. The SteelCentral dashboard lets administrators track performance of applications, networks, quality of service and reports on how policies are maintained.
SteelCentral AppResponse 9.5 also gives administrators metrics on end-user experience of traditional and SaaS-based apps, even if they're not optimized by the SteelHead WAN platform. Riverbed said this information is aimed at letting IT groups respond to business requirements and to issues causing degraded performance. The new SteelHead 9.0 also is designed to ensure optimized performance of Office 365 mailboxes.
Posted by Jeffrey Schwartz on 11/07/2014 at 10:50 AM
A majority of chief information security officers (CISOs) at some of the largest organizations strongly believe that the sophistication of attackers is outstripping their ability to fend them off, and that the number of threats has increased markedly. According to IBM's third annual CISO study, 59 percent are concerned about their inability to keep pace, and 40 percent say it's their top security challenge.
Moreover, 83 percent said external threats have increased over the past three years with 42 percent of them saying the increases were dramatic. IBM revealed results of its study at a gathering of CISOs held at its New York offices.
The survey also found CISOs are more frequently questioned by the C-suite and corporate boards, while changes to the global regulatory landscape promise to further complicate efforts to stem threats. Kristin Lovejoy, IBM's general manager of security services, said malware creation is a big business in unregulated countries, which are the origin of most attacks.
"Where we say we're worried about external attackers and we're worried about financial crime data theft, there's a correlation between people getting Internet access in unregulated, unlegislated countries where it's an economic means of getting out," Lovejoy said. "When you interview the criminals, they don't even know they're performing a crime -- they're just building code. We have to be careful here, this external attacker thing, it's not going to get any better, it's going to get worse."
Most attackers are able to exploit the naivety of employees, she added, noting 80 to 90 percent of all security incidents were the result of human error. "They're getting in because users are pretty dumb," she said. "They click on stuff all the time. It's going to continue." She added that the most secure organizations are those with good IT hygiene, automation, configuration management and asset management, especially those that implement ITIL practices.
Posted by Jeffrey Schwartz on 11/05/2014 at 1:23 PM
A survey of small and medium enterprises found that only 8 percent are prepared to recover from an unplanned IT outage, while 23 percent of them report it would take more than a day to resume operations.
Underscoring the risk to companies with fewer than 1,000 employees, a vast majority of the 453 organizations surveyed have experienced a major IT outage in the past two years. Companies with 50 to 250 employees were especially at risk. A reported 83 percent have gone through a major IT failure, while 74 percent of organizations with 250 to 1,000 employees have experienced a significant outage.
One-third are using cloud-based disaster recovery as a service, which has rapidly started to gain momentum this year, according to the survey, conducted by Dimensional Research and sponsored by DRaaS provider Axcient. Daniel Kuperman, director of product marketing at Axcient, said the results confirmed what the company had suspected. "In a lot of cases, companies still don't put emphasis on disaster recovery," he said.
Axcient didn't reveal whose DRaaS offerings the organizations were using, though Kuperman said Dimensional chose the companies to poll from the researcher's own resources. DRaaS is one of the leading use cases for organizations making their foray into using cloud services.
A survey by cloud infrastructure provider EvolveIP last month found that nearly 50 percent of respondents benefitted from a cloud recovery service by avoiding outages from a disaster. Nearly three quarters, or 73 percent, cited the ability to recover from an outage as the prime benefit of using a cloud service. As a result, 42 percent of those responding to EvolveIP's survey have increased their cloud spending budgets this year, while 54 percent plan to do so in 2015.
Posted by Jeffrey Schwartz on 11/05/2014 at 12:01 PM
In its latest bid to offer better failover and replication in its software and cloud infrastructure, Microsoft demonstrated its new Storage Replica technology at last week's TechEd conference in Barcelona.
Microsoft Principal Program Manager Jeff Woolsey demonstrated Storage Replica during the opening TechEd keynote. Storage Replica, which Microsoft sometimes calls Windows Volume Replication (WVR), provides block-level, synchronous replication between servers or clusters for disaster recovery, according to a Microsoft white paper published last month. The new replication engine is storage-agnostic, and Microsoft says it can also stretch a failover cluster for high availability.
Most notable is that Storage Replica provides synchronous replication, which, as Microsoft describes it, enables organizations to mirror data within the datacenter with "crash-consistent volumes." The result, says Microsoft, is zero data loss at the file system level. By comparison, asynchronous replication -- which Microsoft added to Windows Server 2012 via Hyper-V Replica and updated in last year's Windows Server 2012 R2 release -- allows sites to be stretched beyond the limits of a local metropolitan area. Asynchronous replication carries a higher possibility of data loss or delay, so it may not suit scenarios where instantaneous real-time availability is a requirement, though for general purposes it's considered adequate.
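The synchronous-versus-asynchronous trade-off Microsoft describes can be reduced to when the write is acknowledged. The toy model below is purely conceptual, not Storage Replica's implementation: the synchronous path acknowledges only after both copies have the block (zero data loss), while the asynchronous path acknowledges immediately and ships the block later, leaving a small loss window:

```python
# Toy model of the sync/async replication difference.
class Volume:
    def __init__(self):
        self.blocks = []

def sync_write(primary, replica, block):
    """Synchronous: the block lands on both copies before the ack,
    so the replica is never behind the primary."""
    primary.blocks.append(block)
    replica.blocks.append(block)
    return "ack"

def async_write(primary, replica_queue, block):
    """Asynchronous: ack immediately; the block is shipped later
    and would be lost if the primary site died right now."""
    primary.blocks.append(block)
    replica_queue.append(block)
    return "ack"

p, r = Volume(), Volume()
sync_write(p, r, b"block-1")
print(p.blocks == r.blocks)  # True: zero data loss at ack time
```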
In the TechEd demo, Woolsey simulated a scenario with four server nodes, two in New York and the other across the river in New Jersey. The goal is to ensure that if users are unable to access data on the two nodes in New York, they automatically and transparently fail over to New Jersey without losing any data, Woolsey explained. It also uses a new feature in the Microsoft Azure service called Cloud Witness.
"To do a stretch cluster you need to have a vote for the cluster quorum," Woolsey explained. "In the past, this meant extra hardware, extra infrastructure, extra cost. Now we're just making this part of Azure as well. So that's an option to take advantage of the Cloud Witness. As you can see, we're baking hybrid capabilities right into Windows Server."
In the demo, Woolsey accessed the file share data to enable replication via the new storage replication wizard. From there he selected the source log disk, then the destination storage volume and log disk. "Literally in just a few clicks, that's it, I've gone ahead and I've set up synchronous replication," he said.
In the recently published white paper, Microsoft lists the following Storage Replica features as implemented in the Windows Server Technical Preview:

* Yes (server to server only)
* Storage hardware agnostic
* Windows Server Stretch Cluster creation
* Write order consistency across volumes
* TCP/IP or RDMA
* Replication network port firewall requirements: single IANA port (TCP 445 or 5445)
* Over-the-wire encryption and signing
* Per-volume failovers allowed
* Dedup and BitLocker volume support
* Management UI in-box: Windows PowerShell, Failover Cluster Manager
Microsoft also has emphasized that Storage Replica is not intended for backup and recovery scenarios. And because it is a general-purpose technology, the company noted, it may not suit the behavior of specific applications. Microsoft warns that organizations could see feature gaps compared with application-specific replication technologies, and hence could be better served by those.
What's your take on Microsoft's latest efforts to embed disaster recovery into Windows Server and Azure?
Posted by Jeffrey Schwartz on 11/03/2014 at 1:26 PM
Microsoft used its TechEd conference in Barcelona this week to give customers a first look at the new Azure cloud in a box. The so-called Cloud Platform System (CPS), announced at an event held last week in San Francisco led by CEO Satya Nadella and Executive VP for Cloud and Enterprise Scott Guthrie, is Microsoft's effort to let customers or hosting providers run their own Azure clouds.
The first CPS is available from Dell, though describing the company as the "first" to provide one implies that other major hardware providers may have plans for their own iterations -- or perhaps it's only at the wishful thinking stage. At any rate, CPS has been a long time coming.
As you may recall, Microsoft first announced plans to release such an offering more than four years ago. At the time, Dell, Hewlett Packard and Fujitsu were planning to offer what was then coined the Windows Azure Platform Appliance, and eBay had planned to run one. Though Microsoft took it on a roadshow that year, it suddenly disappeared.
Now it's back and Corporate VP Jason Zander showcased it in his TechEd Europe opening keynote, inviting attendees to check it out on the show floor. "This is an Azure-consistent cloud in a box," he said. "We think this is going to give you the ability to adopt the cloud with even greater control. You energize it, you hook it up to your network and you're basically good to go."
The CPS appears more modest than the original Windows Azure Platform Appliance in that it is sold as a converged rack-based system and doesn't come in prefabricated containers with air conditioning and cooling systems. The racks are configured with Dell PowerEdge servers, storage enclosures and network switches. Each rack includes 32 CPU nodes and up to 282TB of storage. On the software side, customers get Windows Server 2012 R2 with Hyper-V, configured in a virtualized multi-tenant architecture, System Center 2012 R2 and the Windows Azure Pack to provide the Azure-like functionality within a customer's datacenter.
So far, the first two known customers of the CPS are NTTX, which will use it to provide its own Azure infrastructure as a service in Japan, and CapGemini, which will provide its own solutions for customers running in the Azure cloud.
CapGemini is using it for an offering called SkySight, which will run a variety of applications including SharePoint and Lync, as well as a secure, policy-driven orchestration service based on its own implementation of Azure. "SkySight is a hybrid solution where we will deliver a complete integrated application store and a developer studio, all using Microsoft technologies," said CapGemini Corporate VP Peter Croes in a pre-recorded video presented by Zander during the keynote. "CPS for me is the integrated platform for public and private cloud. Actually it's the ideal platform to deliver the hybrid solution. That is what the customers are looking for."
Microsoft last week tried to differentiate itself from Amazon Web Services and Google in its hybrid approach. CPS could become an important component of Azure's overall success.
Posted by Jeffrey Schwartz on 10/31/2014 at 1:00 PM
Andy Rubin is leaving Google to join a technology incubator dedicated to startups building hardware, according to published reports. Whether you are an Android fan or not, it's hard to deny that Google's acquisition of the company Rubin founded was one of the most significant deals the search giant has made.
While Rubin had continued to lead the Android team since Google acquired it in 2005, Google reassigned him last year to lead the company's move into robotics, which included overseeing the acquisition of numerous startups.
Rubin's departure comes a week after Google CEO Larry Page promoted Sundar Pichai to head up all of Google's product lines except for YouTube. Page in a memo to employees published in The Wall Street Journal said he's looking to create a management structure that can make "faster, better decisions."
That effectively put Pichai in charge of the emerging robotics business as well. A spokesman told The New York Times that James Kuffner, who has worked with Google's self-driving cars, will lead the company's robotics efforts.
The move comes, coincidentally, on the same day that Google sold off the hardware portion of its Motorola Mobility business to Lenovo. The two companies yesterday closed the deal, announced in January. Lenovo said it will continue to leverage the Motorola brand.
As for Rubin, now that he's incubating startups, he'll no doubt be on the sell-side of some interesting companies again.
Posted by Jeffrey Schwartz on 10/31/2014 at 12:58 PM
Microsoft kicked off what looks to be its final TechEd conference with the launch of new services designed to simplify the deployment, security and management of apps running in its cloud infrastructure. In the opening keynote presentation at TechEd, taking place in Barcelona, officials emphasized new capabilities that enable automation and the ability to better monitor the performance of specific nodes.
A new service called Azure Operational Insights will tie the Azure cloud and Azure HDInsight, the Apache Hadoop-based Big Data analytics service, with Microsoft's System Center management platform. It will monitor and analyze machine data from cloud environments to determine where IT pros need to reallocate capacity.
Azure Operational Insights, which will be available in preview mode next month (a limited preview is currently available), initially will address four key functions: log management, change tracking, capacity planning and update assessment. It uses the Microsoft Monitoring Agent, which incorporates an application performance monitor for .NET apps and the IntelliTrace Collector in Microsoft's Visual Studio development tooling, to collect complete application-profiling traces. Microsoft offers the Monitoring Agent as a standalone tool or as a plugin to System Center Operations Manager.
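As a rough illustration of the log-management and capacity-planning rollups such a service performs, the Python below aggregates raw machine events by severity and flags a recurring trouble spot; the event format and the flagging rule are invented for the example, not Operational Insights' actual schema:

```python
# Illustrative sketch of a log-management rollup of the kind a service
# like Azure Operational Insights performs: ingest raw machine events,
# then aggregate by severity so an admin can spot trouble at a glance.
# The log format and the capacity rule are invented for this example.
from collections import Counter

events = [
    "2014-10-28 ERROR node-3 disk latency above threshold",
    "2014-10-28 WARN  node-1 memory at 85%",
    "2014-10-28 INFO  node-2 heartbeat ok",
    "2014-10-28 ERROR node-3 disk latency above threshold",
]

# Log management: count events by severity level (second field).
severity_counts = Counter(line.split()[1] for line in events)
assert severity_counts["ERROR"] == 2

# Capacity planning: flag the node that recurs in error events.
hot_nodes = Counter(line.split()[2] for line in events if "ERROR" in line)
assert hot_nodes.most_common(1)[0] == ("node-3", 2)
```

At service scale the same idea runs over terabytes of collected agent data rather than an in-memory list, which is where the HDInsight backend comes in.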
Dave Mountain, vice president of marketing at BlueStripe Software, was impressed with the amount of information it gathers and the way it's presented. "If you look at it, this is a tool for plugging together management data and displaying it clearly," Mountain said. "The interface is very slick, there's a lot of customization and it's tile-based."
On the heels of last week's announcement that it will support the more-robust G-series of virtual machines, which boast up to 32 CPU cores of compute based on Intel's newest Xeon processors, 450GB of RAM and 6.5TB of local SSD storage, Microsoft debuted Azure Batch, which officials say is designed to let customers use Azure for jobs that require "massive" scale out. The preview is available now.
Azure Batch is based on the job scheduling engine used by Microsoft internally to manage the encoding of Azure Media Services and for testing the Azure infrastructure itself, said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, in a blog post today.
"This new platform service provides 'job scheduling as a service' with auto-scaling of compute resources, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure," Guthrie said. "You submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down as work completes."
The new Azure Batch SDK is based on the application framework from GreenButton, a New Zealand-based company that Microsoft acquired in May, Guthrie noted. "The Azure Batch SDK makes it easy to cloud-enable parallel, cluster and HPC applications by describing jobs with the required resources, data and one or more compute tasks," he said. "With job scheduling as a service, Azure developers can focus on using batch computing in their applications and delivering services without needing to build and manage a work queue, scaling resources up and down efficiently, dispatching tasks, and handling failures."
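Guthrie's "job scheduling as a service" description maps to a simple loop: queue the work, decide how many workers to provision, run tasks with retries, and tear everything down when the work completes. The Python below is a toy model of that flow; the function and its behavior are illustrative assumptions, not the actual Azure Batch SDK, and tasks run sequentially here for clarity:

```python
# Minimal sketch of "job scheduling as a service": submit tasks, size a
# worker pool to the queue, retry failures, report when work completes.
# Illustrative only -- not the Azure Batch SDK.
from queue import Queue

def run_batch(tasks, max_retries=2):
    pending = Queue()
    for t in tasks:
        pending.put((t, 0))                       # (task, attempt count)

    # "Auto-scale" decision: one worker per 10 queued tasks, minimum one.
    workers = max(1, len(tasks) // 10)
    results, failed = [], []

    while not pending.empty():
        task, attempt = pending.get()
        try:
            results.append(task())                # run the task
        except Exception:
            if attempt + 1 <= max_retries:
                pending.put((task, attempt + 1))  # retry transient failures
            else:
                failed.append(task)               # give up after max_retries
    return workers, results, failed               # "shut down" the pool

# A healthy task plus one that fails once before succeeding:
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 2:
        raise RuntimeError("transient failure")
    return "ok"

workers, results, failed = run_batch([lambda: 1 + 1, flaky])
assert (workers, results, failed) == (1, [2, "ok"], [])
```

The real service adds what the toy omits: actual VM provisioning, parallel dispatch and the data movement Guthrie describes.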
Microsoft also said it has made its Azure Automation service generally available. The tool is designed to automate repetitive cloud management tasks that are time consuming and prone to error, the company said. It can use existing PowerShell workflows, or IT pros can author their own.
Also now generally available is WebJobs, the component of Microsoft Azure Websites designed to simplify the running of programs, services or background tasks on a Web site, according to a post today on the Microsoft Azure blog by Product Marketing Manager Vibhor Kapoor.
"WebJobs inherits all the goodness of Azure Websites -- deployment options, remote debugging capabilities, load balancing and auto-scaling," Kapoor noted. "Jobs can run in one instance, or in all of them. With WebJobs all the building blocks are there to build something amazing, or small background jobs to perform maintenance for a Web site."
Posted by Jeffrey Schwartz on 10/28/2014 at 10:59 AM
Will Microsoft's final TechEd conference this week in Barcelona go out with a bang? We'll have a better sense of that over the next two days as the company reveals the next set of deliverables for the datacenter and the cloud. Microsoft has kept a tight lid on what's planned but we should be on the lookout for info pertaining to the next versions of Windows Server, System Center and Hyper-V, along with how Microsoft sees containers helping advance virtualization and cloud interoperability.
In case you missed it, this week's TechEd, the twice-yearly conference Microsoft has held for nearly two decades, will be the last. Instead, Microsoft said earlier this month it will hold a broader conference for IT pros and developers called Ignite, to be held in Chicago during the first week of May. Ignite will effectively envelop TechEd and the SharePoint and Exchange conferences.
Given the company's statements about faster release cycles, if officials don't reveal what's planned for the next releases of Windows Server, System Center and the so-called "Cloud OS" tools that enable it to provide an Azure-like infrastructure within the datacenter, partner cloud services and its own public cloud, I'd be quite surprised.
If you caught wind of the presentations made last week by CEO Satya Nadella and Scott Guthrie, EVP of Microsoft's cloud and enterprise group, it was clear that besides delivering some noteworthy announcements, they were aimed at priming the pump for future ones. For example, Microsoft announced the Azure Marketplace, where ISV partners can develop virtual images designed to accelerate the use of Azure as a platform. Also revealed was the Azure G-series of virtual machines, powered by the latest Intel Xeon processors, which Guthrie claimed will be the largest VMs available in the public cloud -- at least for now. Guthrie claimed the new VMs provide twice the memory of the largest Amazon cloud machine.
As Microsoft steps up its moves into containerization with the recent announcement that it's working with Docker to create Docker containers for Windows Server, it will be interesting to hear how that will play into the next release of the server operating system. It will also be interesting to learn to what extent Microsoft will emphasize capabilities in Windows Server and Azure that offer more automation as the company moves to build on the evolving software-defined datacenter.
The opening keynote is tomorrow, when we'll find out how much Microsoft intends to disclose about what's next for its core enterprise datacenter and cloud platforms. I'd be surprised and disappointed if it wasn't substantive.
Posted by Jeffrey Schwartz on 10/27/2014 at 3:17 PM
While almost every part of Microsoft's business faces huge pressure from disruptive technology and competitors, the software that put the company on the map -- Windows -- continues to show it's not going to go quietly into the night. Given Microsoft's surprise report that Surface sales have surged and the company promising new capabilities in the forthcoming release of Windows 10, expectations of the operating system's demise are at least premature and potentially postponed indefinitely.
Despite the debacle with its first Surface rollout two years ago, this year's release of the Surface Pro 3 and the resulting impressive performance shows that Windows still has a chance to remain relevant despite the overwhelming popularity of iOS and Android among consumers and enterprises. Granted, we now live in a multiplatform world, which is a good thing that's not going to change. The only question still to play out is where Windows will fit in the coming years and this will be determined by Microsoft making the right moves. Missteps by Apple and Google going forward will play a role as well of course, but the ball is in Microsoft's court to get Windows right.
Amid yesterday's impressive results for the first quarter of Microsoft's 2015 fiscal year were increases along key lines, including the Office 365, enterprise software and cloud businesses, and the disclosure of $908 million in revenue for the Surface business. That's more than double what Surface devices brought in a year earlier, and the report covers the first full quarter the new Surface Pro 3 has been on the market. Presuming there wasn't significant channel stuffing, this is promising news for the future of Windows overall.
Indeed while showing hope, the latest report on Surface sales doesn't mean Windows is out of the woods. Despite the surge in revenues, Microsoft didn't reveal Surface unit sales. And while the company said its Surface business is now showing "positive gross margins" -- a notable milestone given the $900 million charge the company took five quarters ago due to poor device sales -- Microsoft didn't say how profitable they are, said Patrick Moorhead, principal analyst with Moor Insights & Strategy.
Marketing costs and the expense of building out Microsoft's much-improved global channel and distribution reach likely offset much of that gross margin, Moorhead suggested. "I can say with 99 percent confidence they are losing money on Surface still. That may not be bad for two reasons. They need to demonstrate Windows 8 can provide a good experience and second of all it puts additional pressure on traditional OEMs that they need to be doing a better job than what they do."
Also worth noting, the $908 million in Surface revenues were about 17 percent of the $5.3 billion Apple took in for iPads during the same period (revenues for Macintoshes, which are in many ways more comparable to the Surface Pro 3, were $6.6 billion, Apple said). Apple's iPads, which often displace PCs for many tasks, are also hugely profitable, though ironically sales of the tablets have declined for the past three quarters amid the sudden surge in Surface sales. Naturally the devices have different capabilities, but the point is to underscore the positive signs the growth of Surface portends for the future of Windows.
Moorhead said the current quarter, and notably holiday sales of all Windows devices led by an expected onslaught of dirt-cheap Windows tablets (possibly as low as $99), could be an inflection point, though he warned that Microsoft will need to execute. "If Microsoft continues to operate the way they are operating, they will continue to lose considerable consumer relevance," he said. "If during the holidays, they make news and sell big volumes, I would start to think otherwise."
Key to the quarter's turnaround was the company's expanded global distribution and extended sales to corporations through its channel partners, though that effort is still at a formative stage. Despite his skeptical warning, Moorhead believes Google's failure to displace Windows PCs with large Android devices and Chromebooks gives Microsoft a strong shot at keeping Windows relevant.
"Google had this huge opportunity to bring the pain on Microsoft with larger devices and eat into notebooks," he said. "They never did it. They really blew their opportunity when they had it. While Android may have cleaned up with phones, when you think about it what they did was just blocking Microsoft as opposed to going after Microsoft, which would be in larger form factor devices in the form of Android notebooks and Chromebooks. The apps are designed for 4-inch displays, not a 15-inch display or 17-inch display. And with Chrome, its offline capabilities just came in too slowly and there really aren't a lot of apps. They just added the capability to add real apps."
Meanwhile, Moorhead pointed out that Apple this month has delivered on what Microsoft is aiming to do: provide an experience that lets a user begin a task on say an iPhone and resume that task on an iPad or Mac.
Hence keeping Windows relevant may rest, among other things, on Microsoft's ability to deliver a Windows 10 that can do that and improve on the general lack of apps on the OS, which in the long run would incent developers to come back. The promising part of that is the renewed focus on the desktop, Moorhead said. "When they converge the bits in a meaningful way, I think they can hit the long tail because of the way they're doing Windows 10 with Windows apps and the ability to leverage those 300 million units to a 7-inch tablet and a 4-inch phone. I think that is an enticing value proposition for developers."
Given the target audience of business users for the Surface Pro 3, it also is a promising signal for the prospects of Windows holding its own in the enterprise. Do you find the new surge in Surface sales, coupled with the design goals of Windows 10, to be encouraging signs for the future of Windows, or do you see it more as one last burst of energy?
Posted by Jeffrey Schwartz on 10/24/2014 at 12:48 PM
Microsoft may be trying to compete with IBM in the emerging market for machine learning-based intelligence, but like many rivals, these two companies with a storied shared past have their share of mutual interests even as they tout competing public enterprise clouds. Hence the two are the latest to forge a cloud compatibility partnership.
The companies said today they are working together to ensure some of their respective database and middleware offerings can run on both the IBM Cloud and Microsoft Azure. Coming to Microsoft Azure is IBM's WebSphere Liberty application server platform, MQ middleware and DB2 database. IBM's Pure Application Service will also run on Microsoft Azure the two companies said.
In exchange, Windows Server and SQL Server will work on the IBM Cloud. Both companies are collaborating to provide Microsoft's .NET runtime for IBM Bluemix, the company's new cloud development platform. While the IBM Cloud already has support for Microsoft's Hyper-V, IBM said it will add expanded support for the virtualization platform that's included in Windows Server. It was not immediately clear how they will improve Hyper-V support on the IBM Cloud.
Andrew Brust, a research director at Gigaom Research, said that the IBM Cloud, which is based on the SoftLayer public cloud IBM acquired last year for $2 billion, runs a significant amount of Hyper-V instances. "They explained to me that they have a 'non-trivial' amount of Windows business and that they support Hyper-V VMs," Brust said.
"With that in mind, the announcement makes sense, especially when you consider [Microsoft CEO] Satya's [Nadella] comment on Monday that Azure will 'compose' with other clouds," Brust added. Nadella made the comment Monday while articulating Microsoft's strategy to build Azure into a "hyperscale" cloud. "We are not building our hyperscale cloud in Azure in isolation," Nadella said. "We are building it to compose well with other clouds."
Nadella spelled out recent efforts to do that including last week's announcement that Microsoft is working with Docker to develop Docker containers for Windows Server, its support for native Java via its Oracle partnership (which, like IBM, includes its database and middleware offerings) as well as broad support for other languages including PHP, Python and Node.js. "This is just a subset of the open source as well as other middle-tier frameworks and languages that are supported on Azure," Nadella said at the event.
Most analysts agree that Amazon, Microsoft and Google operate the world's largest cloud infrastructures but with SoftLayer, IBM has a formidable public cloud as well. Both IBM and Microsoft are seeing considerable growth with their respective cloud offerings but have reasonably sized holes to fill as well.
Nadella said Monday that Microsoft has a $4.4 billion cloud business -- still a small fraction of its overall revenues but rapidly growing. For its part, IBM said on its earnings call Monday that its public cloud infrastructure is at a $3.1 billion annual run rate and its overall cloud business is up 50 percent, though the company's spectacular earnings miss has Wall Street wondering if IBM has failed to move quickly enough. The company's shares have tumbled in recent days and analysts are questioning whether the company needs a reboot similar to the one former CEO Lou Gerstner gave it two decades ago.
"Overall, this looks like a marriage of equals where both stand to gain by working harmoniously together," said Pund-IT analyst Charles King. Forrester Research analyst James Staten agreed. "IBM and Microsoft both need each other in this regard so a nice quid pro quo here," he said.
For Microsoft, adding IBM to the mix is just the latest in a spate of cloud partnerships. In addition to its partnership with Oracle last year, Microsoft recently announced a once-unthinkable cloud partnership with Salesforce.com and just tapped Dell to deliver its latest offering, the new Cloud Platform System, which the company describes as an "Azure-consistent cloud in a box" that it will begin offering to customers next month.
It also appears that IBM and Microsoft held back some of their crown jewels in this partnership. There was no mention of IBM's Watson or Big SQL, which is part of its InfoSphere platform on Hadoop, based on the Hadoop Distributed File System (HDFS). During a briefing last week at Strata + Hadoop World in New York, IBM VP for Big Data Anjul Bhambhri described the recent third release of Big SQL in use at some big insurance companies. "Some of their queries, which they were running on Hive, were taking 45 minutes to run," she said. "In Big SQL those kinds of things are now less than 5 minutes."
Likewise, the announcement doesn't seem to cover Microsoft's Azure Machine Learning or Azure HDInsight offerings. I checked with both companies and while both are looking into it, there was no response as of this posting. It also wasn't immediately clear when the announced offerings would be available.
Update: A Microsoft spokeswoman responded to some questions posed on the rollout of the services on both companies' cloud. Regarding the availability of IBM's software on Azure: "In the coming weeks, Microsoft Open Technologies, a wholly owned subsidiary of Microsoft, will publish license-included virtual machine images with key IBM software pre-installed," she stated. "Customers can take advantage of these virtual machines to use the included IBM software in a 'pay-per-use' fashion. Effective immediately, IBM has updated its policies to allow customers to bring their own license to Microsoft Azure by installing supported IBM software on a virtual machine in Azure."
As it pertains to using Microsoft's software in the IBM cloud, she noted: "Windows Server and SQL Server are available for use on IBM Cloud effective immediately. IBM will be offering a limited preview of .NET on IBM Cloud in the near future." And regarding plans to offer improved support for Hyper-V in the IBM Cloud: "Hyper-V is ready to run very well on IBM SoftLayer to provide virtualized infrastructure and apps. IBM is expanding its product support for Hyper-V."
Posted by Jeffrey Schwartz on 10/22/2014 at 2:22 PM
The launch today of the new Apple Pay service for users of the newest iPhone and iPad -- and ultimately the Apple Watch -- is a stark reminder that Microsoft has remained largely quiet about its plans to pursue this market when it comes to Windows Phone or through any other channels.
If smartphone-based payments or the ability to pay for goods with other peripherals such as watches does take off in the coming year, it could be the latest reason consumers shun Windows Phone, which despite a growing number of apps, still is way behind the two market leaders.
So if payments become the new killer app for smartphones, is it too late for Microsoft to add it to Windows Phone? The bigger question should be is it too late for Microsoft as a company? Perhaps the simplest way to jump in would be to buy PayPal, the company eBay last month said it will spin off. The problem there is eBay has an estimated market valuation of $65 billion -- too steep even for Microsoft.
If Microsoft still wants to get into e-payment -- which, in addition to boosting Windows Phone, could benefit Microsoft in other ways including its Xbox, Dynamics and Skype businesses, among others -- the company could buy an emerging e-payment company such as Square, which is said to be valued at a still-steep (but more comfortable) $6 billion.
Just as Microsoft's Bill Gates had visions of bringing Windows to smartphones nearly two decades ago, he also foresaw an e-payments market similar to the one now emerging. Gates was reminded of the fact that he described possible e-payment tech in his book, "The Road Ahead," by Bloomberg Television's Erik Schatzker in an interview released Oct. 2.
"Apple Pay is a great example of how a cellphone that identifies its user in a pretty strong way lets you make a transaction that should be very, very inexpensive," Gates said. "The fact that in any application I can buy something, that's fantastic. The fact I don't need a physical card any more -- I just do that transaction and you're going to be quite sure about who it is on the other end -- that is a real contribution. And all the platforms, whether it's Apple's or Google's or Microsoft, you'll see this payment capability get built in. That's built on industry standard protocols, NFC and these companies have all participated in getting those going. Apple will help make sure it gets critical mass for all the devices."
Given his onetime desire to lead Microsoft in offering digital wallet and payment technology, Schatzker asked Gates why Microsoft hasn't entered this market already. "Microsoft has a lot of banks using their technology to do this type of thing," Gates said. "In the mobile devices, the idea that a payment capability and storing the card in a nice secret way, that's going to be there on all the different platforms. Microsoft had a really good vision in this." Gates then subtly reminded Schatzker the point of their interview was to talk about the work of the Bill and Melinda Gates Foundation.
But before shifting back to that topic, Schatzker worked in another couple of questions, notably whether Microsoft should be a player the way Apple is looking to become (and Google already is) with its digital wallet. "Certainly Microsoft should do as well or better but of all the things that Microsoft needs to do in terms of making people more productive in their work, helping them communicate in new ways, it's a long list of opportunities," he said. "Microsoft has to innovate and taking Office and making it dramatically better would be really high on the list. That's the kind of thing I'm trying to help make sure they move fast on."
For those hoping Microsoft has plans in this emerging segment, there's hope: Iain Kennedy last month left Amazon.com, where he managed the company's local commerce team, to take on the new role of senior director of product management for Microsoft's new commerce platform strategy, according to his LinkedIn profile. Before joining Amazon, Kennedy spent four years at American Express.
Together with Gates' remarks, it's safe to presume that Microsoft isn't ignoring the future of digital payments and e-commerce. One sign is that Microsoft is getting ready to launch a fitness-focused smartwatch within the next few weeks. While that doesn't address e-payments, it's certainly a reasonable way to get into the game. According to a Forbes report today, the watch will work across multiple operating systems.
It's unclear what role Windows Phone will play in bringing a payments service to market, but it's looking less likely to have a starring role. And given Gates' pivot of the conversation to Office, Microsoft's positioning as a "productivity and platforms" company may not portend big plans for the e-payments market. If the company moves soon, though, it may not be too late.
Posted by Jeffrey Schwartz on 10/20/2014 at 1:49 PM
Apparently stung by his remarks last week that women shouldn't ask for raises but instead look for "karma," Microsoft CEO Satya Nadella said he is putting controls in place that will require all employees to attend diversity training workshops to ensure not just equal pay for women but opportunities for advancement regardless of gender or race.
Nadella had quickly apologized for his remarks at last week's Grace Hopper Celebration of Women in Computing in Phoenix, but apparently he's putting his money where his mouth is. Microsoft's HR team reported to Nadella that women in the U.S. last year earned 99.7 percent of what men earned at the same title and rank, according to an e-mail sent to employees Wednesday that was obtained by GeekWire.
"In any given year, any particular group may be slightly above or slightly below 100 percent," he said. "But this obscures an important point: We must ensure not only that everyone receives equal pay for equal work, but that they have the opportunity to do equal work."
Given the attention Microsoft's CEO brought to the issue over the past week, it raises the question: do women in your company earn the same amount as men for the same job title and responsibilities and have the same opportunities for advancement, or is there a clear bias? IT is an industry dominated by men, though educators are trying to convince more women to take up computer science.
Posted by Jeffrey Schwartz on 10/17/2014 at 12:59 PM
Nearly a year after launching its Hadoop-based Azure HDInsight cloud analytics service, Microsoft believes it's a better and broader solution for real-time analytics and predictive analysis than IBM's widely touted Watson. Big Blue this year has begun commercializing its Watson technology, made famous in 2011 when it came out of the research labs to appear on, and win, the television game show Jeopardy!
Both companies had a large presence at this year's Strata + Hadoop World Conference in New York, attended by 5,000 Big Data geeks. At the Microsoft booth, Eron Kelly, general manager for SQL Server product marketing, highlighted some key improvements to Microsoft's overall Big Data portfolio since last year's release of Azure HDInsight including SQL Server 2014 with support for in-memory processing, PowerBI and the launch in June of Azure Machine Learning. In addition to bolstering the offering, Microsoft showcased Azure ML's ability to perform real-time predictive analytics for the retail chain Pier One.
"I think it's very similar," in terms of the machine learning capabilities of Watson and Azure ML, Kelly said. "We look at our offering as a self-service on the Web solution where you grab a couple of predictive model clips and you're in production. With Watson, you call in the consultants. It's just a [fundamentally different] go to market versus IBM. I think we have a good advantage of getting scale and broad reach."
Not surprisingly, Anjul Bhambhri, vice president of Big Data for IBM's software group, disagreed. "There are certain applications which could be very complicated which require consulting to get it right," she said. "There's also a lot of innovation that IBM has brought to market around exploration, visualization and discovery of Big Data which doesn't require any consulting." In addition to Watson, IBM offers its InfoSphere BigInsights for Hadoop and Big SQL offerings.
As it broadens its approach with a new "data culture," Microsoft has come on strong with Azure ML, noting it shares much of the real-time predictive analytics technology behind Cortana, the new personal assistant in Windows Phone. Now Microsoft is looking to further broaden the reach of Azure ML with the launch of a new app store-type marketplace where Microsoft and its partners will offer APIs consisting of predictive models that can plug into Azure Machine Learning.
Kicking off the new marketplace, Joseph Sirosh, Microsoft's corporate VP for information management and machine learning, gave a talk at the Strata + Hadoop conference this morning. "Now's the time for us to try to build the new data science economy," he said in his presentation. "Let's see how we might be able to build that. What do data science and machine learning people do typically? They build analytical models. But can you buy them?"
Sirosh said that with the new data section of the Azure Marketplace, developers and IT pros can search for predictive analytics components. The marketplace consists of APIs developed both by Microsoft and partners. Among the APIs from Microsoft are Frequently Bought Together, Anomaly Detection, Cluster Manager and Lexicon Sentiment Analysis. Third parties selling their APIs and models include Datafinder, MapMechanics and Versium Analytics.
Microsoft's goal is to build up the marketplace for these data models. "As more of you data scientists publish APIs into that marketplace, that marketplace will become just like other online app stores -- an enormous selection of intelligent APIs. And we all know as data scientists that selection is important," Sirosh said. "Imagine a million APIs appearing in a marketplace and a virtuous cycle like this that us data scientists can tap into."
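The APIs Sirosh described are exposed as web services, so consuming a marketplace model amounts to posting a JSON payload to a scoring endpoint. The Python sketch below shows roughly what such a request body could look like; the "Inputs"/"GlobalParameters" envelope follows the general shape of Azure ML web service requests, but the endpoint, key and column names here are purely illustrative assumptions, not a real marketplace listing.

```python
import json

# Hypothetical endpoint and key -- a real Azure ML web service URL and API
# key would come from the marketplace listing itself.
ENDPOINT = "https://example.azureml.net/score?api-version=2.0"
API_KEY = "<your-api-key>"

def build_request(features):
    """Build the JSON body an Azure ML request-response service expects.

    The 'Inputs'/'GlobalParameters' envelope mirrors the general Azure ML
    web service request format; the column names are illustrative only.
    """
    return json.dumps({
        "Inputs": {
            "input1": {
                "ColumnNames": list(features.keys()),
                # One row of values, serialized as strings
                "Values": [[str(v) for v in features.values()]],
            }
        },
        "GlobalParameters": {},
    })

# Example: scoring a hypothetical "frequently bought together" model
body = build_request({"item_id": 101, "basket_size": 3})
```

From here, the body would be POSTed to the endpoint with an `Authorization: Bearer` header carrying the API key.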
Also enabling the real-time predictive analytics is support for Apache Storm clusters, announced today. Though the capability is in preview, Kelly said Microsoft is adhering to its SLAs for it. Apache Storm enables complex event processing and stream analytics, providing much faster responses to queries.
Microsoft also said it would support the forthcoming Hortonworks Data Platform, which has automatic backup to Azure Blob storage, Kelly said. "Any Hortonworks customer can back up all their data to an Azure Blob in a real low cost way of storing their data, and similarly once that data is in Azure, it makes it real easy for them to apply some of these machine learning models to it for analysis with Power BI [or other tools]."
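Under the hood, Hadoop addresses Azure Blob storage through its WASB connector, so a backup of cluster data amounts to copying it to a wasb:// URI, typically with a tool like DistCp. This hedged Python sketch merely assembles such an invocation to show the URI shape; the storage account, container and paths are placeholders, not real resources.

```python
# Sketch: building a DistCp invocation that copies HDFS data into Azure Blob
# storage via Hadoop's WASB connector. All names below are placeholders.

def wasb_uri(container, account, path):
    """Return a WASB URI: wasb://<container>@<account>.blob.core.windows.net/<path>."""
    return "wasb://{0}@{1}.blob.core.windows.net/{2}".format(
        container, account, path.lstrip("/"))

def distcp_command(src_hdfs_path, container, account, dest_path):
    # 'hadoop distcp' copies between Hadoop-compatible filesystems;
    # with the WASB connector, Blob storage is just another target.
    return ["hadoop", "distcp", src_hdfs_path,
            wasb_uri(container, account, dest_path)]

cmd = distcp_command("hdfs:///data/clickstream",
                     "backups", "mystorageacct", "clickstream/2014-10")
```

Once the data lands in Blob storage, it can be picked up by HDInsight or Azure ML for the kind of analysis Kelly describes.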
Hortonworks is also bringing HDP to Azure Virtual Machines as an Azure certified partner. This will bring Azure HDInsight to customers who want more control over it in an infrastructure-as-a-service model, Kelly said. Azure HDInsight is currently a platform as a service that is managed by Microsoft.
Posted by Jeffrey Schwartz on 10/17/2014 at 9:54 AM
In perhaps its greatest embrace of the open source Linux community to date, Microsoft is teaming up with Docker to develop Docker containers that will run on the next version of Windows Server, the two companies announced today. Currently Docker containers, which are designed to enable application portability using code developed as micro-services, can only run on Linux servers.
While Microsoft has stepped up its efforts to work with Linux and other open source software over the years, this latest surprise move marks a key initiative to make containers portable between Windows Server and Linux server environments and cloud infrastructures. It also underscores Microsoft's willingness to deepen its ties with the open source community as a key contributor to make that happen.
In addition to making applications portable, proponents say containers could someday supersede the traditional virtual machine. Thanks to their lightweight composition, containers can provide the speed and scale needed for next-generation applications and infrastructure components, including those that process Big Data and perform complex computations.
Containers have long existed, particularly in the Linux community and in offerings from third parties such as Parallels, but each had its own implementation. Docker has taken the open source and computing world by storm over the past year since the company, launched less than two years ago, released a container format that became a de facto standard for how applications can move from one platform to another running as micro-services.
Many companies have jumped on the Docker bandwagon in recent months including Amazon, Google, IBM, Red Hat and VMware, among others. Microsoft in May said it would enable Docker containers to run in its Azure infrastructure as a service cloud. The collaboration between Docker and Microsoft was a closely held secret.
Microsoft Azure CTO Mark Russinovich had talked about the company's work with Docker to support its containers in Azure during a panel at the Interop show in New York on Sept. 30 and later in an interview. Russinovich alluded to Microsoft's own effort to develop Windows containers, called Drawbridge. Describing it as an internal effort, Russinovich revealed the container technology is in use within the company and is now available to customers that run their own machine learning-based code in the Azure service.
"Obviously spinning up a VM for [machine learning] is not acceptable in terms of the experience," Russinovich said during the panel discussion. "We are figuring out how to make that kind of technology available publicly on Windows."
At the time, Russinovich was tight-lipped about Microsoft's work with Docker and the two companies' stealth effort. Russinovich emphasized Microsoft's support for Linux containers on Azure, and when pressed about Drawbridge he described it as a superior container technology, arguing its containers are more secure for deploying micro-services.
As we now know, Microsoft has been working quietly behind the scenes with Docker to enable the Docker Engine, originally architected to run only on a Linux server, to operate with Windows Server as well; the two companies are targeting the next version of Windows Server.
Microsoft is working to enable Docker Engine images for Windows Server that will be available in Docker Hub, an open source repository housing more than 45,000 Docker applications via shared developer communities. As a result, Docker images will be available for both Linux and Windows Server.
Furthermore, the Docker Hub will run in the Microsoft Azure public cloud, accessible via the Azure Management Portal and the Azure Gallery. This will allow cloud developers, including Microsoft's ISV partners, to access the images. Microsoft also said it will support Docker orchestration APIs, which will let developers and administrators manage applications across both Windows and Linux platforms using common tooling. This will provide portability across different infrastructure, such as from on-premises servers to the cloud. It bears noting the individual containers remain tied to the operating system they are derived from.
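The promise of common tooling is that the same client-side workflow drives containers on either OS, with only the base image differing. Here's a minimal Python sketch of that idea, under the assumption of illustrative image names; the eventual Windows base image name is a guess, not something either company has announced.

```python
# Sketch: one workflow, two platforms. The only platform-specific piece is
# which base image the tooling targets; image names are illustrative only.
BASE_IMAGES = {
    "linux": "ubuntu:14.04",
    "windows": "windowsservercore",  # hypothetical future Windows base image
}

def docker_run(platform, command):
    """Compose a 'docker run' invocation against the platform's base image."""
    image = BASE_IMAGES[platform]
    return ["docker", "run", "--rm", image] + list(command)

linux_cmd = docker_run("linux", ["echo", "hello"])
windows_cmd = docker_run("windows", ["cmd", "/c", "echo hello"])
```

The point, as the orchestration-API plan implies, is that scripts and management tools built around the Docker CLI and APIs shouldn't care which kernel sits underneath, even though each container image stays tied to its own OS.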
The Docker Engine for Windows Server will be part of the Docker open source project where Microsoft said it intends to be an active participant. The result is that developers will now be able to use preconfigured Docker containers in both Linux and Windows environments.
Microsoft is not saying when the Windows Server support will appear, noting the timing is in the hands of the open source community, according to Ross Gardler, senior technology evangelist for Microsoft Open Technologies. To what extent Microsoft will share the underlying Windows code is not clear. Nor would he say to what extent, if any, the work from Drawbridge will appear in this effort, other than to say the company has gained deep knowledge from that project.
"This announcement is about a partnership bringing Docker to Windows Server to ensure we have interoperability between Docker containers," Gardler said. "The underlying implementation of that is not overly important. What is important is the fact that we'll have compatibility in the APIs between the Docker containers on Linux, and the Docker container on Windows."
David Messina, vice president of marketing at Docker, said the collaboration and integration between the two companies on the Docker Hub and the Azure Gallery will lead to the merging of the best application content from both communities.
"If I'm a developer and I'm trying to build a differentiated application, what I want to focus on is a core service that's going to be unique to my enterprise or my organization and I want to pull in other content that's already there to be components for the application," Messina said. "So you're going to get faster innovation and the ability to focus on core differentiating capabilities and then leveraging investments from everybody else."
In addition to leading to faster development cycles, it appears containers will place less focus on the operating system over time. "It's less about dependencies on the operating system and more about being able to choose the technologies that are most appropriate and execute those on the platform," Microsoft's Gardler said.
Microsoft Azure Corporate VP Jason Zander described the company's reasoning and plan to support Docker in Windows Server and Azure in a blog post. Zander explained how Windows Server containers will work:
Windows Server containers provide applications an isolated, portable and resource controlled operating environment. This isolation enables containerized applications to run without risk of dependencies and environmental configuration affecting the application. By sharing the same kernel and other key system components, containers exhibit rapid startup times and reduced resource overhead. Rapid startup helps in development and testing scenarios and continuous integration environments, while the reduced resource overhead makes them ideal for service-oriented architectures.
The Windows Server container infrastructure allows for sharing, publishing and shipping of containers to anywhere the next wave of Windows Server is running. With this new technology millions of Windows developers familiar with technologies such as .NET, ASP.NET, PowerShell, and more will be able to leverage container technology. No longer will developers have to choose between the advantages of containers and using Windows Server technologies.
IDC Analyst Al Hilwa said in an e-mail that Microsoft has taken a significant step toward advancing container technology. "This is a big step for both Microsoft and the Docker technology," he said. "Some of the things I look forward to figuring out is how Docker will perform on Windows and how easy it will be to run or convert Linux Docker apps on Windows."
Posted by Jeffrey Schwartz on 10/15/2014 at 2:10 PM
Satya Nadella's comments suggesting that women shouldn't ask for pay raises or promotions have prompted outrage on social media. But to his credit, he swiftly apologized, saying he didn't mean what he said.
To be sure, Nadella's answer to the question of what advice he'd give women uncomfortable asking for a raise was, indeed, insulting. Nadella said "karma" is the best way for women to expect a salary increase or career advancement, a comment the CEO couldn't have made at a worse place: the Grace Hopper Celebration of Women in Computing in Phoenix, where he was interviewed onstage by Maria Klawe, president of Harvey Mudd College in Claremont, Calif. Even more unfortunate, Klawe is a Microsoft board member, one of the people Nadella reports to.
Here's exactly what Nadella said:
"It's not really just asking for the raise but knowing and having faith that the system will give you the right raises as you go along. And I think it might be one of the additional super powers that, quite frankly, women who don't ask for a raise have. Because that's good karma, it will come back, because somebody's going to know that 'that's the kind of person that I want to trust. That's the kind of person that I want to really give more responsibility to,' and in the long-term efficiency, things catch up."
Accentuating his poor choice of words was Klawe's immediate and firm challenge: "This is one of the very few things I disagree with you on," she responded, to rousing applause. But it also gave Klawe an opportunity to tell women how to avoid a mistake she made in the past.
Klawe explained that she was among those who could easily advocate for someone who works for her, but not for herself. She related how she got shortchanged on fair pay when she took the job of dean of Princeton University's engineering school because she didn't advocate for herself. Instead of finding out how much she was worth, when the university asked how much she wanted to be paid, she told her boss, who was a woman, "Just pay me what you think is right." Princeton paid her $50,000 less than the going rate for the position, Klawe said.
Now she's learned her lesson and offered the following advice: "Do your homework. Make sure you know what a reasonable salary is if you're being offered a job. Do not be as stupid as I was. Second, roleplay. Sit down with somebody you really trust and practice asking for the salary you deserve."
Certainly, the lack of equal pay and advancement for women has been a problem as long as I can remember. On occasion, high-profile lawsuits, often in the financial services industry, will bring it to the forefront and politicians will address it in their campaign speeches. The IT industry, perhaps even more so than the financial services industry, is dominated by men.
Nadella's apology appeared heartfelt. "I answered that question completely wrong," he said in an e-mail to employees almost immediately after making the remarks. "Without a doubt I wholeheartedly support programs at Microsoft and in the industry that bring more women into technology and close the pay gap. I believe men and women should get equal pay for equal work. And when it comes to career advice on getting a raise when you think it's deserved, Maria's advice was the right advice. If you think you deserve a raise, you should just ask. I said I was looking forward to the Grace Hopper Conference to learn, and I certainly learned a valuable lesson. I look forward to speaking with you at our monthly Q&A next week and am happy to answer any question you have."
Critics may believe Nadella's apology was nothing more than damage control. It's indeed the first major gaffe committed by the new CEO, but I'd take him at his word. Nadella, who has two daughters of his own, has encouraged employees to ask if they feel they deserve a raise. If Nadella's ill-chosen comments do nothing else, they'll elevate the discussion within Microsoft and throughout the IT industry and business world at large.
Of course, actions speak louder than words, and that's where the challenge remains.
Posted by Jeffrey Schwartz on 10/10/2014 at 3:00 PM
In its bid to replace the traditional Windows client environment with virtual desktops, VMware will release major new upgrades of its VMware Workstation and VMware Player desktop virtualization offerings in December. Both will offer support for the latest software and hardware architectures and cloud services.
The new VMware Workstation 11, the company's flagship desktop virtualization product launched 15 years ago, is widely used by IT administrators, developers and QA teams. It will support the new Windows 10 Technical Preview for enterprise and commercial IT testers and developers who want to put Microsoft's latest PC operating system through its paces in a virtual desktop environment.
Built with nested virtualization, VMware Workstation can run other hypervisors inside a VM, including Microsoft's Hyper-V and VMware's own vSphere and ESXi. In addition to running the new Windows 10 Technical Preview, VMware Workstation 11 will add support for other operating systems including Windows Server 2012 R2, Ubuntu 14.10, RHEL 7, CentOS 7, Fedora 20, Debian 7.6 and more than 200 others, the company said.
Also new in VMware Workstation 11 is support for the most current 64-bit x86 processors including Intel's Haswell (released late last year). VMware claims that based on its own testing, using Haswell's new microprocessor architecture with VMware Workstation 11 will offer up to a 45 percent performance improvement for functions such as encryption and multimedia. It will let IT pros and developers build VMs with up to 16 vCPUs, 8TB virtual disks and up to 64GB of memory. It will also connect to vSphere and the vCloud Air public cloud.
For more mainstream users, there's the new VMware Player 7. Since it's targeted at everyday users rather than IT pros and administrators, it has fewer bells and whistles, but it gains support for the current Windows 8.1 operating system, as well as continued support for Windows XP and Windows 7 in virtual desktop environments. "Our goal is to have zero-day support," said William Myrhang, senior product marketing manager at VMware.
VMware Player 7 adds support for the latest crop of PCs and tablets and will be able to run restricted VMs, which, as the name implies, are secure clients that are encrypted, password restricted and can shut off USB access. VMware said the restricted VMs, which can be built with VMware Workstation 11 or VMware Fusion 7 Pro, run in isolation between host and guest operating systems and can have time limits built in.
Posted by Jeffrey Schwartz on 10/08/2014 at 2:25 PM
Veeam today said it will offer a free Windows endpoint backup and recovery client. The move is a departure from its history of providing replication, backup and recovery software for virtual server environments. However, company officials said it is not a change in focus, which will remain on protecting server virtual machines, but rather a recognition that most organizations are not entirely virtual.
The new Veeam Endpoint Backup software will run on Windows 7 and later OSes (though not Windows RT), backing up to an internal or external disk (such as USB) or flash drive, and will also support a network-attached storage (NAS) share within the Veeam environment as a target. The company will issue a beta in the coming weeks, and the product is due to be officially released sometime next year. The surprise announcement came on the closing day of the company's inaugural customer and partner conference, VeeamON, held in Las Vegas.
Enterprise Strategy Group Analyst Jason Buffington, who follows the data protection market and has conducted research for Veeam, said offering endpoint client software was unexpected. "At first, I was a little surprised because it didn't seem congruent with that VM-centric approach to things," Buffington said. "But that's another great example of them adding a fringe utility. In this first release, while it's an endpoint solution, primarily, there's no reason you technically couldn't run it on a low-end Windows Server. I'm reasonably confident they are not going to go hog wild into the endpoint protection business. This is just their way to kind of test the code, test customers' willingness for it, as a way to vet that physical feature such that they have even a stronger stranglehold on that midsize org that's backing up everything except a few stragglers."
At a media briefing during the VeeamON conference, company officials emphasized that the company remains focused on its core business of protecting server VMs as it plots its growth toward supporting various cloud environments as backup and disaster recovery targets. Doug Hazelman, Veeam vice president of product strategy, indicated that the new client could be used for various physical servers as well. Hazelman said the company is largely looking to see how customers use the software, which can perform file-level recoveries. Furthermore, he noted that the endpoint software doesn't require any of the company's other products and vowed it would remain free as a standalone offering.
"We are not targeting this at an enterprise with 50,000 endpoints," Hazelman said. "We want to get it in the hands of the IT pros and typical Veeam customers and see how we can expand this product and see how we can grow it."
Indeed, the VeeamON event was held largely to launch a major new release of its flagship suite, to be called the Data Availability Suite v8. Many say Veeam has been the fastest-growing provider in its market since the company's launch in 2006. In his opening keynote address in the partner track, CEO Ratmir Timashev said that Veeam is on pace to post $500 million in booked revenue (non-GAAP) and is aiming to double that to $1 billion by 2018.
In an interview following his keynote, Timashev said the company doesn't have near-term plans for an initial public offering (IPO) and insisted the company is not looking to be acquired. "We're not looking to sell the company," he said. "We believe we can grow. We have proven capabilities to find the next hot market and develop a brilliant product. And when you have this capability, you can continue growing, stay profitable and you don't need to sell."
Timashev added that Veeam can reach those fast-growth goals without deviating from its core mission of protecting virtual datacenters. Extending to a new network of cloud providers will be a key enabler, according to Timashev. The new Data Availability Suite v8, set for release next month (he didn't give an exact date), will incorporate a new interface called Cloud Connect that will let customers choose from a growing network of partners who are building cloud-based and hosted backup and disaster recovery services.
The new v8 suite offers a bevy of other features including what it calls "Explorers" that can now protect Microsoft's Active Directory and SQL Server and provides extended support for Exchange Server and SharePoint. Also added is extended WAN acceleration introduced in the last release to cover replication and a feature called Backup IO, which adds intelligent load balancing.
Posted by Jeffrey Schwartz on 10/08/2014 at 11:30 AM
Hewlett-Packard for decades resisted calls by Wall Street to break itself into multiple companies, but today it heeded the call. The company said it would split into two separate publicly traded businesses next year. The two companies will leverage the existing storied brand: the PC and printing business will be called HP Inc. and the infrastructure and cloud business HP Enterprise.
Once the split is complete, Meg Whitman will lead the new HP Enterprise as CEO and serve only as chairman of HP Inc. The move comes less than a week after eBay said it would spin off its PayPal business unit into a separately traded company. Ironically, Whitman, HP's current CEO, was the longtime CEO of eBay during the peak of the dotcom bubble; eBay, too, was recently under pressure from activist investors to spin off PayPal on the belief that both companies would fare better apart.
HP has long resisted calls by Wall Street to split itself into two or more companies. The pressure intensified following the early 2005 departure of CEO Carly Fiorina, whose disputed move to acquire Compaq remains controversial to this day. The company strongly considered selling off or divesting its PC and printing businesses under previous CEO Leo Apotheker. When he was abruptly dismissed after just 11 months and Whitman took over, she continued the review but ultimately decided that "One HP" would make it a stronger company.
In deciding not to divest back in 2011, Whitman argued that remaining together gave HP more scale and put it in a stronger position to compete. For instance, she argued HP was the largest buyer of CPUs, memory, drives and other components.
Now she's arguing that the market has changed profoundly. "We think this is the best alternative," she told CNBC's David Faber in an interview this morning. "The market has changed dramatically in terms of speed. We are in a position to position these two companies for growth."
Even though the question of HP divesting parts of its business has come up over the years, today's news was unexpected, and it comes after the company was rumored to be looking at an acquisition of storage giant EMC and, earlier, cloud provider Rackspace. It's not clear how serious those talks, if there were any, ever got.
HP's decision to become smaller rather than larger reflects growing pressure on large companies to become more competitive against nimbler rivals. Many of the large IT giants have faced similar pressure: Wall Street has been pushing EMC to split itself from VMware, IBM last week completed the sale of its industry-standard server business to Lenovo, and Dell, led by its founder and CEO Michael Dell, has gone private.
Microsoft has also faced pressure to split itself up over the years, dating back to the U.S. government's antitrust case. Investors have continued to push Microsoft to consider splitting off or selling its gaming and potentially its devices businesses since former CEO Steve Ballmer announced he was stepping down last year. The company's now-controversial move to acquire Nokia's handset business for $7.2 billion and the selection of insider Satya Nadella as its new CEO have made that appear less likely. Nadella has said that Microsoft has no plans to divest any of its businesses. But HP's move shows how things can change.
Despite its own "One Microsoft" model, analysts will surely step up the pressure for Microsoft to consider its options. Yet Microsoft may have a better argument that it should keep its businesses intact, with the exception of perhaps Nokia if it becomes a drag on the company.
But as Whitman pointed out, "before a few months ago, we weren't positioned to do this. Now the time is right." And that's why never-say-never is the operative term in the IT industry.
Posted by Jeffrey Schwartz on 10/06/2014 at 11:42 AM
Microsoft may have succeeded in throwing a curve ball at the world by not naming the next version of its operating system Windows 9. But as William Shakespeare famously wrote in Romeo and Juliet, "A rose by any other name would smell as sweet." In other words, if Windows 10 is a stinker, it won't matter what Microsoft calls it.
In his First Look at the preview, Brien Posey wondered if trying to catch up with Apple's OS X had anything to do with the choice of name -- a theory that quickly occurred to many others wondering what Microsoft is up to (the company apparently didn't say why it settled on Windows 10). Perhaps Microsoft is trying to appeal to the many Windows XP loyalists?
The name notwithstanding, I too downloaded the Windows 10 Preview, a process that was relatively simple. Posey's review encapsulated Microsoft's progress in unifying the modern interface with the desktop and gave his thoughts on where Microsoft needs to go before bringing Windows 10 to market. One thing he didn't touch on is the status of the Charms feature. Introduced in Windows 8, the Charms provide shortcuts for managing your device.
In the preview, the Charms are no longer accessible with a mouse, only with touch. If you got used to invoking the Charms with a mouse, you'll have to readjust to using the Start button again, even if you continue to use the Charms with your fingers. Would you like to see Microsoft make the Charms available when using a mouse? What would be the downside?
Meanwhile, have you downloaded the Windows 10 Preview? Keep in mind, this is just the first preview, and Microsoft is looking for user feedback to decide which features make the final cut as it refines Windows 10. We'd love to hear your thoughts on where you'd like the company to refine Windows 10 so that it doesn't become a stinker.
Posted by Jeffrey Schwartz on 10/03/2014 at 12:35 PM
As the capabilities of virtual machines reach their outer limits in the quest to build cloud-based software-defined datacenters, containers are quickly emerging as their potential successor. Though containers have long existed, notably in Linux, the rise of the Docker open source container has created a standard for building portable applications in the form of micro-services. As they become more mature, containers promise portability, automation, orchestration and scalability of applications across clouds and virtual machines.
Since Docker's release as an open source container for Linux, just about every major platform company has announced support for it in their operating systems, virtual machines or cloud platforms, including IBM, Google, Red Hat, VMware and even Microsoft, which in May said it would support Linux-based Docker containers in the infrastructure-as-a-service (IaaS) component of its Azure cloud service. Docker is not available in Microsoft's platform as a service (PaaS) because the PaaS doesn't yet support Linux, though it appears only a matter of time before that happens.
"We're thinking about it," said Mark Russinovich, who Microsoft last month officially named CTO of its Azure cloud. "We hear customers want Linux on PaaS on Azure."
Russinovich confirmed that Microsoft is looking to commercialize its own container technology, code-named "Drawbridge," a library OS effort kicked off in 2008 by Microsoft Research Partner Manager Galen Hunt. In 2011 Hunt detailed a working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In the desktop prototype, Microsoft said the securely isolated library operating system instances worked via the reuse of networking protocols. In a keynote address at the August TechMentor conference (which, like Redmond magazine, is produced by 1105 Media) on the Microsoft campus, Redmond magazine columnist Don Jones told attendees about the effort and questioned its future.
During a panel discussion at the Interop conference in New York yesterday, Russinovich acknowledged Drawbridge is alive and well. While he couldn't speak to plans for the Windows client, he also stopped short of saying Microsoft plans to include it in Windows Server and Hyper-V. But he left little doubt that it's in the pipeline for Windows Server and Azure. Russinovich said Microsoft has already used the Drawbridge container technology in its new Azure-based machine learning service.
"Obviously spinning up a VM for them is not acceptable in terms of the experience," Russinovich said. "So we built with the help of Microsoft Research our own secure container technology, called Drawbridge. That's what we used internally. We are figuring out how to make that kind of technology available publicly on Windows." Russinovich wouldn't say whether it will be discussed at the TechEd conference in Barcelona later this month.
Sam Ramji, who left his role as leader of Microsoft's emerging open source and Linux strategy five years ago, heard about Drawbridge for the first time in yesterday's session. In an interview he argued that if Windows Server is going to remain competitive with Linux, it needs to have its own containers. "It's a must-have," said Ramji, who is now VP of strategy at Apigee, a provider of cloud-based APIs. "If they don't have a container in the next 12 months, I think they will probably lose market share."
Despite Microsoft's caginess on its commercial plans for Drawbridge and containers, reading between the lines it appears they're a priority for the Azure team. While talking up Microsoft's support for Docker containers for Linux, Russinovich seemed to position Drawbridge as a superior container technology, arguing its containers are more secure for deploying micro-services.
"In a multi-tenant environment you're letting untrusted code from who knows where run on a platform and you need a security boundary around that," Russinovich said. "Most cloud platforms use the virtual machines as a security boundary. With a smaller, lighter-weight secure container, we can make the deployment of that much more efficient," Russinovich said. "That's where Drawbridge comes into play."
Ramji agreed that the ability to provide secure micro-services is a key differentiator between the open source Docker and Drawbridge. "It's going to make bigger promises for security, especially for third-party untrusted code," Ramji said.
Asked if cloud platforms like the open source OpenShift PaaS, led by Red Hat, can make containers more secure, Krishnan Subramanian argued that's not their role. "They are not there to make containers more secure. Their role is for the orchestration side of things," Subramanian said. "Security comes with the underlying operating system that the container uses. If they're going to use one of those operating systems in the industry that are not enterprise ready, probably they're not secure."
Russinovich said customers do want to see Windows-based containers. Is that the case? How do you see them playing in your infrastructure and how imperative is it that they come sooner than later?
Posted by Jeffrey Schwartz on 10/01/2014 at 2:25 PM
Microsoft Research today opened its new online Prediction Lab, a move it said aims to reinvent the way polls and surveys are conducted. The new lab, open to anyone in the format of a game, seeks to deliver more accurate predictions than today's surveys can.
Led by David Rothschild, an economist at Microsoft Research and a fellow at the Applied Statistics Center at Columbia University, the Prediction Lab boasts a credible track record even prior to its launch. In one example released today, the lab predicted an 84 percent chance that Scottish voters would reject independence in the referendum on whether Scotland should secede from the United Kingdom.
The predictions, published in a blog post called A Data-Driven Crystal Ball, also included correct picks of the winners of all 15 World Cup knockout games this year. And in the 2012 presidential election, the lab called the Obama-versus-Romney results right in 50 of 51 jurisdictions (the 50 states plus Washington, D.C.). The new interactive platform, now released to the public, aims to gather more data on the sentiment of the general population.
"We're building an infrastructure that's incredibly scalable, so we can be answering questions along a massive continuum," Rothschild said in the blog post, where he described the Prediction Lab as "a great laboratory for researchers [and] a very socialized experience" for those who participate.
"By really reinventing survey research, we feel that we can open it up to a whole new realm of questions that, previously, people used to say you can only use a model for," Rothschild added. "From whom you survey to the questions you ask to the aggregation method that you utilize to the incentive structure, we see places to innovate. We're trying to be extremely disruptive."
Rothschild also explained why traditional polling technology is outdated and why new methods like the Prediction Lab's need to be researched in the era of big data. "First, I firmly believe the standard polling will reach a point where the response rate and the coverage is so low that something bad will happen. Then, the standard polling technology will be completely destroyed, so it is prudent to invest in alternative methods. Second, even if nothing ever happened to standard polling, nonprobability polling data will unlock market intelligence for us that no standard polling could ever provide. Ultimately, we will be able to gather data so quickly that the idea of a decision-maker waiting a few weeks for a poll will seem crazy."
Microsoft is hoping to keep participants engaged with its game-like polling technique, in which participants win points for accurate predictions and lose points for inaccurate ones. This week's "challenge" asks whether President Obama will name a new attorney general before Oct. 5. The second question asks if the number of U.S. states recognizing gay marriages will change next week, and the final poll asks if there will be American active combat soldiers in Syria by Oct. 5.
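Microsoft hasn't published the formula behind those points, but a common choice for this style of prediction game is a quadratic (Brier-style) scoring rule, sketched below purely as an illustrative assumption (the function name and the 100-point stake are mine, not the lab's):

```python
def score(prediction: float, outcome: bool, stake: int = 100) -> int:
    """Quadratic (Brier-style) score for a probabilistic prediction.

    prediction -- probability (0..1) the participant assigned to the event
    outcome    -- whether the event actually occurred
    Returns points won (positive) or lost (negative), scaled by `stake`.
    """
    actual = 1.0 if outcome else 0.0
    brier = (prediction - actual) ** 2       # 0 = perfect call, 1 = worst
    return round(stake * (1 - 2 * brier))    # maps the score to [-stake, +stake]

# A confident, correct call (like an 84 percent "No" forecast that comes
# true) earns most of the stake; equal confidence on a miss loses points.
print(score(0.84, True), score(0.84, False))
```

The appeal of a proper scoring rule like this is that participants maximize their expected points by reporting their true beliefs, which is the kind of incentive structure Rothschild describes experimenting with.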
Whether the Microsoft Prediction Lab will attain the status of more popular surveys such as the Gallup polls remains to be seen. But the work at Microsoft Research shows an interesting use of applied quantitative research. Though Microsoft didn't outline plans to extend the Prediction Lab, perhaps some of its technology will have implications for the company's offerings such as Cortana, Bing and even Delve, the new Office Graph technology formerly code-named "Oslo" for SharePoint and Office 365. Now in preview, Delve is built on Microsoft's FAST enterprise search technology and is designed to work across Office 365 app silos.
Posted by Jeffrey Schwartz on 09/29/2014 at 11:54 AM
Rackspace, which over the past few years tried to transform itself into the leading OpenStack cloud provider, is shifting gears. The large San Antonio-based service provider last week began emphasizing a portfolio of dedicated managed services that let enterprises run their systems and applications on their choice of virtual platforms -- Microsoft's Hyper-V, VMware's ESX or the open source OpenStack platform.
The new Hyper-V-based managed services include for the first time the complete System Center 2012 R2 stack, including Windows Server and Storage Spaces. The orchestrated services are available now as dedicated single-tenant managed services for testing and are set for general availability in the United States in November, followed by the United Kingdom and Sydney in the first quarter of next year. Insiders had long pushed the company to offer Hyper-V-based managed services, but until recently those efforts met resistance from leadership that wanted to stick with VMware for hosted services.
Months after Rackspace CEO Lanham Napier stepped down in February and the company retained Morgan Stanley to seek a buyer in May, the company last week said it didn't find a buyer and decided to remain independent, naming Rackspace President Taylor Rhodes as the new CEO. A day later, the company said it's no longer just emphasizing OpenStack. Instead, the company is promoting its best-of-breed approach with Hyper-V, VMware and OpenStack.
I caught up with CTO John Engates at a day-long analyst and customer conference in New York, where he explained the shift. "The vast majority of IT is still done today in customer-run datacenters by IT guys," Engates explained. "The small fraction of what's going to the cloud today is early stage applications and they're built by sometimes a sliver of the IT organization that's sort of on the bleeding edge. But there are a lot of applications that still move to the cloud as datacenters get old, as servers go through refresh cycles. But they won't necessarily be able to go to Amazon's flavor of cloud. They will go to VMware cloud, or a Rackspace-hosted VMware cloud, or a Microsoft-based cloud."
In a way, Rackspace is going back to its roots: the company was born as a hosting provider, and only a few years ago decided to stake its growth on competing with large cloud providers by building out its entire cloud infrastructure on OpenStack as an alternative to Amazon Web Services, with the added benefit of its so-called "fanatical support." Rackspace codeveloped OpenStack with NASA as an open source means of offering portable, Amazon-compatible cloud services. While it continued to offer other hosted services, including Microsoft-centric Exchange, Lync and SharePoint managed services, that was a relatively small portion of its business, ran only on VMware and remained in the shadow of Rackspace's OpenStack push.
A report released by 451 Research shows OpenStack, though rapidly growing, accounts for $883 million of the $56 billion market for cloud and managed services. "Rackspace for a long time was seen part and parcel of the Amazon world with public cloud and they're clearly repositioning themselves around the fastest growing, most important part of the market, which is managed cloud and private cloud," said 451 Research Analyst and Senior VP Michelle Bailey. "They can do cloud with customer support, which is something you don't typically get with the larger public providers. They have guarantees around availability, and they'll sign a business agreement with customers, which is what you'll see from traditional hosting and service providers."
Like others, Bailey said there's growing demand for managed services based on Hyper-V-based single-tenant servers. "With the Microsoft relationship they're able to provide apps," she said. "So you're able to go up the stack. It's not just the infrastructure piece that you're getting with Microsoft but specifically Exchange, SharePoint and SQL Server. These are some of the most commonly used applications in the market and Microsoft has made it very good for their partners to be able to resell those services now. When you get into an app discussion with a customer, it's a completely different discussion."
Jeff DeVerter, general manager for the Microsoft private cloud practice at Rackspace, acknowledged it was a multi-year effort to get corporate buy-in for offering services on Hyper-V and ultimately the Microsoft Cloud OS stack. "I had to convince the senior leadership team at Rackspace that this was the right thing to do," said DeVerter. "It was easier to do this year than it was in previous years because we were still feeling our way from an OpenStack perspective. If you look at the whole stack that is Microsoft's Cloud OS, it really is a very similar thing to what the whole stack of OpenStack is. Rackspace has realized the world is not built on OpenStack because there really are traditional enterprise applications [Exchange, Lync and SharePoint] that don't fit there. They're not written for the OpenStack world."
DeVerter would know: he came to the company six years ago as a SharePoint architect and helped grow the SharePoint, Exchange and ultimately Lync business to $50 million in revenue. Aiding that growth was the February 2012 acquisition of SharePoint 911, whose principals, Shane Young and Todd Klindt, helped make the case for moving those platforms from VMware to Hyper-V.
Posted by Jeffrey Schwartz on 09/26/2014 at 7:46 AM
Within moments of last week's news that Larry Ellison has stepped down as Oracle's CEO to become CTO, social media lit up. Reaction such as "whoa!" and "wow!" preceded every tweet or Facebook post. In reality, it seemed like a superficial change in titles.
For all intents and purposes, the new CEOs, Mark Hurd and Safra Catz, were already running the company, while Ellison had final say in technical strategy. Hence it's primarily business as usual with some new formalities in place. Could it be a precursor to some bombshell in the coming days and weeks? We'll see but there's nothing obvious to suggest that.
It seems more likely Ellison will fade away from Oracle over time, rather than have a ceremonial departure like Bill Gates did when he left Microsoft in 2008. Don't be surprised if Ellison spends a lot more time on his yacht and on Lanai, the small island he bought in 2012 near Hawaii that he is seeking to make "the first economically viable, 100 percent green community," as reported by The New York Times Magazine this week.
For now, Hurd told The Times that "Larry's not going anywhere." In fact, Hurd hinted that Ellison, despite directing his wrath at SAP and Salesforce.com over the past decade, may revisit his old rivalry with Microsoft, where he and Scott McNealy, onetime CEO of Sun Microsystems (ironically, now a part of Oracle), fought hard and tirelessly to end Microsoft's Windows PC dominance. They did so in lots of public speeches, lawsuits and a strong effort to displace traditional PCs with their network computers, which were ahead of their time. Ellison also did his part in helping spawn the Linux server movement by bringing new versions of the Oracle database to the open source platform first, and only much later to Windows Server.
While Oracle fiercely competed with Microsoft in the database market and the Java-versus-.NET battle over the past decade, with little left to fight about, Ellison has largely focused his ire on IBM, Salesforce and SAP. Nevertheless, Oracle's agreement with Microsoft last year to support native Java, the Oracle database and virtual machines on the Microsoft Azure public cloud was intriguing, given what an unlikely move it once would have seemed.
Now it appears Ellison, who years ago mocked cloud computing, has Azure envy, given Azure's comparatively built-out PaaS portfolio. Hurd told The Times that Oracle is readying its own platform as a service (PaaS) aimed at competing with Azure and SQL Server, which Ellison will unveil in a keynote at next week's Oracle OpenWorld conference in San Francisco. Suggested (but not stated) was that Ellison will try to pitch it as optimized for Microsoft's .NET platform.
Oracle's current pact with Microsoft is primarily focused on Azure's infrastructure-as-a-service (IaaS) offerings, not PaaS. Whether Oracle offers a serious alternative to running its wares on IaaS remains to be seen. If Oracle indeed aims to compete with Microsoft on the PaaS front, it's likely to be an offering that gives existing Oracle customers, who would never port their database and Java apps to Azure, an alternative cloud.
However Ellison positions this new offering, unless Oracle has covertly been building a few dozen datacenters globally that match the scale and capacity of Amazon, Azure, Google and Salesforce.com -- which would set off a social media firestorm -- it's more likely to look like a better-late-than-never service well suited to, and awaited by, many of Oracle's customers.
Posted by Jeffrey Schwartz on 09/24/2014 at 11:14 AM
Apple today said it has sold 10 million of its new iPhones over the first three days since they arrived in stores and at customers' doorsteps Friday. This exceeds analysts' and the company's forecasts. In the words of CEO Tim Cook, sales of its new iPhone 6 and iPhone 6 Plus models have led to the "best launch ever, shattering all previous sell-through records by a large margin."
Indeed, analysts note that the figures are impressive, especially considering the phones haven't yet shipped in China, where there's large demand for them. Whether the initial results will reinvigorate the iPhone and iOS, in a market where Apple has lost leadership share to Android, remains to be seen. But the company's unwillingness to deliver a larger phone earlier appears to have cost it share among those with pent-up demand for larger smartphones. Most Android phones and Windows Phones are larger than the previous 4-inch iPhone 5s, and the majority of devices these days are even larger than 4.5 inches.
The company didn't break out sales of the bigger iPhone 6 versus the even larger 5.5-inch iPhone 6 Plus, but some believe the latter may have had an edge even though it's scarcer. If the sales Apple reported are primarily from existing iPhone users, that will only stabilize its existing share, not extend it. However, as the market share for Windows Phone declines, demand for the iPhone will grow on the back of features such as Apple Pay and the forthcoming iWatch that saturate the media. That won't help the market for Microsoft-based phones (and could bring some Android users back to Apple).
It doesn't appear the new Amazon Fire phones, technically Android-based devices, are gaining meaningful share. Meanwhile, BlackBerry is readying its first new phone since the release of the BlackBerry 10 last year. Despite minuscule market share, BlackBerry CEO John Chen told The Wall Street Journal that the 4.5-inch display on its new Passport will help make it an enterprise-grade device targeted at productivity. It will also boast a battery that can power the phone for 36 hours and a large antenna designed to provide better reception. In addition to enterprise users, the phone will be targeted at medical professionals.
With the growing move by some to so-called phablets, which the iPhone 6 Plus arguably is (some might say devices that are over 6 inches better fit that description), these larger devices are also expected to cut into sales of 7-inch tablets. In Apple's case, that includes the iPad mini. But given the iPad Mini's price and the fact that not all models have built-in cellular connectivity, the iPhone 6 Plus could bolster Apple more than hurt it.
Despite Microsoft's efforts to talk up improvements to Windows Phone 8.1 and its emphasis on Cortana, it appears the noise from Apple and the coverage surrounding it is all but drowning it out. As the noise from Apple subsides in the coming weeks, Microsoft will need to step up the volume.
Posted by Jeffrey Schwartz on 09/22/2014 at 12:03 PM
Microsoft earlier this week said two more longtime board members are stepping down, and the company has already named their replacements, whose appointments take effect Oct. 1.
Among them are David Marquardt, a venture capitalist who was an early investor in Microsoft, and Dina Dublon, the onetime chief financial officer of J.P. Morgan. Dublon was on Microsoft's audit committee and chaired its compensation committee, The Wall Street Journal noted. The paper also raised an interesting question: Will losing the two board members now, and others over the past two years, result in a gap in "institutional knowledge"? Marquardt, with his Silicon Valley ties, played a key role in helping Microsoft "get off the ground and is a direct link to the company's earliest days."
The two new board members are Teri List-Stoll, chief financial officer of Kraft Foods, and Visa CEO Charles Scharf, who once ran J.P. Morgan Chase's retail banking and private investment operations. Of course, the moves come just weeks after former CEO Steve Ballmer stepped down from the board.
Nadella will be reporting to a board with six new members out of a total of 10 since 2012. Indeed that should please Wall Street investors who were pining for new blood during last year's search for Ballmer's replacement and didn't want to see an insider get the job.
But the real question is whether these new voices, teamed with Nadella, will strike the balance Microsoft needs to thrive in the most fiercely competitive market it has faced to date.
Posted by Jeffrey Schwartz on 09/18/2014 at 3:32 PM
IBM's New M5 Servers Include Editions for Hyper-V and SQL Server
In what could be its last major rollout of new x86 systems if the company's January deal to sell its commodity server business to Lenovo for $2.3 billion goes through, IBM launched the new System x M5 line. The new lineup of servers includes systems designed to operate the latest versions of Microsoft's Hyper-V and SQL Server.
IBM said the new M5 line offers improved performance and security. With a number of models for a variety of solution types, the new M5 includes various tower, rack, blade and integrated systems that target everything from small workloads to infrastructure for private clouds, big data and analytic applications. The two systems targeting Microsoft workloads include the new IBM System x Solution for Microsoft Fast Track DW for SQL Server 2014 and an upgraded IBM Flex System Solution for Microsoft Hyper-V.
The IBM System x Solution for Microsoft SQL Data Warehouse on X6 is designed for data warehouse workloads running the new SQL Server 2014, which shipped in April. IBM said it offers rapid response to data queries and enables scalability as workloads increase. Specifically, the new systems are powered by Intel Xeon E7-4800/8800 v2 processors. IBM said the new systems offer 100 percent faster database performance than the prior release, with three times the memory capacity and a third of the latency of PCIe-based flash. The systems can also pull up to 12TB of flash memory-channel storage close to the processor.
As a Microsoft Fast Track partner, IBM also added its IBM System x3850 X6 Solution for Microsoft Hyper-V. Targeting business-critical systems, the two-node configuration uses Microsoft Failover Clustering, aimed at eliminating any single point of failure, according to IBM's reference architecture. The Hyper-V role is installed on each clustered server, which hosts the virtual machines.
Unitrends Adds Reporting Tools To Monitor Capacity and Storage Inventory
Backup and recovery appliance supplier Unitrends has added new tools to track storage inventory and capacity, designed to help administrators more accurately gauge their system requirements and lower costs.
The set of free tools provides views of storage, file capacity and utilization. Unitrends said they let administrators calculate how much storage they need to make available for backups and prioritize which files get backed up, based on how much storage is available. The tools provide single point-in-time snapshots of storage and files distributed throughout organizations' datacenters.
There are two separate tools. The Unitrends Backup Capacity Tool provides the snapshot of all files distributed throughout an organization on both storage systems as well as servers, providing file-level views of data to plan backups. It provides reports for planning, while outlining file usage. The other, the Unitrends Storage Inventory Tool, provides a view of an organization's entire storage infrastructure, which the company says offers detailed inventories of storage assets in use and where there are risks of exceeding available capacity.
These new tools follow July's release of BC/DR Link, a free online tool that helps organizations create global disaster recovery plans. It includes 1GB of centralized storage in the Unitrends cloud, where customers can store critical documents.
CloudLink Adds Microsoft BitLocker To Secure Workloads Running in Amazon
Security vendor CloudLink's SecureVM, which provides Microsoft BitLocker encryption, has added Amazon Web Services to its list of supported cloud platforms. The tool lets customers implement the native Windows encryption to their virtual machines -- both desktop and servers -- in the Amazon cloud. In addition to Amazon, it works in Microsoft Azure and VMware vCloud Air, among other public clouds.
Because virtual and cloud environments naturally can't utilize the BitLocker encryption keys typically stored on TPM and USB hardware, SecureVM emulates that functionality by providing centralized management of the keys. It also lets customers encrypt their VMs outside of AWS.
Customers can start their VMs only in the intended environment: the tool is policy based, ensuring VMs launch only when authorized. It also provides an audit trail tracking when VMs are launched, along with information such as IP addresses, hosts and operating systems.
Data volumes designated to an instance are encrypted, allowing customers to encrypt other data volumes, according to the company. Enterprises maintain control of encryption key management including the option to store keys in-house.
Posted by Jeffrey Schwartz on 09/18/2014 at 12:09 PM
As the IT industry looks at the future of the virtual machine, containers have jumped out as the next big thing, and every key player with an interest in the future of the datacenter has rallied around Silicon Valley startup Docker. That includes IBM, Google, Red Hat, VMware and even Microsoft. Whether cause or effect, big money is following Docker as well.
Today Sequoia Capital pumped $40 million into Docker in Series C funding, bringing the company's total funding to $66 million and its estimated valuation to $400 million. Early investors in Docker include Benchmark, Greylock Partners, Insight Ventures, Trinity Ventures and Yahoo cofounder Jerry Yang. Docker's containers aim to move beyond the traditional virtual machine with an open source platform for building, shipping and running distributed applications.
As Docker puts it, the limitation of virtual machines is that they include not only the application, but the required binaries, libraries and an entire guest OS, which could weigh tens of gigabytes, compared with just a small number of megabytes for the actual app. By comparison, the Docker Engine container consists of just the application and its dependencies.
"It runs as an isolated process in user space on the host operating system, sharing the kernel with other containers," according to the company's description. "Thus, it enjoys the resource isolation and allocation benefits of VMs but is much more portable and efficient."
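That description can be made concrete with a minimal, hypothetical Dockerfile (the base image and `app.py` below are illustrative assumptions, not from the article): the image carries only the app and its declared dependencies, while the kernel is shared with the host.

```dockerfile
# Base layer: a minimal userland, not a full guest OS with its own kernel
FROM ubuntu:14.04

# Only the application's own dependencies are layered into the image
RUN apt-get update && apt-get install -y python

# The app itself -- megabytes, versus tens of gigabytes for a full VM image
COPY app.py /opt/app.py

# Runs as an isolated process on the host's shared kernel
CMD ["python", "/opt/app.py"]
```

Because every instruction adds only a filesystem layer, the resulting image stays small and portable across any host running a compatible kernel.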
With the release of Docker 1.0 in June, Microsoft announced support for the Linux-based containers by updating the command-line interface in Azure, allowing customers to build and deploy Docker-based containers there. Microsoft also said customers can manage the virtual machines with the Docker client. As reported here by John Waters, Microsoft's Corey Sanders, manager of the Azure compute runtime, demonstrated the capability at DockerCon at the time.
On the Microsoft Open Technologies blog, Evangelist Ross Gardler outlined how to set up and use Docker on the Azure cloud service. According to Gardler, common use cases for Docker include:
- Automating the packaging and deployment of applications
- Creation of lightweight, private PaaS environments
- Automated testing and continuous integration/deployment
- Deploying and scaling web apps, databases and backend services
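Gardler's walkthrough ran through the Azure cross-platform CLI of the time. The sketch below is an approximation of that era's workflow, with placeholder names; treat the exact subcommands, port and flags as assumptions, since the tooling has changed substantially:

```
# Create an Azure VM with the Docker daemon preinstalled via the CLI's
# docker extension (4243 was a common default daemon port at the time)
azure vm docker create my-docker-vm <ubuntu-image-name> <username> <password>

# Point a local Docker client at the new VM over TLS and run a container
docker --tls -H tcp://my-docker-vm.cloudapp.net:4243 info
docker --tls -H tcp://my-docker-vm.cloudapp.net:4243 run -d nginx
```

The notable point is the second half: once the VM is up, it is managed with the standard Docker client, just as Microsoft described.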
At VMworld last month, VMware talked up its support for Docker, saying it has teamed with the company, joined by its sister company Pivotal as well as Google, to enable their collective enterprise customers to run and manage apps in containers in public, private and hybrid cloud scenarios as well as on existing VMware infrastructure.
Containers are a technology to watch, whether you're a developer or an IT pro. The entire IT industry has embraced them (at least publicly) as the next generation of virtual infrastructure. And for now, Docker seems to be setting the agenda for this new technology.
Posted by Jeffrey Schwartz on 09/16/2014 at 11:57 AM
Apple today said preorders of its new iPhone 6 and its larger sibling, the 6 Plus, totaled a record 4 million in the first 24 hours, which doubled the preorders that the iPhone 5 received two years ago. But those numbers may suggest a rosier outlook than they actually portend.
Since Apple didn't release similar figures for last year's iPhone 5s -- for the most part an incremental upgrade over its then year-old predecessor -- or for the lower-end 5c, the numbers suggest customers have been sitting on aging iPhones. That includes earlier models now reaching the point of sluggishness, thanks to iOS upgrades running on slower processors and the fact that they can only run on 3G networks.
Perhaps boosting demand was the fact that, for the first time, Apple offered an attractive promotion: a $200 trade-in for an older iPhone in working condition. For now, that offer is only good through this Friday, but it wouldn't be surprising if Apple extended it or reintroduced it through the holiday season.
Also, while I visited a local Verizon-owned-and-operated store on Saturday, I noticed a few customers interested in trading their larger Android phones for the new 6 Plus. But most of the orders Apple reported must have come from online, because the vast majority of customers in the store were looking at Android-based phones. Those wanting a new iPhone, especially the larger one, will have to wait a month or more, although Apple always seems to play up those supply shortages early on when releasing new devices.
Many customers these days are less loyal to any one phone platform and are willing to switch if another has hardware specs that meet their needs -- perhaps its size, the camera or even the design. For example, I observed one woman who wanted to replace her damaged iPhone with an iPhone 6 Plus, but when the rep told her she'd have to wait a few weeks, she said she'd just take an Android phone since she needed a new one right away. I saw another man warning his son that if he switched to an Android phone, he'd lose all his iOS apps. The teenager was unfazed and also bought an Android phone.
Meanwhile, no one was looking at the Nokia Lumia 928s or Icons and the store employees told me that they sell few Windows Phones, which they estimated accounted for less than 5 percent of phone sales. Perhaps that will change, following Microsoft's deal to acquire Mojang, purveyor of the popular Minecraft game? That's a discussion for my post about today's announcement by Microsoft to acquire Mojang for $2.5 billion.
For those who did purchase a new iPhone, it appears there was more demand for the larger unit with the 5.5-inch display than for its junior counterpart, which measures 4.7 inches (still larger than the earlier 5/5s models), though Apple didn't provide a breakdown. If you're an iPhone user, do you plan to upgrade to a newer one or switch to another platform? What size do you find most appealing?
Posted by Jeffrey Schwartz on 09/15/2014 at 12:08 PM
Microsoft has pulled the trigger on a $2.5 billion deal to acquire Mojang, the developer of the popular Minecraft game. Rumors that a deal was in the works surfaced last week, though the price tag was initially said to be $2 billion. It looks like the founders of the 5-year-old startup squeezed another half-billion dollars out of Microsoft over the weekend.
Minecraft is the best-selling game on Microsoft's popular Xbox platform. At first glance, this move could be a play to make it exclusive to Microsoft's gaming system and keep it out of the hands of the likes of Sony. It could even signal a bid to boost Microsoft's declining Windows Phone business, or its Windows PC and tablet software. However, if you listen to Xbox head Phil Spencer and Microsoft's push to support all device platforms, that doesn't appear to be the plan.
"This is a game that has found its audience on touch devices, on phones, on iPads, on console and obviously its true home on PC. Whether you're playing on an Xbox, whether you're playing on a PlayStation, an Android or iOS device, our goal is to continue to evolve with and innovate with Minecraft across all those platforms," Spencer said in a prerecorded announcement on his blog.
If you consider CEO Satya Nadella's proclamation in July that Microsoft is the "productivity and platforms company" and it spent more than double what it cost to acquire enterprise social media company Yammer on Mojang, it may have you wondering how this fits into that focus. In the press release announcing the deal, Nadella stated: "Minecraft is more than a great game franchise -- it is an open world platform, driven by a vibrant community we care deeply about, and rich with new opportunities for that community and for Microsoft."
That could at least hint that the platform and community Mojang's founders created could play a role in UI design that doesn't rely on Windows or even Xbox. Others have speculated that this is a move to make Microsoft's gaming business ripe for a spinoff or sale, something investors want but a move Nadella has indicated he is not looking to make.
"The single biggest digital life category, measured in both time and money spent, in a mobile-first world is gaming," Nadella said in his lengthy July 10 memo, announcing the company's focus moving forward. "We also benefit from many technologies flowing from our gaming efforts into our productivity efforts --core graphics and NUI in Windows, speech recognition in Skype, camera technology in Kinect for Windows, Azure cloud enhancements for GPU simulation and many more. Bottom line, we will continue to innovate and grow our fan base with Xbox while also creating additive business value for Microsoft."
The deal also makes sense from another perspective: Minecraft is hugely popular, especially with younger people -- a demographic that is critical to the success of any productivity tool or platform. Clearly Nadella is telling the market, and especially critics, that it's not game over for Microsoft.
Posted by Jeffrey Schwartz on 09/15/2014 at 12:24 PM
Hewlett Packard's surprising news that it has agreed to acquire Eucalyptus potentially throws a monkey wrench into Microsoft's recently stepped-up push to enable users to migrate workloads from Amazon Web Services to the Microsoft Azure public cloud.
As I noted last week, Microsoft announced its new Migration Accelerator, which migrates workloads running on the Amazon Web Services cloud to Azure. It's the latest in a push to accelerate its public cloud service, which analysts have recently said is gaining ground.
By acquiring Eucalyptus, HP gains a tool sanctioned by Amazon for running AWS-compatible workloads in its private cloud service. Eucalyptus signed a compatibility pact with Amazon in March 2012 that allows it to use Amazon APIs, including Amazon Machine Images (AMIs), in its open source private cloud operating system software.
The deal, announced late yesterday, also means Eucalyptus CEO Marten Mickos will become general manager of HP's Cloud services and will report to HP Chairman and CEO Meg Whitman. Mickos, the onetime CEO of MySQL, has become a respected figure in infrastructure-as-a-service (IaaS) cloud circles. But the move certainly raised eyebrows.
Investor and consultant Ben Kepes in a Forbes blog post questioned whether Eucalyptus ran out of money and was forced into a fire sale or if the acquisition was a desperate move by HP to give a push to its cloud business. HP has had numerous management and strategy shifts with its cloud business.
"One needs only to look at the employee changes in the HP cloud division," Kepes wrote. "Its main executive, Biri Singh, left last year. Martin Fink has been running the business since then and now Mickos will take over -- he'll apparently be reporting directly to CEO Meg Whitman but whether anything can be achieved given Whitman's broad range of issues to focus on is anyone's guess."
Ironically, on Redmond magazine's sister site Virtualization Review, well-known infrastructure analyst Dan Kusnetzky had just talked with Bill Hilf, senior vice president of product and service management for HP Cloud, who shared his observations on HP's new Helion cloud offering prior to the Eucalyptus deal announcement. Helion is built on OpenStack, though HP also has partnerships with Microsoft, VMware and CloudStack.
"The deal represents HP's recognition of the reality that much of what enterprise developers, teams and lines of business do revolves around Amazon Web Services," said Al Sadowski, a research director at 451 Research.
"HP clearly is trying to differentiate itself from Cisco, Dell and IBM by having its own AWS-compatible approach," Kusnetzky added. "I'm wondering what these players are going to do once Eucalyptus is an HP product. Some are likely to steer clients to OpenStack or Azure as a way to reduce HP's influence in their customer bases."
It also raises questions about the future of Helion, which, despite HP's partnerships, emphasizes OpenStack -- something Mickos has been a longtime and vocal supporter of. "We are seeing the OpenStack Project become one of the largest and fastest growing open source projects in the world today," Mickos was quoted as saying in an HP blog post.
Hmm. According to a research report released by 451 Research, OpenStack accounted for $883 million of IaaS revenues, or about 13 percent. That figure is forecast to roughly double to $1.7 billion of a projected $10 billion market in 2016, or 17 percent. Not trivial, but not a huge chunk of the market either.
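For perspective, the market sizes those percentages imply work out as follows (back-of-the-envelope arithmetic only; the roughly $6.8 billion current-market figure is derived from the numbers above, not stated in the report):

```python
# Back-of-the-envelope arithmetic on the 451 Research figures cited above.
# The implied current total IaaS market is derived, not from the report.

openstack_revenue = 0.883   # OpenStack IaaS revenue, in billions of dollars
openstack_share = 0.13      # roughly 13 percent of all IaaS revenue

# Implied total IaaS market today: $883M / 13% is roughly $6.8B
implied_iaas_market = openstack_revenue / openstack_share

openstack_2016 = 1.7        # forecast OpenStack revenue, $B
iaas_market_2016 = 10.0     # forecast total IaaS market, $B
share_2016 = openstack_2016 / iaas_market_2016

print(round(implied_iaas_market, 1))   # 6.8
print(round(share_2016 * 100))         # 17
```

In other words, even a doubling of OpenStack revenue tracks an overall market growing from roughly $6.8 billion to $10 billion, which is why the share gain is modest.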
HP clearly made this move to counter IBM's apparent headway with its cloud service, even though it appears the three front runners are Amazon, Microsoft and Google. In order to remain a player, HP needs to have compatibility with all three, as well as OpenStack, and acquiring Eucalyptus gives it a boost in offering Amazon compatibility, even if it comes at the expense of its server business, as noted by The New York Times.
In my chats with Mickos over the years, Eucalyptus hadn't ruled out Azure compatibility, but he admitted it hadn't gone very far the last time we spoke, over a year ago. Time will tell if this becomes a greater priority for both companies.
Regardless, 451 Research's Sadowski indicated that HP's move likely targets IBM more than Microsoft. "HP is hoping to capture more of the enterprise market as organizations make their way beyond Amazon (and Azure) and build out their private and hybrid cloud deployments," he said. "We would guess that in acquiring Eucalyptus, the company is seeking to replicate the story that IBM has built with its SoftLayer buy and simultaneous OpenStack support."
Posted by Jeffrey Schwartz on 09/12/2014 at 12:51 PM
Apple's much-anticipated launch event yesterday "sucked the air out of the room" during the opening keynote session at the annual Tableau Customer conference, taking place in Seattle, as my friend Ellis Booker remarked on Facebook yesterday.
Regardless of how you view Apple, the launch of its larger iPhone 6 and 6 Plus models, along with the new payment service and smartwatch, was hard to ignore. Positioned as an event on par with the launches of the original Macintosh 30 years ago, the iPod, the iPhone and the iPad, the new Apple Watch and Apple Pay made for the largest launch event the company has staged in over four years, and arguably the largest number of new products showcased at a single Apple event. Despite all the hype, it remains to be seen whether it will be remembered as being as disruptive as Apple's prior launches. Yet given the company's history, I wouldn't quickly dismiss the potential of what Apple announced yesterday.
The Apple Watch was the icing on the cake for many fans looking for the company to show it can create new markets. But critics were quickly disappointed when they learned the new Apple Watch will only work if linked to a new iPhone (or last year's iPhone 5s). Many were surprised to learn that it will be some time before the component circuitry providing all of the communications functionality of a phone or tablet can be miniaturized into a watch.
This is not dissimilar to other smartwatches on the market. But this technology is not yet the smartwatch Dick Tracy would get excited about. No one knows if the Apple Watch (or any smartwatch) will be the next revolution in mobile computing or if it will be a flop. And even if it is the next big thing, it isn't a given that Apple, or any other player, will own the market.
Yet Apple does deserve some benefit of the doubt. Remember when the first iPod came out and it only worked with a Mac? Once Apple added Windows compatibility, everything changed. Likewise, when the first iPhone came out in 2007, it carried a carrier-subsidized price tag of $599. After few sold, Apple quickly cut the price to $399, but it wasn't until the company offered $199 iPhones that the smartphone market took off. It's reasonable to expect there will be affordable smartwatches if the category does become a mass market. I'd think the sweet spot will be under $150, but who knows for sure.
While Apple was expected to introduce a watch, the variety of watches it will offer was surprising. Those pining for an Apple Watch will have to wait until early next year but it's hard to see these flying off the shelves in their current iteration. I am less certain Apple will open up its watch to communicate with Android and Windows Phones, though that would open the market for its new devices just as adding Windows support to its iPods did.
Still, it's looking more likely that those with Android and Windows Phones will choose watches running on the same platforms. Indeed, there are a number of Android-based alternatives, such as Motorola Mobility's new Moto 360 for $249 and the LG G Watch, available now for $179 in the Google Play store. They too require connectivity to an Android phone.
For its part, Microsoft is planning its own smartwatch with a thin form factor resembling many of the fitness-oriented watches. It will have 11 sensors and will be cross platform, according to a Tom's Hardware report.
As I noted, the Apple Watch was icing on an event intended to announce the pending availability of the two new iPhones. Arriving as expected, one measures 4.7 inches and the other is a 5.5-inch phablet that is almost the size of an iPad mini. They're poised to appeal to those who want something like Samsung's large Galaxy line of phones but aren't enamored of Android. The new iPhones will also put pressure on Microsoft to promote its large 6-inch Nokia Lumia 1520, which sports a 20 MP camera and a 1920x1080 display. Though Apple says its camera has only 8 megapixels, the company emphasized a new burst mode (60 fps) that can intelligently pull the best photo from a series of images. The new iPhone 6 Plus also has a 1920x1080 display and starts at $299 for a 16GB model (the standard iPhone 6 is still $199).
Besides some other incremental improvements in the new iPhones, perhaps the most notable new capability is their support for the company's new Apple Pay service. It will allow individuals to make payments at participating merchants using the phone's fingerprint recognition interface, called Touch ID, which works with an NFC chip on the phone that stores encrypted credit card and shipping information.
If Apple (and ultimately all smartphone suppliers) can convince customers and merchants that this is a more secure way of handling payments than traditional credit cards, we could see the dawn of a new era in how transactions are made. A number of high-profile breaches, including this week's acknowledgment by Home Depot that its payment systems were compromised, could over the long run hasten demand for this technology if it's proven to be more secure. Of course, Apple first has to convince skeptics that last week's iCloud breach was an isolated incident.
Regardless of your platform preference or if you use best of breed, we now have a better picture of Apple's next generation of wares. We can expect to hear what's in the pipeline for the iPad in the next month or two. Reports suggest a larger one is in the works.
Posted by Jeffrey Schwartz on 09/10/2014 at 12:07 PM
In critiquing the lack of availability of apps for Windows Phone last week, dozens of readers took issue with my complaints, effectively describing them as trivial and ill-informed. The good news is there are a lot of passionate Windows Phone users out there, but, alas, not enough -- at least for now -- to make it a strong enough No. 3 player against the iPhone and Android-based devices. Though the odds for it becoming a solid three-horse race appear to be fading, I really do hope that changes.
While I noted the increased number of apps for Windows Phone, many readers pushed back, saying I overstated how many apps are still missing. "I am not sure what all the griping is about," Sam wrote. "I personally think that this whole 'app thing' is a step backward from a decade ago when we were moving toward accessing information using just a browser. Now you buy dumb devices and apps are a must."
Charles Sullivan agreed. "I have a Windows Phone 7, which is now nearly 4 years old and I rarely think about apps," he said. "Apps, for the most part, are for people that cannot spell their own name, which is a lot of people." Regarding the lack of a Starbucks app, Sam added, "Seriously, if you need an app to buy coffee, your choice of a phone is the least of your problems." Several critics said the apps I cited were missing were indeed available. "Yeah, I took two seconds and searched on my Windows Phone and found most of the apps. Seems like zero research went into this," Darthjr said.
Actually, I did search the Windows Store for every app I listed, some on numerous occasions; I even called a few of the providers. If I overlooked some apps that are actually there, I apologize. But the truth remains that there's still a vast number of apps available for the iPhone and Android that aren't available for Windows Phone, and in many cases they aren't even in the pipeline. Just watch a commercial or look at an ad in print or online and you'll often be directed to download the app for iOS or Android.
In some instances, there are similar apps offered by different developers. "Sorry, but your examples are poor and just continue to perpetuate the notion that if you have a Windows Phone, you are helpless," reader The WinPhan lamented. "Everything you listed as your 'daily go-to's' in app usage can be found in the Windows Phone Store, maybe not by same developers/name, but with same functionality and end result, with the exception of your local newspaper and cable. Please do better research."
Acknowledging the lead iOS and Android have, Curtis8 believes the fragmentation of Android will become a bigger issue and Windows Phone will incrementally gain share. "As Windows Phone gets better, with more markets and coverage, devs will start to support it more," Curtis8 noted, adding that iPhone will likely remain a major player for some time. "But I do see Windows Phone gaining traction regardless of the numbers people try to make up. One of our biggest fights is not the consumer, it is the carriers and sales people. Walk into any store and ask about Windows Phone: selection is crap, stock is crap and the sales people will try to convince you on an iPhone or Samsung device. Many people do not feel the missing apps as much if they stay on Windows Phone and find what we have."
That's absolutely true. As a Verizon customer, I've gone into a number of company-owned stores and the story remains the same. You're lucky if you can even find Windows Phones if you're looking for them in the several Verizon Stores I've visited. Forget about it if you're not looking for one. Ask a sales rep about Windows Phone and they'll effectively say that no one's buying them. The selection of Windows Phones on AT&T is better thanks to a partnership with Nokia.
Some respondents argued that the question of apps will become a moot point as Cortana, the voice-activated feature in the latest version of Windows Phone 8.1, catches on, especially if it's also included in the anticipated new Windows 9, code-named "Threshold," as is rumored to be a possibility.
Prabhujeet Singh, who has a Nokia Lumia 1020, indicated he loves its 41-megapixel camera (indeed a key differentiator) and added: "Cortana is amazing. I tell it to set an alarm, set a reminder and when I am driving, I just speak the address. It understands my accent and I am not a native speaker of English. Do I miss apps? Nope. Not at all."
Tomorrow could very well mark an inflection point as Apple launches its next generation of mobile phones -- the larger iPhone 6 models -- and its widely anticipated iWatch. While that's a topic for another day, a report in The Wall Street Journal Saturday said the new Apple iWatch will be enabled by HealthKit, a health platform to be integrated into iOS 8. Several key health care institutions, including Memorial Sloan Kettering Cancer Center in New York, insurance giant Kaiser Permanente and the Mayo Clinic, are on board in some fashion.
If Apple has -- or can gain -- the kind of broad support from the health care industry that it had from the music industry when it introduced the iPod 13 years ago, despite a crowded market of MP3 players, it could spawn a new market. To date, others, including Google with Google Health and Microsoft with its HealthVault community, have seen only limited success. On the other hand, some argue the health care market is a much smaller fish to reel in than the huge music and entertainment market was over a decade ago. Nevertheless, Apple's success in the health and fitness market would be a boon to the company at the expense of others.
As long as Microsoft says it's committed to its mobile phone business, it would be foolish to write off the platform. Much of the future of its Nokia Lumia phone business could be tied to Microsoft's commitment to hardware (including its Surface business). As Mary Jo Foley noted in her Redmond magazine column this month: "Like the Lumia phones, the role of the Surface in the new Microsoft will be a supporting, not a starring one. As a result, I could see Microsoft's investment in the Surface -- from both a monetary and staffing perspective -- being downsized, accordingly."
Even if that turns out to be the case, it doesn't portend the end for Windows and Windows Phone, especially if it can get OEMs on board. For now, that's a big if, despite removing fees to OEMs building devices smaller than nine inches. Fortunately for Microsoft, as recently reported, the company's mobility strategy doesn't depend merely on Windows, but on its support for all platforms, as well. The good news is this market is still evolving. For now, though, as long as apps matter, despite some of its unique niceties, Windows Phone faces an uphill battle.
Posted by Jeffrey Schwartz on 09/08/2014 at 2:52 PM
Back in the 1990s, when America Online ruled the day, Microsoft's entry with MSN followed in AOL's footsteps. Microsoft is hoping its latest MSN refresh, tailored for the mobile and cloud era, will take hold, and its new interface is quite compelling.
Launched as a public preview today, the new MSN portal is a gateway to popular apps such as Office 365, Outlook.com, Skype, OneDrive and Xbox Music, as well as some outside services, notably Facebook and Twitter. The interface to those services is the Service Stripe. After spending just a short amount of time with it, I'm already considering replacing My Yahoo, my longtime default browser home page.
"We have rebuilt MSN from the ground up for a mobile-first, cloud-first world," wrote Brian MacDonald, Microsoft's corporate VP for information and content experiences, in a blog post. "The new MSN brings together the world's best media sources along with data and services to enable users to do more in News, Sports, Money, Travel, Food & Drink, Health & Fitness, and more. It focuses on the primary digital daily habits in people's lives and helps them complete tasks across all of their devices. Information and personalized settings are roamed through the cloud to keep users in the know wherever they are."
If anything, MacDonald downplayed MSN's ability to act as an interface to some of the most widely used services while providing a rich offering of its own information and access to services for personal use. The preview is available now for Windows and will come shortly to iOS and Android. Could the new MSN be the portal that gives Microsoft's Bing search engine the boost it needs and someday brings more users to Cortana?
Posted by Jeffrey Schwartz on 09/08/2014 at 2:22 PM
Microsoft is readying a tool that it says will "seamlessly" migrate physical and virtual workloads to its Azure public cloud service. A limited preview of the new Migration Accelerator, released yesterday, moves workloads to Microsoft Azure from physical machines, VMs (both VMware and Hyper-V-based) and those running in the Amazon Web Services public cloud.
The launch of the new migration tool comes as Microsoft officials are talking up the growth of its Azure cloud service at the expense of Amazon Web Services. Microsoft Technical Fellow in the Cloud and Enterprise Mark Russinovich emphasized that point in a speech at last month's TechMentor conference, which like Redmond magazine is produced by 1105 Media.
Migration Accelerator "automates all aspects of migration including discovery of source workloads, remote agent installation, network adaptation and endpoint configuration," wrote Srinath Vasireddy, a lead principal program manager for enterprise and cloud at Microsoft, in a post on the Microsoft Azure Blog yesterday. "With MA, you reduce cost and risk of your migration project."
The technology enabling the workload migrations comes from Microsoft's July acquisition of InMage, whose Scout software appliances for Windows and Linux physical and virtual instances capture data continuously as changes occur, then simultaneously perform local backups or remote replication via a single data stream. A week after announcing the acquisition, Microsoft said the InMage Scout software will be included in its Azure Site Recovery subscription licenses.
While the tool gives Microsoft a better replication story, Microsoft appears to be using the Migration Accelerator to push customers toward Azure for more than just disaster recovery and business continuity, even though DR has emerged as a popular application across public cloud usage.
For example, Vasireddy pointed to the Migration Accelerator's ability to migrate multitier production systems with application consistency orchestrated across all tiers. "This ensures multitier applications run the same in Azure, as they ran at the source," he said. "Application startup order is even honored, without the need for any manual configuration."
Vasireddy outlined in his blog post how the Migration Accelerator works and its components:
- Mobility Service: A lightweight, guest-based, centrally deployed agent installed on the source servers (on-premises physical or virtual) to be migrated to target virtual machines in Azure. It is responsible for real-time data capture and synchronization of the selected volumes from the source servers to the target servers.
- Process Server (PS): A physical or virtual server that is installed on-premises. It facilitates the communication between the Mobility Service and target virtual machines in Azure. It provides caching, queuing, compression, encryption and bandwidth management.
- Master Target (MT): A target for replicating disks of on-premises servers. It is installed within a dedicated Azure VM in your Azure subscription. Disks are attached to the MT to maintain duplicate copies.
- Configuration Server (CS): Manages the communication between the Master Target and the MA Portal. It is installed on a dedicated Azure VM in your Azure subscription. Regular synchronization occurs between the CS and MA Portal.
- MA Portal: A multitenant portal used to discover, configure protection for and migrate your on-premises workloads into Azure.
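Pieced together from the component descriptions above, the end-to-end replication path can be sketched roughly as follows. This is a simplified illustration only; the class and method names are hypothetical stand-ins, not actual InMage or Azure APIs, and zlib compression stands in for the Process Server's real compression and encryption.

```python
import zlib

# Hypothetical sketch of the Migration Accelerator data path described
# above. All names are illustrative, not real InMage/Azure APIs.

class MobilityService:
    """Agent on the source server: captures changed data in real time."""
    def capture_changes(self):
        # A real agent hooks the volume driver; we fake two changed blocks.
        return [b"block-1", b"block-2"]

class ProcessServer:
    """On-premises relay providing caching, queuing, compression and
    encryption between the agents and Azure."""
    def forward(self, blocks):
        return [zlib.compress(b) for b in blocks]  # compression stand-in

class MasterTarget:
    """Dedicated Azure VM that maintains duplicate copies of the disks."""
    def __init__(self):
        self.replica = []
    def apply(self, compressed):
        self.replica.extend(zlib.decompress(b) for b in compressed)

# End-to-end flow: source agent -> Process Server -> Master Target in Azure.
agent, ps, mt = MobilityService(), ProcessServer(), MasterTarget()
mt.apply(ps.forward(agent.capture_changes()))
print(mt.replica)  # [b'block-1', b'block-2']
```

The point of the sketch is the division of labor: the agent only captures, the Process Server only relays and transforms, and the Master Target holds the replica that the Configuration Server and MA Portal then orchestrate into a cutover.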
Microsoft's new cloud migration tool also offers automated asset discovery and migration, cutover to Azure within minutes using in-memory change tracking, target VMs that remain dormant during migration to lower compute costs, automated provisioning, lightweight agents on targets that enable continuous replication, and support for automated network adaptation and endpoint reconfiguration, he said. Those interested in testing the Migration Accelerator must sign up for the preview.
Posted by Jeffrey Schwartz on 09/05/2014 at 12:33 PM
In my monthly Redmond View column I raised the question, "Can Windows Phone Defy Odds and Gain Share?" It's a valid question given the platform's recent falloff in shipments and the lack of widespread enthusiasm for Windows Phone.
Some of you disagree but the numbers speak for themselves. Nevertheless reader John Fitzgerald feels I need to go deeper. "Your article is pretty short on specifics," he wrote. "I have over 100 apps installed, including many useful ones and have no app gap in my life."
Given that was my monthly column for the print magazine, I was constrained by the word count and couldn't elaborate. But Fitzgerald raised a valid point: If all the apps (or reasonable substitutes) you use on your iPhone or Android phone are available for Windows Phone, you're in business. Indeed, if Tyler McCabe, the millennial son of SMB Group analyst Laurie McCabe, objected to the lack of apps on the Lumia 1520 (which he evaluated and described on his mother's blog), he didn't mention it. The computer engineering student is accustomed to iPhones and Android devices.
In today's mobile consumer-driven world, apps do rule, at least for now. Looking at my iPhone, there are a number of apps I use daily that are not available on Windows Phone, or whose functionality differs. Among them are the Starbucks app, which I use to pay for purchases and manage my account, and the MLB app to track baseball scores and news. When I board flights I use my iPhone to access my boarding pass; that's currently not an option with Windows Phone, though Delta and Southwest offer apps for the major phone platforms. When I want to check train schedules, New York's MTA offers an app for iOS and Google Play but not for Windows Phone. The United States Postal Service app I use to check rates and track packages is available only for the iPhone, Android and BlackBerry. Google recently released a version of its Chrome Web browser for iOS, and as of March the company said only that it was investigating bringing it to Windows Phone as well.
An app I use throughout the day to access articles on Newsday, my local newspaper, isn't available for Windows Phone and I was told no plans are under way to develop one. Nor does Newsday's parent company, Cablevision Systems, offer one to provide mobile DVR controls and other capabilities.
The bottom line is that if I were to switch to a Windows Phone, I'd be giving up a lot of convenience I've grown accustomed to over the years. If others feel the same way, that could explain why, despite some otherwise compelling features, Windows Phone faces an uphill battle to gain substantial share. Alas, I believe that ship has sailed.
Posted by Jeffrey Schwartz on 09/03/2014 at 12:55 PM
Two key suppliers of enterprise tools yesterday said they're being acquired, and another well-known software-as-a-service provider is reportedly in play. BeyondTrust and Compuware have agreed to be acquired by private equity firms, while shares of SaaS provider Concur traded higher following rumors that several software vendors, including Microsoft, Oracle and SAP, have approached it since it hired an investment bank to gauge interest.
Veritas Capital said it will acquire cybersecurity software vendor BeyondTrust, a provider of software used to manage user access and privileges, as well as risk management assessment tools. BeyondTrust's PowerBroker software is especially popular among those looking to manage access rights in Microsoft's Active Directory.
BeyondTrust said it has 4,000 customers in a variety of industry sectors including government, technology, aerospace and defense, media/entertainment, telecommunications, healthcare/pharmaceutical, education and financial services. In addition to managing user access, BeyondTrust offers tools designed to defend against cyber attacks. Terms of the deal weren't disclosed.
Application performance management software supplier Compuware yesterday said it has agreed to be acquired by private equity firm Thoma Bravo for $2.5 billion. The deal represents a 12 percent premium over the company's closing share price on Sept. 2 and a 17 percent premium over Friday's close.
Compuware is a leading provider of APM software used to monitor the performance of mobile and enterprise applications and the underlying application infrastructure, as well as a line of network testing and management tools. The company also offers a line of mainframe management tools.
Concur, whose popular SaaS software is used to manage business travel and entertainment expenses, saw its stock price rise 6.3 percent yesterday, giving it a market cap of $6.1 billion. According to a Bloomberg report, Oracle has decided to pass, while Microsoft and SAP remain interested, though none of the companies commented on a possible acquisition.
Posted by Jeffrey Schwartz on 09/03/2014 at 12:54 PM
Steve Ballmer's decision to step down from Microsoft's board six months after "retiring" as the company's CEO (a move announced a year ago tomorrow) has once again put him in the limelight in the IT world, at least for a few days. The spotlight had already turned to Ballmer in the sports world after he shelled out an unprecedented $2 billion to purchase the Los Angeles Clippers and delivered an infamous rally cry earlier this week about how his team will light the NBA on fire.
More power to him. I hope he brings the Clippers a championship ring -- maybe they'll become a dynasty. But at the same time, Ballmer's decision to step down from Microsoft's board appears to be a good move, whatever may have actually motivated him. Some may argue it's symbolic. Even though Ballmer no longer has a board seat and its voting rights, he's still the company's largest shareholder, and if he were to align himself with activist investors, he could make his already audible voice heard. As such, it would be foolish to write him off.
Following the news of his resignation from the board, I've seen a number of tweets and commentaries saying critics of Ballmer don't appreciate all he has done for Microsoft. Indeed, Microsoft performed well during most of his 13-year reign, with consistent year-over-year revenue and profit growth, and products such as SharePoint grew from nothing into multi-billion-dollar businesses. Yet under Ballmer's watch, Microsoft seemed to lose its way, and the company that once cut off Netscape's "air supply" lost its cachet to companies like Apple and Google.
To those who think Ballmer is unappreciated: did Microsoft perform well because of him or in spite of him? Consider this: As manager of the Atlanta Braves, New York Mets and St. Louis Cardinals, Joe Torre's teams performed abysmally. Yet his four World Series rings and year-over-year playoff appearances as skipper of the Yankees brought him praise as a great manager and a place in the Hall of Fame. As CEO of Novell, Eric Schmidt was unable to return the onetime networking giant to its former glory. As CEO of Google, he was a rock star. If Novell had hired Ballmer away from Microsoft, could he have saved the company from its continued decline? We'll never know for sure.
In my view, it all started with Ballmer's ill-advised attempt to acquire Yahoo in 2008 for $45 billion. Yahoo CEO Jerry Yang's foolhardy refusal to accept that astronomically generous offer at the time probably saved Microsoft from disaster. In retrospect, perhaps Ballmer was desperate, whether he knew it or not at the time. In 2008, Microsoft was late to the party with a hypervisor, delivering the first iteration of Hyper-V even though VMware had already set the standard for server virtualization.
Also at the time, maybe he realized, consciously or otherwise, that shrugging off the iPhone a year earlier sealed Microsoft's fate in the mobile phone business. Tablets and mobile phones were a market Microsoft could have owned or at least become a major player in had there been more vision. I remember Microsoft talking up technologies like the WinPad in the mid to late '90s and making feeble attempts to offer tablets that were effectively laptops which accepted pen input.
I wasn't planning to weigh in on Ballmer stepping down from the board until I read a review of the latest Windows Phone from HTC by Joanna Stern of The Wall Street Journal. The headline said it all: "HTC One for Windows: Another Great Phone You Probably Won't Buy." While praising the new phone, which is actually based on HTC's popular Android-based device, Stern effectively said that while Windows Phones are good, so were Sony Betamaxes (that's my analogy, not hers). Stern noted IDC's latest smartphone market share report, which shows Windows Phone sitting at a mere 2.5 percent and not growing. In fact, its share declined this quarter.
That got me thinking, once again, that the missteps with Windows Phone and Windows tablets all happened on the watch of Ballmer, who also let Apple and Google dominate the tablet market. To his credit, Ballmer acknowledged earlier this year that missing that market shift was his biggest regret. That still brings little consolation. Now Ballmer's successor, CEO Satya Nadella, is trying to move Microsoft forward with the hand he was dealt. Founder and former CEO Bill Gates sits on the board and is working with Nadella closely on strategic technical matters. Does Nadella also need the person whose mess he's cleaning up (including the fiefdoms that exist in Redmond) looking over his shoulder as a board member?
No one can overlook the passion Ballmer had at Microsoft, but his "developers, developers, developers" rant only went so far. Today most developers would rather program for iOS, Android and other platforms. Hence, it's fortunate that Ballmer has found a new passion. Maybe he can do with the Clippers what he couldn't do for Microsoft.
Posted by Jeffrey Schwartz on 08/22/2014 at 12:51 PM
NetApp Adds Hybrid Cloud Storage for Azure
NetApp Inc. is now offering a private storage solution that supports Microsoft private clouds and the Microsoft Azure cloud service. The new NetApp Private Storage for Microsoft Azure is designed to provide storage for hybrid clouds, letting organizations elastically extend storage capacity from their own datacenters to the public cloud service. The offering utilizes the FlexPod with Microsoft Private Cloud solution, consisting of converged storage, compute and network infrastructure from Cisco Systems Inc. and NetApp combined with the Windows Server software stack. Organizations can create internal private clouds with the Microsoft Cloud OS, which combines Windows Server, System Center and the Windows Azure Pack. Those that require scale can use the Azure cloud service and Microsoft's new Azure ExpressRoute offering, which, through partners such as AT&T, BT, Equinix, Level 3 Communications and Verizon Communications, provides high-speed, dedicated and secure links rather than relying upon the public Internet. NetApp Private Storage for Microsoft Azure offers single- or multiple-region disaster recovery and uses the public cloud service only when failover is necessary or for planned scenarios such as testing. When used with System Center and the NetApp PowerShell Toolkit, customers can manage data mobility between the private cloud and on-site storage connected to the Azure cloud.
Netwrix Adds Compliance to Auditor
Organizations faced with meeting regulatory compliance requirements need to ensure their IT audits are in line with various standards. The new Netwrix Auditor for 5 Compliance Standards addresses five key standards: the Payment Card Industry Data Security Standard (PCI DSS), Health Insurance Portability and Accountability Act (HIPAA), Sarbanes-Oxley Act (SOX), Federal Information Security Management Act (FISMA) and Gramm-Leach-Bliley Act (GLBA). Netwrix Corp. released the new compliance tool last month. It's designed to help ensure IT passes compliance audits by accessing the mandated reports from the Netwrix Auditor AuditArchive, which the company describes as two-tiered storage that can hold data for upward of 10 years. It tracks unauthorized changes to configurations and can issue real-time alerts. It also comes with 200 preconfigured reports. In addition, it monitors user activities and audits access control changes, which helps discover and preempt theft of confidential information. It builds on the built-in auditing capabilities of widely used platforms including Active Directory, Exchange Server, file servers, SharePoint, SQL Server, VMware and Windows Server, though the company said it covers others, as well.
Advanced Systems Concepts Ties to Configuration Manager
Advanced Systems Concepts Inc. is now offering an extension to its ActiveBatch automation software for Microsoft System Center 2012 R2 Configuration Manager. Along with its other System Center extensions, the new release provides common management across on-premises, service provider and Azure environments. By integrating ActiveBatch with Configuration Manager, the company said IT organizations can reduce administration requirements and lower the overall cost of PC ownership by automating such functions as OS deployment, patch management, client device management and other systems administration functions. It's designed to eliminate the need for scripting. Among the Configuration Manager functions it offers are production-ready job steps, which include commonly used Configuration Manager objects and functions, such as creating job steps to build packages, deployment programs and folders. Other job steps let administrators modify and delete packages, as well as assign them to distribution points. ActiveBatch already supports most other key components of System Center: it has extensions for System Center Operations Manager, System Center Service Manager, System Center Orchestrator and System Center Virtual Machine Manager, as well as Active Directory, SQL Server, Exchange, Azure and Windows Server with the latest rev of Hyper-V. It also works with other key platforms including Amazon Web Services EC2 and VMware ESX.
Posted by Jeffrey Schwartz on 08/22/2014 at 11:43 AM
Microsoft will release the third version of its Virtual Machine Converter (MVMC) this fall, and the key new feature will be the return of support for physical-to-virtual (P2V) conversions. MVMC is Microsoft's free tool for migrating VMware virtual machines to Hyper-V; the 2.0 version, released earlier this year, allowed only for virtual-to-virtual (V2V) conversions, with no P2V support.
That "disappointed" some customers, said Matt McSpirit, a Microsoft senior technical product marketing manager in the company's cloud and enterprise marketing group. McSpirit joined me in a panel discussion on choosing a hypervisor at last week's TechMentor conference, held on the Microsoft campus in Redmond. While the panel covered many interesting implications of converting to Hyper-V, which I will break out in future posts, one key area of discussion was converting existing VMware VMs to Hyper-V.
Released in late March, Microsoft Virtual Machine Converter 2.0 upped the ante over the inaugural release by adding support for vCenter and ESXi 5.5, VMware virtual hardware versions 4 through 10, and Linux guest OS migration, including CentOS, Debian, Oracle, Red Hat Enterprise, SuSE Enterprise and Ubuntu, as I reported at the time. It also added an on-premises VM to Azure VM conversion tool for migrating VMware VMs directly to Azure.
Another key feature is its native PowerShell interface, enabling customers and partners to script key MVMC 2.0 tasks. These scripts can also be integrated with other automation and workflow tools such as System Center Orchestrator, among others. In addition, Microsoft has released the Migration Automation Toolkit for MVMC 2.0, which is a series of prebuilt PowerShell scripts to drive scaled MVMC 2.0 conversions without any dependency on automation and workflow tools.
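That kind of scripted, scaled conversion can be pictured with a short sketch. To be clear, the cmdlet name and parameters below are illustrative placeholders, not the real MVMC 2.0 PowerShell interface; this only shows the general shape of driving a batch of conversions from a script (written in Python here for brevity):

```python
# Hypothetical sketch of bulk-driving VM conversions from a script.
# "ConvertTo-HyperVVM" and its parameters are invented placeholders,
# not actual MVMC cmdlets.

def build_conversion_commands(vm_names, dest_host):
    """Compose one PowerShell invocation per VM for a batch run."""
    commands = []
    for vm in vm_names:
        commands.append(
            f'powershell -Command "ConvertTo-HyperVVM '   # placeholder cmdlet
            f'-SourceVM {vm} -DestinationHost {dest_host}"'
        )
    return commands

cmds = build_conversion_commands(["web01", "db01"], "hv-host01")
for c in cmds:
    print(c)
```

In practice, this is roughly what the Migration Automation Toolkit's prebuilt scripts do: iterate a list of source VMs and invoke the converter for each, with no dependency on a separate workflow engine.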
The reason for dropping P2V support in that release was apparently to emphasize that MVMC is a VM-to-VM conversion tool. During the panel discussion, McSpirit noted that P2V is, by definition, focused on physical-to-virtual rather than virtual-to-virtual conversion.
"A while back, we had within System Center a P2V capability," he explained. "So for customers who were just getting started with virtualization or have some workloads they need to convert from the physical world, we have P2V built into VMM [System Center Virtual Machine Manager]. So as an admin, you've got your Hyper-V host being managed, [you] select a physical server that will actually convert either online or offline, and VMM will handle [the conversion to a virtual machine] and bring it [into its management tool]. And that functionality was deprecated in 2012 R2 and removed. And thus, for a period of time, no P2V tool from Microsoft. Yes there's disk to VHD and some other tools but no [fully] supported, production ready tool [from Microsoft, though there are third-party tools]."
In a follow-up e-mail to clarify that point, McSpirit said that "P2V stands for physical to virtual, thus by definition, it's less focused on virtual conversion and more focused on physical to virtual, but that's not to say it can't be used for converting VMs." He added: "The P2V wizard in the previous release of System Center Virtual Machine Manager (2012 SP1) could still be pointed at an existing VMware virtual machine, after all, regardless of whether it's converting a physical machine or not, it's the OS, and it's data that's being captured, not the hardware itself. Thus, you could use P2V, to do a V2V."
Microsoft confirmed in April that P2V was planned for the fall release. "With MVMC 3, that comes in the fall, P2V is coming back into that, which pleases a lot of customers because it doesn't have that requirement for System Center, which for smaller environments is more applicable, and it enables you to perform P2Vs with a supported tool," McSpirit said.
Update: Just to clarify, MVMC never had P2V functionality. Rather it was offered in System Center 2012 SP1 Virtual Machine Manager (and earlier). When McSpirit said P2V was deprecated, he was referring to System Center 2012 R2 Virtual Machine Manager, which offered V2V only. With the MVMC 3.0 release this fall, P2V will once again be available.
The session confirmed what many already know: MVMC is a good tool for converting a handful of VMs, but it still requires manual configuration and "lacks any sort of bulk conversion mechanism (unless you want to script the conversion process through Windows PowerShell)," wrote Brien Posey in a tutorial outlining how to use MVMC 2.0 earlier this year.
But if you plan to migrate numerous VMware VMs to Hyper-V, you may want to consider third-party tools from the likes of Vision Solutions, NetApp and 5nine Software.
Are you migrating your VMware VMs to Hyper-V? If so, are you using MVMC or one of the third-party tools?
Posted by Jeffrey Schwartz on 08/20/2014 at 9:22 AM
Over the weekend I downloaded the Chrome Web browser on my iPad after it showed up as a suggested download. Somehow I forgot Chrome is now available on iOS and I was thrilled to see I could not only run the browser, but instantly have access to my bookmarks and Web browsing history from other PCs on which I work. Then I wondered if Internet Explorer would ever find its way into the iTunes App Store. After all, Office is now available on iOS, as well as other popular Microsoft offerings. However, it doesn't appear that Internet Explorer on iOS is in the cards.
Apparently, Microsoft threw cold water on that idea last week during a Reddit Ask Me Anything (AMA) chat. As reported by "All About Microsoft" blogger Mary Jo Foley, Internet Explorer is one of Redmond's offerings that the company won't deliver on iOS or Android. "Right now, we're focused on building a great mobile browser for Windows Phone and have made some great progress lately," said Charles Morris, a member of the Internet Explorer platform team, during the AMA chat. "So, no current plans for Android/iOS."
That's unfortunate, especially given the rate of growth for Windows Phone (whose share actually dropped last quarter). It appears Microsoft is still fixated on the notion that its Internet Explorer Web browser is inextricably tied to Windows.
Also, as noted by Forbes, the Internet Explorer team acknowledged during the AMA chat that Microsoft has considered renaming the browser, presumably to make it more hip. That's a questionable idea considering Internet Explorer is still the most widely used browser, especially among enterprises.
When asked about the potential for a name change, Jonathan Sampson, also on the Internet Explorer platform team, responded: "It's been suggested internally; I remember a particularly long e-mail thread where numerous people were passionately debating it. Plenty of ideas get kicked around about how we can separate ourselves from negative perceptions that no longer reflect our product today."
Asked why the company didn't just change the name, Sampson responded: "The discussion I recall seeing was a very recent one (just a few weeks ago). Who knows what the future holds." Changing the name, of course, won't do much to help the browser's reputation if it's known for the same problems it has had in the past -- namely security and privacy flaws.
If something has a very bad reputation, sometimes rebranding it is the only course of action. But as the old saying goes, "a rose is still a rose ..." Though no longer dominant the way it once was, Internet Explorer still has the largest browser market share, according to Net Applications. Even if its share continues its incremental decline, Internet Explorer is a well-known browser, and not merely for its flaws. People who don't use Microsoft products, particularly those who prefer Apple offerings or the LAMP stack, aren't going to be moved by a new name for anything coming out of Redmond. Produce a browser that has fewer flaws and advanced features and it'll be popular no matter what it's called.
Regardless of the name, would making it available on iOS and Android stem any declines on a platform that's still widely used? Or would it accelerate them? Perhaps the company sees Internet Explorer as critical to holding onto the Windows franchise. If that's the case, the company might wait. In the meantime, I look forward to using the Chrome browser on my iPad and will stick to using Internet Explorer only on Windows.
Posted by Jeffrey Schwartz on 08/18/2014 at 1:31 PM
Could there be a day when the desktop or mobile operating system you use, and that developers program to, doesn't matter? The age-old question may not be answered any time soon, and don't expect to see any of this in Microsoft's next client OS. But IT pros should prepare themselves for the possibility that Microsoft or other players may someday successfully commercialize a "library operating system," in which developers rely primarily on APIs to run their applications, not a client OS or even a virtual machine.
While researchers have bandied about the concept of a library OS for some time, Microsoft Research revealed its own work on one back in 2011 with a project called Drawbridge. Microsoft Research Partner Manager Galen Hunt joined others at the time to outline in detail their working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In this desktop prototype, they said the securely isolated library operating system instances worked via the reuse of networking protocols.
"Each instance has significantly lower overhead than a full VM bundled with an application: a typical application adds just 16MB of working set and 64MB of disk footprint," according to a paper published by the widely regarded Association for Computing Machinery (ACM) at the time. "We contribute a new ABI [application binary interface] below the library OS that enables application mobility. We also show that our library OS can address many of the current uses of hardware virtual machines at a fraction of the overheads."
Little has been said about Microsoft's library OS efforts since then. Yet three years later, Microsoft's hold on the desktop has been diminished by the rapid growth of BYOD. While Microsoft hopes to revive enthusiasm for Windows with a new offering next year, others believe it may be too little, too late. That remains to be seen, of course. Nevertheless, Microsoft could have a lot to gain by someday delivering a library OS.
Don Jones, a Microsoft MVP and a Redmond magazine columnist, who recently joined IT training firm Pluralsight, revived the idea of why a library OS could be the right approach for Microsoft to take. And he did so right in Microsoft's house. Speaking in a keynote address at the TechMentor conference this week at the Microsoft Conference Center on the Redmond campus, Jones explained why a library OS could reduce existing operating systems to the firmware layer, putting the emphasis on APIs and application delivery rather than low-level services.
"We tend to say that developers develop for a specific operating system -- iOS, Android, Windows, Macintosh or Linux," Jones told the audience of IT pros. "It's not actually true. They don't. What they are writing for is an operating environment. What they are writing for is a set of APIs."
The APIs which developers code against today are tied to a specific operating system, whether it is Apple's interfaces to iOS or Microsoft's .NET Framework. Efforts to make APIs that work across operating systems have remained unsuccessful, Jones noted.
"Even when we make efforts to duplicate those APIs for other operating systems with projects like Mono, attempting to duplicate .NET onto the Linux platform, they are rarely successful because the ties between the APIs and the operating system are so integrated," Jones said. "It's really tough to pull them apart and duplicate them elsewhere."
Jones called for a smarter BIOS so when users turn on their computers, it knows how to access the network, write to a display or disk and other basic low-level services, including CPU and memory access and storing user credentials in a secure area.
"This is going to sound crazy but if you take all of Windows and you refactor it into a bunch of APIs that talk to that low-level firmware and if a developer wants to code to those, he or she can," Jones said. "But at the same time, if someone is more comfortable programming against a Linux set of APIs, there's no reason those same APIs can't live on that same machine and run in parallel at the same time because it's not taking over the machine, it's the firmware that runs everything. These are just different ways of getting at that firmware that provides a different comfort level for developers."
In this type of scenario, Jones added: "Windows essentially becomes a set of DLLs, which essentially is not that far from what it is. So we move the operating system to a lower level and everything else is a set of APIs that run on top of that. Meaning you gain the ability to run multiple operating system personalities with APIs side by side because none of them owns the machine."
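Jones's idea can be caricatured in a few lines of code. This toy sketch (Python standing in for a real systems language, with invented names throughout, and nothing like an actual OS design) shows a single low-level "firmware" layer owning the hardware services while two API personalities live side by side as ordinary libraries on top of it:

```python
# Toy illustration of the library OS idea: one low-level layer owns the
# "hardware"; API personalities are just libraries that call into it.
# All class and method names here are invented for illustration.

class Firmware:
    """The only layer that touches 'hardware' services."""
    def __init__(self):
        self.display = []

    def write_display(self, text):
        self.display.append(text)

class WindowsPersonality:
    """A Windows-flavored API surface, implemented as a library."""
    def __init__(self, fw):
        self.fw = fw
    def MessageBox(self, text):
        self.fw.write_display(f"[win32] {text}")

class PosixPersonality:
    """A Linux/POSIX-flavored API surface over the same firmware."""
    def __init__(self, fw):
        self.fw = fw
    def write(self, text):
        self.fw.write_display(f"[posix] {text}")

fw = Firmware()
win = WindowsPersonality(fw)
nix = PosixPersonality(fw)
win.MessageBox("hello")
nix.write("hello")
print(fw.display)  # both personalities share one machine; neither owns it
```

The point of the sketch is the ownership inversion Jones describes: neither personality takes over the machine, so both can run in parallel against the same low-level layer.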
Can, or will, Microsoft bring this to market? Certainly Hunt believed it's feasible. "Our experience shows that the long-promised benefits of the library OS approach -- better protection of system integrity and rapid system evolution -- are readily obtainable," he wrote at the time.
Jones said not to expect it anytime soon, and warned it's possible it may never see the light of day. But he pointed to some meaningful reasons why Microsoft could be leaning in that direction, or at the very least could benefit by doing so sometime in the future. Perhaps most obvious is the company's key emphasis on, and investment in, building out the Microsoft Azure cloud and the Windows Server/System Center-based cloud OS that underpins it.
Jones also pointed to Microsoft's new Azure RemoteApp service, which the company announced at TechEd in May and is available in preview. Microsoft Azure RemoteApp, an application delivery service, will let IT deliver Windows applications to almost any device including Windows tablets, Macs, iPads and Android tablets via the cloud. It will work with 1,300 Windows apps, Microsoft said at the time, though the company has yet to announce its plans to offer Azure RemoteApp commercially.
"RemoteApp delivers an app, not a desktop, not an operating system," Jones said. "What the user gets is an app, because the user has been trained that's all they need." Jones also likes the idea of a library OS because it eliminates virtualization, he added. "We don't have to have an operating system and a virtual machine rolling it up. We just have these little API silos," he said. "Suddenly the idea of running applications in the cloud and delivering them becomes a lot clearer and a lot easier and you can get a lot better densities on your hosts."
The role of desktop administrators may not disappear overnight, but it would not be a bad idea for them to study and gain expertise on the datacenter side of things if this move comes into play. "Look at what's happening in your environment, and whether you think it's a good idea or not, try to predict what you think Microsoft will or might do and then try to make a couple of side investments to line yourself up so you can ride the bus instead of being hit by it," Jones advised.
Would you like to see Microsoft deliver a library OS? Share your views on the potential benefits or pitfalls of such a computing model.
Posted by Jeffrey Schwartz on 08/15/2014 at 12:01 PM
Growing use of the Microsoft Azure cloud service, rapidly expanding features and a price war are putting the squeeze on cloud market leader Amazon Web Services, according to Microsoft Technical Fellow in the Cloud and Enterprise Division Mark Russinovich.
In the opening keynote at the TechMentor conference at the Microsoft Conference Center on the company's Redmond campus, Russinovich showcased the edge Microsoft believes it has over its key rival. Russinovich argued that the company's squeeze contributed to Amazon's disappointing quarterly earnings report last month, which rattled investors. "Amazon Web Services is struggling in the face of pricing pressure by us and Google," Russinovich said. "People are starting to realize we're in it to win."
Indeed, market researchers last month pointed to Microsoft and IBM as the vendors gaining the most ground on Amazon in the public cloud. When TechMentor attendees were asked whether they had logged into the Microsoft Azure Portal, only about one-quarter of the IT pros in the audience said they had. Disclosure: TechMentor, like Redmond magazine, is produced by 1105 Media.
Pointing to the growth of Azure, Russinovich showcased some recently released figures. The service hosts 300,000 active Web sites and 1 million active database services. The Azure Storage service has just surpassed 30 trillion objects with 3 million active requests per second at peak times. Russinovich also said that 57 percent of the Fortune 500 companies are using Azure. And Azure Active Directory, which organizations can federate with their on-premises versions of Active Directory, now has 300 million accounts and 13 billion authentication requests per week.
Russinovich emphasized Microsoft's advantage with Azure Active Directory and the cloud service's emphasis on identity. "Azure Active Directory is the center of our universe," Russinovich said. "When you take a look at all the places where you need identity, whether it's a [third-party] SaaS service or whether Microsoft's own, like Office 365, you look at custom line-of-business apps that enterprises are developing and they want to integrate their own identity services."
Throughout his talk, Russinovich emphasized Microsoft's focus on its hybrid cloud delivery model and the security layers that can extend to the public Azure cloud. To that point, Azure Active Directory's role in supporting hybrid clouds is "it supports ways you can foist identities to the cloud so that they're accessible from all of these targets, line-of-business apps in the cloud and Microsoft's own cloud platforms."
The primary way to achieve that is to federate an Azure Active Directory tenant with an on-premises Active Directory service, he said. "What that means is when someone goes and authenticates, they get pointed at Azure Active Directory in the cloud. Their identities are already there and the authentication flow goes to an Active Directory federated server on-premises that verifies the password and produces the token." It uses OAuth 2.0 for authentication, he said.
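A drastically simplified sketch of that flow follows. It assumes nothing about the real AD FS wire formats; production deployments use OAuth 2.0 or SAML tokens rather than the toy signing scheme here. The point is the division of labor Russinovich describes: the on-premises federation server verifies the password and produces the token, while the cloud directory only validates the token and never sees the password.

```python
# Simplified federated-authentication sketch. The signing scheme is a
# stand-in for real token formats (OAuth 2.0 access tokens, SAML).
import hmac, hashlib, json, base64

SHARED_KEY = b"federation-trust-key"   # trust established out of band

def onprem_issue_token(user, password, password_db):
    """On-premises federation server: verify password, mint a signed token."""
    if password_db.get(user) != password:
        return None
    payload = json.dumps({"sub": user}).encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def cloud_validate_token(token):
    """Cloud directory: check the signature; the password never leaves home."""
    body, sig = token.split(".")
    payload = base64.b64decode(body)
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected):
        return json.loads(payload)["sub"]
    return None

token = onprem_issue_token("alice", "s3cret", {"alice": "s3cret"})
print(cloud_validate_token(token))
```

Even in this toy form, the design choice is visible: the cloud side holds only a trust relationship, not credentials, which is what makes the federated model attractive for hybrid deployments.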
Microsoft plans to integrate OAuth 2.0 into Windows Server and all Azure services, though Russinovich noted it supports other token services such as SAML and other identity services from the likes of Facebook, Google and others.
One area Microsoft needs to deliver on for Azure is role-based access control, which is in preview now, Russinovich said. "This will be fleshed out so we have a consistent authorization story all pointing to Azure Active Directory, which lets you not just connect on-premises to Active Directory through [Active Directory Federation Services], but these third-party consumer services, as well. That's a key point, you want identity to be enabled in the cloud," he said.
Russinovich also pointed to another area where he said Microsoft has taken a leadership role in the cloud: the company's refusal to give in to law enforcement agency efforts to subpoena user data, alluding to last week's order to turn over data held in a Dublin datacenter, which Microsoft said it will appeal. "Brad Smith, our head of legal, is probably now the face of the tech industry going out and fighting the United States government's efforts to get access to data that they probably won't get access to," he said. "We're really trying to make sure we're a great partner for customers that care about data privacy."
When it comes to the public cloud, Russinovich discussed how there's significant demand for cloud services such as storage and disaster recovery. Russinovich demonstrated a few offerings including the new Azure ExpressRoute, which through third-party ISPs and colocation operators, can provide high-speed dedicated private connections between two datacenters using Azure as a bridge or to connect directly to Azure. Other demos included point-to-site VPN and Web apps using Azure.
Russinovich also gave a nod to Azure's broad support for those who want to create Windows PowerShell scripts to add more automation. "Everything in Azure is PowerShell-enabled because Jeffrey Snover [its revered inventor and Microsoft distinguished engineer and lead architect] makes us enable everything from PowerShell," Russinovich quipped.
During the evening reception at TechMentor, I reconnected with some attendees who had told me the previous evening they wanted to learn more about Azure. Some saw opportunities, but others weren't sure how it fits into what they're doing. Nevertheless, most said they were surprised to learn how Azure has advanced. This is exactly what Russinovich aimed to do in his talk -- to make a case that Microsoft has more to offer than Amazon and other cloud providers.
To be sure, Amazon still has a growing cloud business and is seen as the leader for enterprise Infrastructure as a Service. As noted by investment researcher Zacks, Amazon reported a 90 percent usage growth for its AWS cloud business. While Amazon doesn't break out revenues and profits for AWS, it falls in the category of "Other," now accounting for 6 percent of revenues. Experts believe AWS generated the majority of revenues in that "Other" category.
"AWS continues to launch new services and enhance the security of its services," according to the Zacks report. "Amazon again reduced prices significantly, which was the reason for the revenue decline in the last quarter."
By all indications here at TechMentor and messaging from Microsoft at the recent Worldwide Partner Conference (and TechEd before that), Microsoft is not letting up on its Azure push. While many attendees saw the depth and progress Microsoft has made with Azure for the first time, Rome wasn't built in a day.
Posted by Jeffrey Schwartz on 08/13/2014 at 11:34 AM
Google wants Web sites to become more secure and said Wednesday it will do its part by motivating organizations to build stronger encryption for their sites. The company is giving a pretty significant incentive: it will reward those who do so by ranking them higher than sites lacking support for Transport Layer Security (TLS), the protocol underlying HTTPS encryption. Another way to look at it: Google will punish those who lack the extra encryption.
It's always troubling to hear reports that allege Google is playing with its search algorithm in a way that can unfairly benefit some to the detriment of others. Given its dominance in search, any action, real or perceived, places it under scrutiny and risk of regulators getting on the company's case.
Yet one could argue Google is now putting a stake in the ground in the interest of everyone who uses the Web. By pushing sites to implement stronger encryption via TLS, the company is using its clout to make the Web a safer place. This could have major consequences for many businesses that live and die by how well they appear in Google search results. That's especially the case for those who expend efforts on search engine optimization, or SEO. But Google is doing so by trying to force those with insecure sites to step up and implement TLS. While not a panacea, it's a step up.
Google has talked up "HTTPS by default" for years, meaning Search, Gmail and Google Drive automatically use secure connections to Google sites. At its recent Google I/O developer conference, the company introduced its HTTPS Everywhere push. Webmaster trends analysts Zineb Ait Bahajji and Gary Illyes explained in a post Wednesday how the company plans to rank sites based on their HTTPS/TLS support.
"Over the past few months we've been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms," they wrote. "We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal -- affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content -- while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we'd like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the Web."
In the coming weeks Google said it will publish detailed best practices on how to make it easier to implement TLS at its help center. In the meantime, Google offered the following tips:
- Decide the kind of certificate you need: single, multi-domain or wildcard certificate.
- Use 2048-bit key certificates.
- Use relative URLs for resources that reside on the same secure domain.
- Use protocol relative URLs for all other domains.
- Check out our Site move article for more guidelines on how to change your Web site's address.
- Don't block your HTTPS site from crawling using robots.txt.
- Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
Google also recommends that those with sites already serving HTTPS test their security levels and configuration using the Qualys SSL Server Test tool.
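Two of those tips, relative URLs for resources on the same secure domain and protocol-relative URLs for other domains, can be sketched as a small helper that a site might run over its templates. The function below is illustrative only, not a Google-provided tool:

```python
# Sketch of rewriting resource URLs per Google's HTTPS migration tips:
# same-domain resources become relative URLs (they survive an HTTP->HTTPS
# move unchanged); other domains become protocol-relative so they inherit
# the page's scheme.
from urllib.parse import urlparse

def rewrite_resource_url(url, site_domain):
    parts = urlparse(url)
    if parts.netloc == site_domain:
        # Same secure domain: drop scheme and host entirely.
        return parts.path + (("?" + parts.query) if parts.query else "")
    # Other domains: keep the host but let the page pick the protocol.
    return "//" + parts.netloc + parts.path

print(rewrite_resource_url("http://example.com/js/app.js", "example.com"))
print(rewrite_resource_url("http://cdn.other.net/lib.js", "example.com"))
```

Note that protocol-relative references only help if the third-party host actually serves the resource over HTTPS as well, which is worth verifying before migrating.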
What's your take on Google's effort to force the hand of organizations to make their sites more secure? Is it a heavy-handed and unfair move taking advantage of its search dominance, or an altruistic use of its clout that could make the Web safer for everyone?
Posted by Jeffrey Schwartz on 08/08/2014 at 12:34 PM
Microsoft has extended its relationship with the NFL in a new deal for its Surface Pro 3 tablet-PCs to be used by coaches on the sidelines during games. Viewers of NFL games this season will prominently see team personnel and coaches using Surface Pro 3 devices to review images of plays on screen rather than on printed photos.
The NFL struck a deal last year with Microsoft for the Surface to be the league's official tablet. Now a new $400 million deal calls for Surface Pro 3s to be used on the sidelines during games. The arrangement calls for coaches to use the Surface Pro 3s for five years, according to a Wall Street Journal blog post.
Microsoft said the Surface devices used by teams are equipped with the Sideline Viewing System (SVS). The app, developed by Microsoft, picks up the same digital feed previously sent to the printers. By replacing the printouts with digital images sent to the tablets, coaches will get them faster (four to five seconds versus 30 seconds) and have the ability to use the high resolution touch display to zoom in on a portion of the image and make annotations, Microsoft said.
While coaches will be able to receive images, the tablets won't have access to the public Internet and team personnel won't be permitted to use the devices to take photos or videos during games. It remains to be seen whether teams in future seasons will use the Surface Pro 3s for making play-by-play decisions.
Perhaps more telling will be whether the promotional value of this deal will boost sales of the Surface line of tablets. Certainly with the vast viewership of NFL games it's a nice boost. But as Computerworld reported Monday, Microsoft's losses on Surface have swelled to $1.7 billion since their 2012 launch, according to a July 22 8-K statement. The publication also calculated that Microsoft saw a loss of $363 million for the last fiscal quarter. That makes it the largest quarterly loss for the tablets since the company began reporting revenues for them. Hence, Microsoft needs to score many more deals like this to keep the Surface in the game.
What's your take on this deal? Did Microsoft score big or was this just a Hail Mary pass?
Posted by Jeffrey Schwartz on 08/06/2014 at 11:00 AM
In a setback for U.S. cloud providers looking to ensure the privacy of data stored in foreign countries, a search warrant ordering Microsoft to turn over e-mail stored in its Dublin, Ireland datacenter was upheld. Judge Loretta Preska of the U.S. District Court for the Southern District of New York upheld an earlier ruling by a magistrate judge approving the warrant. The identity and locale of the suspect, who is wanted in an illegal drug-related matter, are not known. Microsoft has said it will appeal last week's ruling.
Several tech companies, including Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, filed briefs in support of Microsoft's appeal of the warrant, which was argued during a two-hour hearing held Tuesday, according to published reports. The outcome of this case could set a precedent for all U.S.-based cloud providers storing data abroad.
"Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private," wrote Microsoft Chief Counsel Brad Smith in April. "We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed."
In an OpEd piece published in The Wall Street Journal last week prior to the hearing, Smith furthered that view. "Microsoft believes you own e-mails stored in the cloud, and they have the same privacy protection as paper letters sent by mail," he said. "This means, in our view, that the U.S. government can obtain e-mails only subject to the full legal protections of the Constitution's Fourth Amendment. It means, in this case, that the U.S. Government must have a warrant. But under well-established case law, a search warrant cannot reach beyond U.S. shores."
In upholding the warrant, Judge Preska argued that's not the point. "It is a question of control, not a question of the location of that information," Preska said, according to a report by The Guardian. Hanni Fakhoury, staff attorney for the Electronic Frontier Foundation, told Computerworld he wasn't surprised by the ruling, saying it ultimately will be decided by the Second Circuit Court of Appeals. "I hope the Second Circuit looks closely at the magistrate's reasoning and realizes that its decision radically rewrote the Stored Communications Act when it interpreted 'warrant' to not capture all of the limitations inherent in a warrant, including extraterritoriality," he said.
The outcome of this case will have significant ramifications on the privacy of data stored in foreign datacenters. But that's not all Microsoft and its supporters have at stake. Should the warrant ultimately be upheld, The Guardian noted, U.S. companies are concerned they could lose billions of dollars in revenues to foreign competitors not subject to seizure by U.S. law enforcement agencies.
Posted by Jeffrey Schwartz on 08/04/2014 at 12:52 PM
Microsoft has substantial plans for its flagship Office suite, as noted by Mary Jo Foley in her August Redmond magazine column. But for now its version for the iPad is getting all the love. Just four months after the long-awaited release of Office for iPad, Microsoft has upgraded it with some noteworthy new features. The 1.1 versions of Word, Excel and PowerPoint are now available in Apple's iTunes App Store. Microsoft also updated OneNote for the iPad with its version 2.3 release.
While Microsoft added new features specific to the three key components of Office, the most noteworthy addition across the board is the ability for users to save files in the Adobe PDF format (even for those who don't have Office 365 subscriptions). That was one of the top three feature requests, according to a post in Microsoft's Office Blog announcing the upgrade. Second is the ability to edit photos on the iPad, which users can now do in Word. Though limited, the photo editing feature enables cropping and resetting. The third key feature is support for third-party fonts.
In Excel, Microsoft has added a new sorting capability that lets users filter, extend, collapse, display details and refresh PivotTables in an Excel workbook, as well as change the way they're displayed. Another new feature, called "flick gesture," aims to simplify working with workbooks, including selecting large ranges of data. "Simply grab the selection handle, flick it in any direction and Excel will automatically select from where you started to the next blank cell," read the blog post, adding that if "you're at the top of a column of data and want to select all the way to the bottom, just flick down and the column is selected automatically."
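The flick behavior described above boils down to a simple rule: extend the selection from the starting cell until the next blank cell is reached. Here's a minimal Python sketch of that rule, modeling a column as a plain list; this is only an illustration of the described behavior, not Excel's actual implementation:

```python
def flick_select(column, start):
    """Return the (start, end) row range selected by flicking down
    from `start`: extend until the cell before the next blank cell."""
    end = start
    # Walk down until a blank cell (or the end of the column) stops us.
    while end + 1 < len(column) and column[end + 1] not in ("", None):
        end += 1
    return (start, end)

# A column with data in rows 0-3, a blank in row 4, more data below.
column = ["Q1", "Q2", "Q3", "Q4", "", "Notes"]
print(flick_select(column, 0))  # → (0, 3): selection stops before the blank
```

Flicking from row 5 in the same column would select only row 5, since the end of the column stops the walk immediately.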
The new PowerPoint app supports the use of multimedia capabilities on the iPad, specifically audio and video embedded into slides. It also lets users insert videos and photos from their iPad camera rolls. When projecting a presentation from the iPad, users can now enable Presenter View to see their own notes on the device.
Microsoft pointed out this is part of its new effort to offer continuous updates to Office for iPad. When you download the update, don't do it when you're in a rush -- give yourself about 15 to 30 minutes, depending on your network. I found Word and Excel in the Update section, though I had to search the store for the PowerPoint version.
Share your thoughts on the new update.
Posted by Jeffrey Schwartz on 08/01/2014 at 11:35 AM
If you've been holding off on buying Microsoft's Surface Pro 3, awaiting either a less expensive version or the higher-end model with an Intel Core i7 processor, they're now available. Microsoft announced the release of the new Surface Pro 3 models today.
The first units started shipping in June and featured mid-range models with i5 processors, priced between $999 and $1,299 (not including the $130 removable keyboard). Now the other units are available, as previously indicated. The i3-based unit is priced at $799, while the i7 models, targeted at those with high-performance requirements such as users working in Adobe Photoshop or computer-aided design applications, will run you quite a bit more. A Surface Pro 3 with a 256GB solid-state drive and 8GB of RAM costs $1,559. If you want to go whole-hog, one with a 512GB SSD will set you back $1,949.
Microsoft also said the $200 docking station for the Surface Pro 3 is on pace to ship in two weeks. The dock will have five USB ports (three USB 3.0 and two USB 2.0), an Ethernet port and a Mini DisplayPort that supports resolutions up to 3840 x 2600.
If you missed it, here's my first take on the mid-range Surface Pro 3. Have you looked at the Surface Pro 3? What's your take? Is it a device you're considering or are you sticking with a device from one of Microsoft's OEM partners?
Posted by Jeffrey Schwartz on 08/01/2014 at 11:39 AM
In its latest apparent effort to rein in large U.S. tech companies expanding their presence in China, government investigators in China this week raided Microsoft offices throughout the country, according to several reports.
The raids by China's State Administration for Industry and Commerce included offices in Beijing, Shanghai, Guangzhou and Chengdu, according to a report Monday in The New York Times, citing a spokeswoman who declined to elaborate due to the sensitivity of the issue. Another Microsoft spokesman told The Wall Street Journal that the company's business practices are designed to comply with Chinese law.
The raids, in which authorities accused Microsoft of monopolistic practices and other, less clearly specified violations, are the latest salvo in tensions between the two countries, which have recently escalated amid spying, malware and hacking allegations. The move could also be retaliation by the Chinese government for U.S. indictments in May charging five Chinese Army officers with cyber attacks, The New York Times added.
The raids follow a visit to China last week by Qualcomm CEO Steven Mollenkopf, according to the report, which said he held talks with government officials and announced a $150 million "strategic venture fund" to invest in Chinese technology start-up companies, though it's unclear whether the visit sparked the escalation against Microsoft.
Microsoft, which, like many U.S. corporations, has identified China as one of its largest growth markets, is not the only company in the country's crosshairs these days. Government officials have also started to scrutinize Chinese banks' reliance on IBM mainframes and servers, though Big Blue has agreed to sell its x86 System x server group to Lenovo for $2.3 billion. Apple, Cisco and Google have also faced heightened scrutiny, according to reports.
Approximately 100 investigators raided Microsoft's offices in China, according to The Wall Street Journal, which reported Tuesday that China's State Administration for Industry and Commerce vaguely accused Microsoft of not disclosing certain security features and how the company integrates its various products. China's government, through state-controlled news outlet CCTV, has raised concerns about the security of Windows 8 and also has deemed Apple's iPhone as a danger to the country's national security.
Disclosures by Edward Snowden of the surveillance efforts have also escalated concerns by China's government, several reports noted. Yet some wonder if the moves are more about protectionism of companies based in China than concerns about security.
Posted by Jeffrey Schwartz on 07/30/2014 at 12:37 PM
Hedge fund holding company Elliott Management has taken a $1 billion stake in storage giant EMC, according to a report in The Wall Street Journal last week. The activist investor, with an estimated $25 billion under management, is reportedly looking for the company to spin off VMware and Pivotal.
EMC has long opposed spinning off VMware, a move that has often been floated, most recently in late May when two Wells Fargo analysts advocated it. Elliott is among those impatient with EMC's stock performance, which has risen only 163% over the past decade, compared with the Dow Jones U.S. Computer Hardware Index's 309% gain.
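For context, those cumulative decade-long gains translate into roughly a 10 percent annualized return for EMC versus about 15 percent for the index, as a standard compound-growth calculation shows (a quick Python sketch using the figures from the comparison above):

```python
def annualized_return(cumulative_gain_pct, years):
    """Convert a cumulative percentage gain into an annualized rate."""
    growth = 1 + cumulative_gain_pct / 100  # e.g. a 163% gain means 2.63x
    return (growth ** (1 / years) - 1) * 100

print(round(annualized_return(163, 10), 1))  # → 10.2 (EMC, percent per year)
print(round(annualized_return(309, 10), 1))  # → 15.1 (index, percent per year)
```

A roughly five-point annual gap, compounded over ten years, is the kind of underperformance that draws activist investors.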
There's no reason to believe EMC CEO Joe Tucci, whose retirement is planned for next year, would now welcome such a move, according to the report. Having VMware and Pivotal operate as separate companies in which EMC holds majority stakes lets EMC pursue its "federation strategy," under which EMC, VMware, Pivotal and RSA can work together or individually with fewer competitive conflicts. An EMC spokesman told The Journal the company always welcomes discussions with shareholders but had no further comment. Elliott Management did not respond to an inquiry.
Freeing EMC of those companies could make it a potential takeover target for the likes of Cisco, Hewlett-Packard or Oracle, the report noted, citing a 31 percent decline in EMC's share of the external storage market. EMC's stake in VMware is approximately 80 percent.
Elliott Management has taken stakes in many companies, including BMC, Juniper and EMC rival NetApp. It also made a $2 billion bid to acquire Novell in 2010, which Novell rejected; Novell was later acquired by Attachmate.
Posted by Jeffrey Schwartz on 07/28/2014 at 12:29 PM
The release of the Amazon Fire Phone this week adds another wildcard to the smartphone race largely dominated by Apple's iPhone and devices based on Google's Android OS. With Windows Phone in a distant but solid third place in market share, it remains to be seen if it's too late for anyone to make a serious dent in the market at this point.
Microsoft must believe the new Amazon Fire Phone has a chance of gaining at least some share, and as a result is offering its Skype and OneNote apps in the Amazon Appstore for Android right out of the gate. Even if the Windows Phone team doesn't see the Amazon Fire Phone as a threat, Microsoft also seems to realize that the new phone's success or failure won't rest on apps such as Skype and OneNote. And if the Amazon Fire Phone should prove a wild success, Microsoft, which already has acknowledged that we're no longer in an all-Windows world, likely doesn't want its apps and services to be supplanted by others. This kind of cross-pollination is not without precedent: Amazon has offered a Kindle app for Windows since Kindle for PC launched back in 2009, and Amazon also offers Windows Server and SQL Server instances in its Amazon Web Services public cloud.
If any new entrant can gain share in the mobile phone market, it's Amazon. Even Facebook, when it was reportedly floating the idea of offering its own phone, appears to have realized that a seemingly single-purpose device was a risky bet. Amazon has more to gain -- its phones are inherently designed to let people buy goods from its retail store. The Fire Phone even lets consumers capture an image of an item in a store and see if they can get it cheaper on Amazon, which, more often than not, they can. And all new Fire Phone buyers get a full year of Amazon Prime with the phone, making it that much easier to purchase goods from the device.
Amazon already has experience in the device market with its Kindle Fire tablets, which hold respectable but not dominant share, and it recently launched its own TV set-top box. The Amazon Fire Phone, like the Kindle Fire tablets, runs on a forked version of Android that doesn't support apps from the Google Play store. Amazon's new phone introduces some interesting new features, including the ability to flip pages in a calendar and other apps by moving your hand rather than touching the screen. It comes standard with 32GB of storage and has the ability to recognize products.
Critics argue that while some of these "whiz bang" features are nice, and appealing extras like unlimited photo storage in the cloud are nice touches, the phone has a relatively basic design and form factor. While I haven't seen the phone myself, it appears it will appeal most to those who use the Amazon Prime service extensively. As an Amazon Prime subscriber myself, I'll settle for using whatever apps Amazon offers for iOS or Windows Phone when selecting my next phone.
Amazon has shown a willingness to break even or even sell products at a loss if it furthers its overall business goals. On the other hand, after yesterday's disappointing earnings report, in which its net loss of 27 cents per share was substantially higher than the 16-cent loss analysts expected, it remains to be seen how long Amazon can sustain losses, especially in new segments. Investor pressure aside, CEO Jeff Bezos shows no sign of backing off on his company's investments in distribution centers for its retail operation, datacenter capacity for the Amazon Web Services public cloud and now its entry into the low-margin phone business.
What's your take on the Amazon Fire Phone? Will it be a hot commodity or will it vindicate Facebook for choosing not to enter the market?
Posted by Jeffrey Schwartz on 07/25/2014 at 10:18 AM
Veeam this week released a new management tool to provide visibility into Hyper-V as well as VMware environments. While previous versions of the Veeam Management Pack supported only VMware, the new v7 release provides common visibility into and management of Hyper-V, Microsoft's hypervisor. Administrators can use the Veeam Management Pack from within System Center Operations Manager.
Veeam had demonstrated a near-final release, among other System Center Operations Manager management packs, at the TechEd conference in Houston back in May. The new Veeam Management Pack version 7 for System Center offers a common dashboard within System Center Operations Manager that provides monitoring, capacity planning and reporting for organizations using Veeam Backup & Replication.
With the new management pack, System Center Operations Manager administrators can manage both their vSphere and Hyper-V environments together with complete visibility to physical and virtual components and their dependencies. In addition to offering deeper visibility into both hypervisors within a given infrastructure, the new Veeam Management Pack provides contextual views using color-coded heat maps for viewing various metrics and it provides real-time data feeds.
It also lets administrators manage the Veeam Backup & Replication for Hyper-V platform to determine if and when a host or virtual machine (VM) is at risk of running out of storage capacity, Doug Hazelman, the company's vice president of product strategy, said during a meeting at TechEd. "We provide views on networking, storage, heat maps -- the smart analysis monitors, as we call them," Hazelman said. "This is something you don't see in general in System Center."
If memory pressure is too high on a specific VM, the Veeam Management Pack can analyze the environment -- host metrics, the VM's properties, whether the VM is configured with too little memory or whether the host has exhausted its resources -- and provide a dynamic recommendation. While administrators typically turn to the Windows Task Manager to gauge utilization of CPU, memory and other common resources on a physical server, Hazelman pointed out that the common utility isn't designed to do so for VMs. The Veeam Task Manager addresses that.
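The memory-pressure diagnosis Hazelman describes can be pictured as a simple rules check: compare a VM's memory demand against what it has been assigned and what its host has left. The Python sketch below is a hypothetical illustration of that kind of logic, not Veeam's actual algorithm; all metric names and thresholds are invented for the example:

```python
def diagnose_memory_pressure(vm_assigned_mb, vm_demand_mb, host_free_mb):
    """Return a recommendation for a VM that may be under memory pressure.

    vm_assigned_mb -- memory configured for the VM
    vm_demand_mb   -- memory the VM's workload is actually asking for
    host_free_mb   -- memory still unallocated on the host
    """
    if vm_demand_mb <= vm_assigned_mb:
        return "no action: demand fits within assigned memory"
    shortfall = vm_demand_mb - vm_assigned_mb
    if host_free_mb >= shortfall:
        # The VM is simply configured too small; the host can cover it.
        return f"increase VM memory by {shortfall} MB; host has capacity"
    # The host itself is out of headroom.
    return "host resources exhausted: migrate the VM or add host memory"

print(diagnose_memory_pressure(4096, 6144, 8192))
# → increase VM memory by 2048 MB; host has capacity
```

The value of a management pack is automating exactly this kind of cross-referencing of VM and host metrics, which Task Manager inside the guest cannot see.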
The new release will be available with two licensing options. The Enterprise Plus Edition provides complete real-time forecasting, monitoring and management of the entire infrastructure including clusters and the complete virtual environment. It's available as a free update to existing Veeam Management Pack 6.5 customers.
A new lower-end Enterprise Edition is a scaled-down release that provides management and monitoring but not the full level of reporting of the Enterprise Plus version. The company is offering 100 cores of the new Enterprise Edition free of charge, including maintenance for one year. The offer is available through the end of this year.
Posted by Jeffrey Schwartz on 07/25/2014 at 3:20 PM
Longtime Redmond magazine columnists Don Jones and Greg Shields have joined online IT training firm Pluralsight, where they will provide courses for IT administrators.
The two will continue to write their respective Decision Maker and Windows Insider columns for Redmond magazine and contribute other content to the publication's Web site. They will also continue to present at the TechMentor and Live! 360 conferences, which, like Redmond magazine, are produced by 1105 Media.
Pluralsight announced the hiring of Jones and Shields on Wednesday. "As thought leaders, Don and Greg are the cream of the crop in their field, bringing the kind of experience and expertise that will add immense richness to Pluralsight's IT offering," said Pluralsight CEO Aaron Skonnard in a statement. Both Jones and Shields are Microsoft MVPs and VMware vExperts, and Shields is also a Citrix Technology Professional (CTP).
The move means they are leaving the boutique consulting firm they created, Concentrated Technology. Jason Helmick will continue to provide PowerShell training for Concentrated. Helmick, Jones and Shields will be presenting at next month's TechMentor conference, to be held on the Microsoft campus in Redmond, Wash.
Jones' and Shields' Redmond columns cover the gamut of issues that relate to Windows IT professionals. Some columns offer hands-on tips, like a recent one on how to troubleshoot with Microsoft's RDS Quality Indicator and another on what's new in Group Policy settings. Other columns have led to heated debates, such as last summer's "14 Reasons to Fire Your IT Staff." Jones' recent Decision Maker columns have explained how organizations can create a culture of security.
Posted by Jeffrey Schwartz on 07/23/2014 at 2:11 PM
Despite seeing its profits shrink thanks to its acquisition of Nokia, Microsoft on Tuesday reported a nice uptick in its core business lines -- notably its datacenter offerings -- and strong growth for its cloud services including Office 365 and Azure.
CEO Satya Nadella appeared in his second quarterly call with analysts to discuss Microsoft's fourth quarter earnings for fiscal year 2014. The company exceeded its forecasts for Office 365 subscription growth and saw double-digit gains across its enterprise server lines.
One of the key questions among analysts is what the future holds for Windows and Microsoft's struggling phone business, a challenge now compounded by the Nokia acquisition. Nadella underscored that bringing a common Windows core across all device types, including phones, tablets, PCs, Xbox and embedded systems, will strengthen Microsoft's push into mobility, as well as the cloud. This is the notion of what Microsoft described to partners last week as the next wave of Windows, which will come in different SKUs but will be built on a common platform -- what Nadella described as "one" Windows that supports "universal" apps.
"The reality is we actually did not have one Windows," Nadella said on Tuesday's call. "We had multiple Windows operating systems inside of Microsoft. We had one for phone, one for tablets and PCs, one for Xbox, one for even embedded. Now we have one team with a layered architecture that enables us to, in fact, for developers, bring [those] collective opportunities with one store, one commerce system, one discoverability mechanism. It also allows us to scale the UI across all screen sizes. It allows us to create this notion of universal Windows apps."
Responding to an analyst question about what it will take to incent developers to build not just for Apple's iOS and Google's Android but also for Windows Phone and Windows-based tablets, Nadella said he believes this concept of "dual use" -- in which people use their devices for work and their personal lives -- will make it attractive for reluctant developers.
Now that Microsoft has brought all of the disparate Windows engineering teams together into one organization, when the next version of Windows comes out next year, Nadella said it will allow customers to use even their core desktop apps on any device. He's betting that application portability will make it easier and economical for developers to build more apps for Windows.
"The fact that even an app that runs with a mouse and desktop can be in the store and have the same app in a touch-first, in a mobile-first way, gives developers the entire volume of Windows, which you see on a plethora of units as opposed to just our 4 percent share of mobile in the U.S. or 10 percent in some countries," Nadella said. "That is the reason why we are actually making sure that universal Windows apps are available and developers are taking advantage of it. We have great tooling. That's the way we are going to be able to create the broadest opportunity to your very point about developers getting an ROI for building for Windows."
Yet reading between the lines, the fact that Nadella two weeks ago called Microsoft a "productivity and platforms" company rather than using the previous "devices and services" descriptor suggests that the emphasis is on Windows as a common platform tied to Office and OneDrive. Microsoft's goal is to let users do their work more easily, while making it easy for them to use their devices for personal activities without the two crossing paths. And the most likely way to succeed is to ensure that developers who have always built for the Windows platform continue to do so.
Posted by Jeffrey Schwartz on 07/23/2014 at 11:26 AM
After a protracted decline in PC sales, Intel last week said that enterprises of all sizes are refreshing their portable and desktop computers. In its second quarter earnings report, Intel said PC shipments rose for the third consecutive quarter. While the company acknowledged that the end of life of Windows XP has helped fuel the revival, the company appears optimistic the trend will continue.
Though company officials didn't give specific guidance for future quarters, the company is optimistic that the pending delivery of new systems based on its 14nm Broadwell processor will propel demand in coming quarters. The new, smaller CPU is expected to enable systems that are lighter and offer better battery life.
Intel said its PC group's revenues of $8.7 billion represented a 6 percent increase over the same period last year and a modest sequential gain over the prior quarter. The second quarter Intel reported last week covers the period when Microsoft officially stopped releasing regular patches for its Windows XP operating system.
"The installed base of PCs that are at least four years old is now roughly 600 million units and we are seeing clear signs of a refresh in the enterprise in small and medium businesses," said Intel CEO Brian Krzanich, during the company's earnings call. "While there are some signs of renewed consumer interest and activity, the consumer segment remains challenging, primarily in the emerging markets."
Krzanich was particularly optimistic about the arrival of the newest ultramobile systems based on the 14nm Llama Mountain reference design, which he said will result in fanless, detachable two-in-one systems that are 7.2 mm thick and weigh 24 ounces. OEMs demonstrated some of these new systems at the recent Computex show in Taipei, and Microsoft showcased many new Windows PCs in the pipeline at last week's Worldwide Partner Conference in Washington, D.C.
Posted by Jeffrey Schwartz on 07/21/2014 at 1:41 PM
Microsoft moved quickly after last week's acquisition of InMage Systems to say that the InMage Scout software appliances for Windows and Linux physical and virtual instances will be included in its Azure Site Recovery subscription licenses.
Azure Site Recovery, a service announced at Microsoft's TechEd conference in Houston in May, is the rebranded Hyper-V Recovery Manager. Azure Site Recovery, unlike its predecessor, allows customers to use the Microsoft Azure public cloud as a backup target rather than requiring a second datacenter. InMage Scout is an on-premises appliance that continuously captures data changes in real time as they occur, then simultaneously performs local backups or remote replication via a single data stream.
With the addition of InMage Scout subscription licenses to the Azure Site Recovery service, Microsoft said customers will be able to purchase annual protection for instances with the InMage Scout offering. Microsoft is licensing Azure Site Recovery on a per virtual or physical instance basis. The service will be available for customers with Enterprise Agreements on Aug. 1. For now, that's the only way Microsoft is letting customers purchase InMage Scout, though it's not required for Azure Site Recovery. The InMage Scout software is available for use on a trial basis through Aug. 1 via the Azure portal.
The company also quietly removed the InMage-4000 appliance, a converged system with compute, storage and network interfaces, from the portfolio. The InMage-4000 was available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity, and supported 10GigE storage networking and built-in GigE connectivity. Though it was on the InMage Web site last Friday, by Sunday it was removed.
A Microsoft spokeswoman confirmed the company is no longer offering the turnkey appliance but hasn't ruled out offering a similar type of system in the future.
Posted by Jeffrey Schwartz on 07/18/2014 at 11:55 AM
Microsoft this morning announced what was widely rumored: it will kick off the largest round of layoffs in the company's history. The company will reduce its workforce by 18,000 employees -- a much deeper cut than analysts had anticipated. More than two-thirds of the cuts -- 12,500 -- will affect workers in its newly acquired Nokia unit, with the rest impacting other parts of the company. The layoffs are aimed at creating a flatter and more responsive organization.
CEO Satya Nadella announced the job cuts just one week after indicating that major changes were in the works. Just yesterday, in his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C., Nadella reiterated that the company must change its culture and be easier to do business with. Since Microsoft announced its intent to acquire Nokia last year for $7.2 billion, critics have been concerned the deal would be a drag on the company's earnings. Nadella clearly signaled he is moving to minimize its impact.
The larger-than-expected reductions at Nokia are the result of a plan to integrate the company's operations into Microsoft, Nadella said in an e-mail to employees announcing the job cuts, 13,000 of which will take place over the next six months. "We will realize the synergies to which we committed when we announced the acquisition last September," Nadella said. "The first-party phone portfolio will align to Microsoft's strategic direction. To win in the higher price tiers, we will focus on breakthrough innovation that expresses and enlivens Microsoft's digital work and digital life experiences. In addition, we plan to shift select Nokia X product designs to become Lumia products running Windows. This builds on our success in the affordable smartphone space and aligns with our focus on Windows Universal Apps."
Nadella also said that Microsoft is looking to simplify the way employees work by creating a more agile structure that can move faster than it has and make workers more accountable. "As part of modernizing our engineering processes the expectations we have from each of our disciplines will change," he said, noting there will be fewer layers of management to accelerate decision making.
"This includes flattening organizations and increasing the span of control of people managers," he added. "In addition, our business processes and support models will be more lean and efficient with greater trust between teams. The overall result of these changes will be more productive, impactful teams across Microsoft. These changes will affect both the Microsoft workforce and our vendor staff. Each organization is starting at different points and moving at different paces."
The layoffs don't mean Microsoft won't continue to hire in areas where the company needs further investment. Nadella said he would share more specific information on the technology investments Microsoft will make during its earnings call scheduled for July 22.
Posted by Jeffrey Schwartz on 07/17/2014 at 7:14 AM
In what Apple and IBM describe as a "landmark" partnership, the two companies have forged a deal to bring 100 industry-specific, enterprise-grade iOS apps to market and to provide cloud services such as security, analytics and mobile integration for iPads and iPhones. The pact also calls for the two companies to offer AppleCare support for enterprises, and for IBM to offer activation, supply and management of devices.
This broad partnership is significant for both companies: it will help IBM advance its cloud and mobility management ambitions, and Apple will gain its largest foothold to date in the enterprise. It bears noting that Apple rarely forms such partnerships, preferring to go it alone. At the same time, the buzz generated by the partnership, though noteworthy, may overstate the impact it will have. The harm it will do to Android and Windows also appears marginal.
To date, Apple has benefited from the BYOD movement of the past few years, which is predominantly why so many iPads and Android-based tablets and smartphones are used by employees. While there's no shortage of enterprise mobile device management platforms to administer the proliferation of user-owned devices, Apple is hoping IBM's foothold in the enterprise, its strong bench of developer tools and its growing cloud infrastructure will lead to more native apps and software-as-a-service offerings.
"This alliance with Apple will build on our momentum in bringing these innovations to our clients globally," said Ginni Rometty, IBM chairman, president and CEO, in a statement. "For the first time ever we're putting IBM's renowned big data analytics at iOS users' fingertips, which opens up a large market opportunity for Apple," added Apple CEO Tim Cook. "This is a radical step for enterprise and something that only Apple and IBM can deliver."
While the pact certainly will give IBM more credibility with its customers, its benefit to Apple appears marginal, which is why the company's stock barely budged on the news last night. "We do not expect the partnership to have a measurable impact on the model given that Apple has already achieved 98 percent iOS penetration with Fortune 500 companies and 92 percent penetration with Global 500 companies," said Piper Jaffray analyst and noted Apple bull Gene Munster in a research note. "While we believe that the partnership could strengthen these existing relationships, we believe continued success with the consumer is the most important factor to Apple's model."
The Apple-IBM partnership certainly won't help Microsoft's efforts to keep its Windows foothold intact, which is already under siege. On the other hand, it's a larger threat to Android than to Windows. The obvious reason is that Android has more to lose with a much larger installed base of user-owned tablets. Even if the number of combined tablets and PCs running Windows drops to 30 percent by 2017, as Forrester Research is forecasting, enterprises still plan to use Windows for business functions because of its ability to join Active Directory domains and its ties to Windows Server, SharePoint, Office and the cloud (including OneDrive and Azure).
"It makes it more challenging for Windows Phone to gain ground in the enterprise, because IBM bolster's Apple's hardware in the enterprise, for both sales/support and enterprise apps," said Forrester analyst Frank Gillett. "And that indirectly makes it harder for Windows PCs to stay strong also, but that's incremental."
Pund-IT analyst Charles King sees this deal having a grimmer effect on Microsoft. "Microsoft is in the most dangerous position since the company is clearly focusing its nascent mobile efforts on the same organizations and users as IBM and Apple," he said in a research note. The partnership was announced at an inopportune time for Microsoft -- the company is rallying its partners around Windows, among other things, at its Worldwide Partner Conference in Washington, D.C., where it has talked up its commitment to advancing Windows into a common platform for devices of all sizes, from phones to large-screen TVs. "The goal for us is to have them take our digital work-life experiences and have them shine," Microsoft CEO Satya Nadella said in his keynote address at WPC today.
While Apple and IBM described the partnership as exclusive, terms were not disclosed. Therefore it's not clear what exclusive means. Does that mean Apple can't work with other IT players? Can IBM work with Google and/or Microsoft in a similar way? At its Pulse conference in Las Vegas back in February, IBM said it would offer device management for Windows Phone devices through its recently acquired MaaS360 mobile device management platform.
While Apple may have broken new ground with its IBM partnership, Microsoft has also made a number of arrangements with providers of enterprise and vertical applications to advance the modern Windows platform. Among them are Citrix, Epic, SAP, Autodesk and Salesforce.com (with Salesforce One becoming available for Windows apps this fall).
Munster predicted that even if half the Fortune 500 were to buy 2,000 iPhones and 1,000 iPads each beyond what they had already planned to purchase as a result of this deal, it would translate to half of one percent of Apple's revenue in calendar 2015. In addition, he believes IBM will offer similar solutions for Android. Even if Munster is underestimating the impact of this deal on Apple, there's little reason to believe the pact will move the needle significantly, if at all, for Windows. The fate of Windows is in Microsoft's hands.
Posted by Jeffrey Schwartz on 07/16/2014 at 12:35 PM
It's no secret that big changes are coming to Microsoft. CEO Satya Nadella made that clear in his 3,100-word memo to employees late last week. The key takeaways of that message were that Microsoft is now a platforms and productivity company and it intends to become leaner in a way that it can bring products to market faster.
While the new platforms-and-productivity mantra doesn't mean Microsoft is doing away with its old devices-and-services model, Nadella is trying to refocus attention on what Microsoft is all about, and he's sticking with a strong cloud and mobile emphasis.
The latter sounds like Nadella wants to accelerate the One Microsoft strategy introduced last year and, reading between the lines, break down the company's silos and fiefdoms. In his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C. today, Nadella said the company intends to change its culture.
"We change the core of who we are in terms of our organization and how we work and our value to our customers," Nadella said. "That's the hardest part really. The technology stuff is the simpler thing. We all know that but we need to move forward with the boldness that we can change our culture. It's not even this onetime change, it's this process of continuous renewal that [will] succeed with our customers."
Nadella's push comes at a time of unease in the Microsoft ranks. Rumors persist that Microsoft is planning layoffs. The move wouldn't be unexpected, given that Microsoft inherited 25,000 new employees from its $7.2 billion acquisition of Nokia. According to a Reuters report, Microsoft is planning to lay off 1,000 of those employees based in Finland. The layoffs are expected to be the most extensive in Microsoft's history, according to a Bloomberg report.
So far Nadella has not indicated any planned cutbacks, but at the same time he appears well aware that Microsoft needs to rid itself of those who aren't on board with the company's new way of doing business.
Nadella said Microsoft needs to "enable the employees to bring their A game, do their best work, find deeper meaning in what they do. And that's the journey ahead for us. It's a continuous journey and not an episodic journey. The right way to think about it is showing that courage in the face of opportunity."
Posted by Jeffrey Schwartz on 07/16/2014 at 12:40 PM
In his annual address to partners, Microsoft COO Kevin Turner said the company will not provide any government access to customer data. Microsoft will fight any requests by a government to turn over data, Turner told 16,000 attendees at the company's annual Worldwide Partner Conference, which kicked off today in Washington, D.C.
"We will not provide any government with direct unfettered access to customers' data. In fact we will take them to court if necessary," said Turner. "We will not provide any government with encryption keys or assist their efforts to break our encryption. We will not engineer backdoors in the products. We have never provided a business government data in response to a national security order. Never. And we will contest any attempt by the U.S. government or any government to disclose customer content stored exclusively in another place. That's our commitment."
Microsoft will notify business and government customers when it does receive legal orders, Turner added. "Microsoft will provide governments the ability to review our source code, to reassure themselves of its integrity and confirm no backdoors," he said.
The remarks were perhaps the best received by the audience during his one-hour speech, which also covered the progress Microsoft has made for its customers in numerous areas: the success of Office 365 and Azure, virtualization gains over VMware, and business intelligence, including last year's boost to SQL Server and the release of Power BI with its new push into machine learning. While Microsoft General Counsel Brad Smith has issued a variety of blog posts providing updates and assurances that the company will protect customer data, Turner's public remarks step up the tenor of Microsoft's position on the matter.
While Turner did not address former NSA contractor Edward Snowden by name, the remarks were a firm and public rebuke of last year's accusations that Microsoft provided backdoors to the government. Turner acknowledged that despite its 12-year-old Trustworthy Computing initiative, its Security Development Lifecycle and a slew of other security efforts, Microsoft needs to, and intends to, emphasize security further. "When you think about the cyber security issues, there's never been an issue like this past year," Turner said. "It is a CEO-level decision and issue."
Turner talked up Microsoft's existing efforts, including its ISO standard certifications, operational security assurance, Windows Defender, the Trusted Platform Module, BitLocker and various point products. He also played up the company's higher-level offerings such as assessments, threat detection and response services, and its digital crimes unit.
Microsoft has other security offerings and/or efforts in the pipeline, Turner hinted. "We will continue to strengthen the encryption of customer data across our network and services," he said. "We will use world-class cryptography and best-in-class cryptography to do so."
Posted by Jeffrey Schwartz on 07/14/2014 at 2:47 PM
If you thought Microsoft was looking to disrupt the storage landscape earlier this week when it launched its Azure StorSimple appliances, the company has just upped the ante. Microsoft is adding to its growing storage portfolio with the acquisition of InMage, a San Jose, Calif.-based provider of converged disaster recovery and business continuity infrastructure that offers continuous data protection (CDP). Terms weren't disclosed.
InMage is known for its high-end Scout line of disaster recovery appliances. The converged systems are available in various configurations with compute, storage and network interfaces. Its InMage-4000 is available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity, and supports 10GigE storage networking along with built-in Gigabit Ethernet connectivity.
Over time InMage will be rolled into the Microsoft Azure Site Recovery service to add scale to the company's newly added disaster recovery and business continuity offering. Microsoft had earlier announced plans to enable data migration to Azure with Scout, InMage's flagship appliance.
"InMage Scout continuously captures data changes in real time as they occur and performs local backup or remote replication simultaneously with a single data stream," a description on the company's Web site explained. "It offers instantaneous and granular recovery of data locally and enables push-button application level failovers to remote sites to meet local backup and/or remote DR requirements, thus going above and beyond the protection offered by conventional replication backup and failover automation products alone."
It also captures data from production servers into memory in real time, before it is written to disk, and moves it to the InMage Scout Server, eliminating any added I/O load from the backup or replication process. It also has built-in encryption, compression and WAN acceleration, and supports backups of Hyper-V, VMware ESX and Xen virtual machines.
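The capture-and-replay idea behind CDP -- journal every change with a timestamp so any point in time can be reconstructed -- can be sketched in a few lines. This is a purely conceptual toy, not InMage's implementation; the class and method names are invented:

```python
import time

class ChangeJournal:
    """Toy continuous-data-protection journal: every write is recorded as a
    timestamped change, so any point in time can be reconstructed later."""

    def __init__(self):
        self.changes = []  # list of (timestamp, offset, data) tuples

    def record(self, offset, data, ts=None):
        # Capture the change as it occurs; replication would ship this
        # same change stream to a remote site.
        self.changes.append((ts if ts is not None else time.time(), offset, data))

    def rebuild(self, size, as_of=None):
        """Replay changes up to 'as_of' to produce a point-in-time image."""
        image = bytearray(size)
        for ts, offset, data in self.changes:
            if as_of is None or ts <= as_of:
                image[offset:offset + len(data)] = data
        return bytes(image)
```

Because every change is retained rather than only periodic snapshots, recovery granularity is limited only by the journal -- the property that distinguishes CDP from conventional scheduled backup.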
The Scout portfolio also protects Linux and various Unix environments, and the company offers specialized appliances for Exchange Server, SAP, Oracle, SQL Server, SharePoint, virtualization and data migration.
"Our customers tell us that business continuity -- the ability to backup, replicate and quickly recover data and applications in case of a system failure -- is incredibly important," said Takeshi Numoto, Microsoft's corporate VP for cloud and enterprise marketing, in a blog post announcing the acquisition.
These products don't overlap. StorSimple offers primary storage with Azure as a tier, while InMage offers disaster recovery using the cloud or a secondary site as a target.
Posted by Jeffrey Schwartz on 07/11/2014 at 12:23 PM
When it comes to enterprise storage, companies such as EMC, Hewlett-Packard, Hitachi, IBM and NetApp may come to mind first. But there are a lot of new players out there taking a piece of the pie. Among them are Fusion-io, GridStore, Nimble Storage, Nutanix, Pure Storage, SolidFire and Violin Memory, to name a few high fliers. Another less obvious but potentially emerging player is Microsoft, which acquired storage appliance maker StorSimple in 2012.
As I noted a few weeks ago, Microsoft is aiming to commoditize hardware with software-defined storage. In recent months the company has also indicated it has big plans for making StorSimple a key component of its software-defined storage strategy, which of course includes Storage Spaces in Windows Server. Microsoft this week announced it is launching the Azure StorSimple 8000 Series, which consists of two arrays that offer tighter integration with the Microsoft Azure public cloud.
While Microsoft's StorSimple appliances always offered links to the public cloud, the new Azure StorSimple boxes with disks and flash-based solid-state drives use Azure Storage as an added tier of the storage architecture, enabling administrators to create virtual SANs in the cloud just as they do on premises. Using the cloud architecture, customers can allocate more capacity as needs require.
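The tiering idea -- hot data stays on the local array, cold data spills to the cloud and is recalled on access -- can be modeled with a simple least-recently-used policy. This is a toy illustration of the concept only, not StorSimple's actual algorithm; a plain dict stands in for the Azure tier:

```python
from collections import OrderedDict

class TieredStore:
    """Toy cloud-integrated tiering: the hottest blocks stay local; when
    local capacity is exceeded, the least recently used block spills to
    the cloud tier (here just a dict standing in for Azure Storage)."""

    def __init__(self, local_capacity):
        self.local_capacity = local_capacity
        self.local = OrderedDict()  # block_id -> data, ordered by recency
        self.cloud = {}

    def write(self, block_id, data):
        self.local[block_id] = data
        self.local.move_to_end(block_id)          # mark as most recently used
        while len(self.local) > self.local_capacity:
            cold_id, cold_data = self.local.popitem(last=False)
            self.cloud[cold_id] = cold_data       # tier cold data to the cloud

    def read(self, block_id):
        if block_id in self.local:
            self.local.move_to_end(block_id)
            return self.local[block_id]
        data = self.cloud.pop(block_id)           # recall from the cloud tier
        self.write(block_id, data)                # it is hot again, keep local
        return data
```

The design point is that capacity grows with the cloud tier while the local array acts as a cache for the working set, which is why customers can allocate more capacity as needs require.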
"The thing that's very unique about Microsoft Azure StorSimple is the integration of cloud services with on-premises storage," said Marc Farley, Microsoft's senior product marketing manager for StorSimple, during a press briefing this week to outline the new offering. "The union of the two delivers a great deal of economic and agility benefits to customers."
Making the new offering unique, Farley explained, are two new integrated services: the Microsoft Azure StorSimple Manager in the Azure portal and the Azure StorSimple Virtual Appliance. "It's the implementation of StorSimple technology as a service in the cloud that allows applications in the cloud to access the data that has been uploaded from enterprise datacenters by StorSimple arrays," Farley explained.
The StorSimple 8000 Series lets customers run applications in Azure that access snapshot virtual volumes matching the VMs on the on-premises arrays. It supports Windows Server and Hyper-V as well as Linux and VMware-based virtual machines. However, unlike earlier StorSimple appliances, the new offering connects only to Microsoft Azure -- not to other cloud service providers such as Amazon Web Services. Farley didn't rule out future releases enabling virtual appliances in other clouds.
The aforementioned new StorSimple Manager consolidates the management and views of the entire storage infrastructure consisting of the new arrays and the Azure Virtual Appliances. Administrators can also generate reports from the console's dashboard, letting them reallocate storage infrastructure as conditions require.
Farley emphasized that the new offering is suited for disaster recovery, noting it offers "thin recoveries." Data stored on the arrays in the datacenter can be recovered from copies of the data stored in the Azure Virtual Appliances.
The arrays support iSCSI connectivity as well as 10Gb/s Ethernet and inline deduplication. When using the Virtual Appliance, administrators can see file servers and create a virtual SAN in the Azure cloud. "If you can administer a SAN on-premises, you can administer the virtual SAN in Azure," Farley said.
Microsoft is releasing two new arrays: the StorSimple 8100, which has 15TB to 40TB of capacity (depending on the level of compression and deduplication implemented) and the StorSimple 8600, which ranges from 40TB to 100TB with a total capacity of 500TB when using Azure Virtual Appliances.
The StorSimple appliances are scheduled for release next month. Microsoft has not disclosed pricing, but per-GB pricing will be higher than that of the Microsoft Azure blob storage offering, taking into account bandwidth and transaction costs.
Posted by Jeffrey Schwartz on 07/09/2014 at 1:50 PM
A study commissioned by VMware finds enterprise users "overwhelmingly" prefer Macs over Windows PCs. According to the survey of 376 IT professionals conducted by Dimensional Research, 71 percent of enterprises now support Macs and 66 percent have employees who use them in the workplace.
VMware, which of course has a vested interest in the demise of traditional Windows PCs in the enterprise, didn't ask to what extent Macs are deployed within respondents' organizations. While the share of Macs in use overall has increased over the years, IDC data shows it dropped slightly to 10.1 percent last quarter from 11 percent a year earlier -- though that may reflect the overall decline in PC hardware sales over the past year. Nevertheless, with more employees using their personal Macs at work and execs often preferring them over PCs, their presence in the workplace continues to rise.
Consequently, VMware is asserting that the findings show the dominance of Windows is coming to an end. "For companies, the choice is very clear -- they need to respond to end-user demand for Macs in the enterprise or they will find it difficult to recruit and retain the best talent on the market," said Erik Frieberg, VMware's VP of marketing for End-User Computing in a blog post last week. "They also need to provide IT administrators the tools to support a heterogeneous desktop environment. Otherwise there will be disruption to the business."
Despite user preference, the VMware study shows that 39 percent of the IT pros surveyed believe Macs are more difficult to support and 75 percent don't believe they are any more secure. "While employees clearly prefer Macs, there are challenges from an IT perspective that Macs must overcome before they can replace Windows PCs in the enterprise," Frieberg noted.
Exacerbating the challenge, 47 percent said only some applications that employees need to do their jobs run on Macs and 17 percent report none of their apps can run on Macs.
That trend is good news for Parallels, whose popular Parallels Desktop for Mac allows Windows to run as a virtual machine on Macs. I happened to catch up with Parallels at TechEd in Houston in May, where the company also announced a management pack for Microsoft's System Center Configuration Manager (SCCM). The new tool gives admins full visibility into Macs on an enterprise network, Parallels claims. In addition to network discovery, it includes a self-service application portal and an OS X configuration profile editor, and it can enable FileVault 2 encryption. The management pack can also deploy packages and prebuilt OS X images as well as configuration files.
VMware naturally sees these findings as lending credibility to its desktop virtualization push, including its Fusion Professional offering, which lets IT create virtual desktops for PCs and Macs, as well as its Horizon desktop-as-a-service offerings.
The survey also found that less than half (49 percent) unofficially support user-owned PCs, while 27 percent officially have such policies in place. The remaining 24 percent don't support user-owned PCs.
Are you seeing the rise of Macs in your organization? If so, would you say an invasion of Macs in the enterprise is coming? How are you managing Macs within your shop?
Posted by Jeffrey Schwartz on 07/07/2014 at 9:30 AM
CA Technologies on Monday said it is selling off its Arcserve data protection software business to Marlin Equity Partners, whose holdings include Changepoint, Critical Path, Openwave, Tellabs and VantagePoint.
The new company will take on the Arcserve name. Mike Crest, the current general manager of CA's Arcserve business, will become CEO of the new company. Terms of the deal, expected to close at the end of this calendar quarter, were not disclosed.
Developed more than two decades ago by Cheyenne Software, which CA acquired in 1996, Arcserve has a following of enterprise customers who use it to protect mission-critical systems.
CA just released the latest version of Arcserve Unified Data Protection (UDP), which is available as a single offering for Linux, Unix and Windows systems. It includes extended agentless protection for Hyper-V and VMware virtual environments. However, the backup and recovery market has become competitive, and CA has been divesting a number of businesses since its new CEO, Mike Gregoire, took over last year.
Marlin has $3 billion in capital under management. "We are committed to providing the strategic and operational support necessary to create long-term value for Arcserve and look forward to working closely with CA Technologies through the transition," said Marlin VP Michael Anderson in a statement.
In a letter to partners, Chris Ross, VP for worldwide sales for CA's data-protection business, said the move will benefit Arcserve stakeholders. "Greater focus on company business functions, R&D and support will mean higher levels of service and customer satisfaction," Ross said. "Simply put, the new company will be purpose-built end-to-end for Arcserve's unique target customer segment, partner model and overall strategy."
Posted by Jeffrey Schwartz on 07/07/2014 at 9:32 AM
The preview of the next version of Windows could appear in the next few months and will have improvements for those who primarily use the traditional desktop environment for Win32-based applications, according to the latest rumors reported Monday.
"Threshold," which could be branded as Windows 9 (though that's by no means certain) will target large audience of Windows 7 user who want nothing to do with the Windows 8.x Modern user interface, according to a report by Mary Jo Foley in her All About Microsoft blog. At the same time, Microsoft will continue to enhance the Modern UI for tablet and hybrid laptop-tablet devices.
To accomplish this, the Threshold release will have multiple SKUs. For those who prefer the classic desktop and want to run Win32 apps, one SKU will put that front and center, according to Foley. Hybrid devices will continue to support switching between the Modern UI (also referred to as "Metro") and the more traditional desktop interface. And another SKU, aimed at phones and tablets only, will not have a desktop component, which may prove disappointing to some (myself included) who use tablets such as the Dell Venue 8 Pro. At the same time, it appears that SKU will be used for some Nokia tablets and one might presume a future Surface 3 model (and perhaps for a "mini" form factor).
As previously reported, Threshold will get a new Start menu. Microsoft in April released Windows 8.1 Update, which added a Start screen and various improvements for keyboard and mouse users, but no Start menu. Foley pointed out that the mini Start menu that was demonstrated at Microsoft's Build conference in April is expected to be customizable.
The Threshold release is expected to arrive in the spring of 2015. Meanwhile, Foley also noted a second and final Windows 8.1 Update is expected to arrive next month for Patch Tuesday, though users will have the option of opting out.
Though details of how Microsoft will improve the classic Windows desktop remain to be seen, this should be welcome news to Windows 7 shops (and perhaps some Windows XP holdouts) making long-term migration and system upgrade plans. Our research has suggested all along that shops that plan to pass on Windows 8.x will consider Windows Threshold.
Microsoft said it had no comment on the report.
Posted by Jeffrey Schwartz on 07/02/2014 at 8:09 AM
Microsoft's Surface Pro 3 could benefit all types of workers looking for a laptop that they can also use as a tablet. Among them are SharePoint administrators.
As soon as the new Surface Pro 3s went on sale at Best Buy 10 days ago, Tamir Orbach, Metalogix's director of product management for SharePoint migration products, went out and bought one. Having seen my first-look write-up last week, he reached out, wanting to share his observations on the device in general and why he believes every SharePoint administrator would benefit from having one.
Many of his customers who are SharePoint administrators tend to have a small, low-end Windows tablet or iPad plus a heavy laptop or desktop on their desks. Orbach believes the Surface Pro 3's high resolution, light weight and the coming availability of a model with an Intel Core i7 processor and 8GB of RAM will make the device suitable as a SharePoint administrator's only PC and tablet.
"Pretty much all of us professionals want or need both a laptop or desktop and a slate," Orbach said. "It's so light that you can carry it anywhere you want and you would barely even feel it. And the screen is big enough, the resolution is good, the functionality is powerful enough to be used as our day-to-day computer."
We chatted about various aspects of the device:
- New keyboard: The new keyboard is bigger, and we both agreed that the ability to lock it at an angle is a significant improvement over previous versions, which could only be used flat. Orbach said one downside to the new angle is that you can feel the keyboard bounce -- true, but not that bad in my opinion. "I'd definitely take it over the flat one though," he said.
- Cost and configuration: Orbach bought the unit configured with a 128GB SSD and 4GB of RAM, which cost $999 plus $129 for the keyboard. A SharePoint administrator would be better off with at least the 256GB drive and 8GB of RAM, but that carries a $300 premium. For one with an i7 processor, you're up to $1,549 without the keyboard.
- Docking station: If the Surface Pro 3 becomes your only computer, it would be worth adding the docking station if you have a primary work area.
If you're a SharePoint administrator or any type of IT pro, do you think the Surface Pro 3 would help you do your job better?
Posted by Jeffrey Schwartz on 06/30/2014 at 12:28 PM
I think it's great that Microsoft is now offering 1TB of capacity in its OneDrive service for Office 365, but that only makes the proverbial haystack larger, given that the new Windows 8.x modern UI lacks a suitable way to find files.
The major drawback of using OneDrive is that it doesn't sort files by the date they were last modified -- which, I think it's fair to say, is how most individuals want to access the files they're working on. Instead, the modern UI sorts them alphabetically. If you have hundreds or thousands of files in a folder, good luck finding a specific one the way you're accustomed to.
Sure you can use the search interface or go to the traditional desktop (and that's what I'm forced to do). But if Microsoft wants to get people to use its Windows 8.x UI, wouldn't it make sense to make it easier to use?
Now if you use OneDrive for Business with SharePoint Online, it's my understanding that you do have the ability to do custom sorts in the modern UI. So why not offer the same capability in the core OneDrive offering? Surely if Microsoft wants to stave off cloud file service providers such as Dropbox, this would be an easy way to help do so.
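For what it's worth, the ordering users want is trivial to produce. This hypothetical snippet shows the most-recently-modified sort the modern UI lacks; the file names and dates are made up for illustration:

```python
from datetime import datetime

# Hypothetical file entries: (name, last-modified time)
files = [
    ("budget.xlsx", datetime(2014, 6, 1)),
    ("draft.docx",  datetime(2014, 6, 25)),
    ("notes.txt",   datetime(2014, 5, 12)),
]

# Most recently modified first -- the order most users expect.
by_recency = sorted(files, key=lambda f: f[1], reverse=True)
print([name for name, _ in by_recency])
# → ['draft.docx', 'budget.xlsx', 'notes.txt']
```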
Do you agree?
Posted by Jeffrey Schwartz on 06/27/2014 at 11:07 AM
Infrastructure systems management provider SolarWinds is extending into the Web performance monitoring market with the acquisition of Pingdom. Terms of the deal, announced last week, weren't disclosed. Pingdom's cloud-based service monitors the performance and uptime of Web servers and sites.
Web performance monitoring is a natural extension of SolarWinds' business and a key requirement for those managing infrastructure, said SolarWinds Executive VP Suaad Sait.
"Our product strategy has always been around delivering IT infrastructure and application performance management products to the market," Sait said. "We had a hole in our portfolio where we wanted to extend the capabilities for cloud-based applications and Web sites. We heard this request from our customers as well from the macro market. Instead of building it organically, we looked for a really good partner and that led to us acquiring Pingdom."
Sait said SolarWinds proactively looked for a company to acquire for two reasons. One is that Web sites are becoming critical, not only as a company's public presence but for lead generation. Second, ensuring the availability of Web-based applications from remote sites requires that they be monitored. "Pingdom rose to the top of the kind of company that fits into the market we serve but also our business model," he said.
Pingdom is a cloud-based offering that the company claims has 500,000 users. The service monitors Web sites, DNS, e-mail servers and other infrastructure; setup requires only a URL. The deal has already closed, but Sait said the company hasn't determined its integration roadmap for Pingdom.
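At its core, a service like Pingdom probes a URL on a schedule, times the response and classifies the result. This sketch shows the general shape of such a check; the thresholds and function names are illustrative, not Pingdom's actual logic:

```python
import time
import urllib.request

def classify(status_code, elapsed_s, slow_threshold_s=2.0):
    """Classify one probe result: 5xx means down, a slow response is degraded."""
    if status_code >= 500:
        return "down"
    if elapsed_s > slow_threshold_s:
        return "slow"
    return "up"

def probe(url):
    """Fetch the URL, time the round trip and classify the outcome."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
    except OSError:
        return "down"  # connection refused, DNS failure, timeout, etc.
    return classify(status, time.monotonic() - start)
```

A real monitoring service would run probes like this from many geographic locations and alert only when several agree the site is down.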
Posted by Jeffrey Schwartz on 06/27/2014 at 11:11 AM
While Microsoft has extended the storage features in Windows Server and its Azure cloud service for years, the company is stepping up its ability to deliver software-defined storage (SDS). Experts and vendors have various opinions on what SDS is, but in effect it pools all hardware and cloud services and automates processes such as tiering, snapshotting and replication.
During a webinar yesterday presented by Gigaom Research, analyst Anil Vasudeva, president and founder of IMEX Research, compared SDS to server virtualization. "Software-defined storage is a hypervisor for storage," Vasudeva said. "What a hypervisor is to virtualization for servers, SDS is going to do it for storage. All the benefits of virtualization, the reason why it took off was basically to create the volume-driven economics of the different parts of storage, servers and networks under the control of the hypervisor."
Prominent storage expert Marc Staimer, president and chief dragon slayer of Dragon Slayer Consulting, disagreed with Vasudeva's assessment. "In general, server virtualization was a way to get higher utilization out of x86 hardware," he said. "The concept of a hypervisor, which originally came about with storage virtualization, didn't take off because what happened with storage virtualization [was] the wonderful storage systems that were being commoditized underneath a storage virtualization layer. What you're seeing today is you're commoditizing the hardware with software-defined storage."
Organizations are in the early stages when it comes to SDS. A snap poll during the webinar found that 18 percent have on-premises SDS deployed, 11 percent have a hybrid cloud/on-premises SDS in place and 32 percent are using it indirectly via a cloud provider. Gigaom research director Andrew Brust, who moderated the panel discussion, warned that the numbers are not scientific, but participants agreed the findings are in line with what they're seeing.
Siddhartha Roy, principal group program manager for Microsoft (which sponsored the webinar), agreed it is the early days for SDS, especially among enterprises. "Enterprises will be a lot more cautious for the right reasons, for geopolitical or compliance reasons. It's a journey," Roy said. "For service providers who are looking at cutting costs, they will be more assertive and aggressive in adopting SDS. You'll see patterns vary in terms of percentages but the rough pattern kind of sticks."
SDS deployments may be in their early stages today but analyst Vasudeva said it's going to define how organizations evolve their storage infrastructure. "Software defined storage is a key turning point," he said. "It may not appear today but it's going to become a very massive change in our IT and datacenters and in embracing the cloud."
Both analysts agreed that the earliest adopters of SDS in cloud environments, besides service providers, will be small and midsize businesses. For Microsoft, its Storage Spaces technology in Windows Server is a core component of its SDS architecture. Storage Spaces lets administrators virtualize storage by grouping commodity drives into pools that are carved into virtual disks, which are then exposed remotely to an application cluster over the Server Message Block (SMB) 3.0 protocol.
"That end to end gives you a complete software-defined stack, which really gives you the benefit of a SAN array," Roy said. "We were very intentional about the software-defined storage stack when we started designing this from the ground up."
Meanwhile, as reported last week, Microsoft released the Azure Site Recovery preview, which lets organizations use the public cloud as an alternative to a secondary datacenter or hot site, and it has introduced Azure Files for testing. Azure Files exposes file shares using SMB 2.1, making it possible for apps running in Azure to more easily share files between virtual machines using standard APIs such as ReadFile and WriteFile; the shares can also be accessed via a REST interface to enable hybrid implementations.
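The appeal of the SMB approach is that applications need no special SDK: once a share is mounted inside a VM, ordinary file APIs work against it. The sketch below illustrates that idea; the share path is a hypothetical stand-in (a local temp directory) so the example is self-contained, not an actual Azure Files endpoint.

```python
import os
import tempfile

# Hypothetical stand-in for an Azure Files SMB share mounted inside a VM
# (e.g. \\myaccount.file.core.windows.net\myshare). A local temp directory
# is used here so the sketch runs anywhere.
share_path = tempfile.mkdtemp(prefix="azure-files-demo-")

def write_shared(name: str, data: bytes) -> None:
    """One VM writes a file to the share using ordinary file APIs."""
    with open(os.path.join(share_path, name), "wb") as f:
        f.write(data)

def read_shared(name: str) -> bytes:
    """Another VM reads the same file back via the same standard APIs."""
    with open(os.path.join(share_path, name), "rb") as f:
        return f.read()

write_shared("config.json", b'{"env": "test"}')
print(read_shared("config.json"))
```

The same share contents would also be reachable through the REST interface from outside Azure, which is what enables the hybrid scenarios Microsoft describes.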
Is SDS in your organization's future?
Posted by Jeffrey Schwartz on 06/25/2014 at 12:49 PM
Equinix, which operates the largest global network of colocation facilities with over 100 datacenters, plans to join the Open Compute Project and implement some of its specs by early next year. Open Compute is a consortium initiated by Facebook with a large roster of members that includes AMD, Dell, Hewlett-Packard, Intel, Microsoft, Rackspace and VMware.
Ihab Tarazi, CTO of Equinix, said the company hasn't officially announced its plan to join Open Compute, but it probably won't come as a major surprise to observers since Facebook is one of its largest customers. Tarazi said the decision to participate goes beyond Facebook. "Our model is to support the needs of our customers," he said. "There's a whole community on Open Compute we're going to support."
Among them is Microsoft, which also has a major partnership with Equinix, one of several interconnection partners the company announced at its TechEd conference last month. With Microsoft's new ExpressRoute service, the company will provide dedicated high-speed links to Equinix datacenters. Microsoft joined Open Compute earlier this year, saying it plans to share some of its Windows Azure datacenter designs as part of that effort.
I sat down with Tarazi, who was in New York for a company investor conference. Despite jumping on the standards bandwagon, Tarazi said he agrees with comments by Microsoft Azure General Manager Steven Martin, who in a speech earlier this month said, "you have to innovate then commoditize and then you standardize."
In his speech Martin added: "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."
Tarazi concurred. "Innovation takes off faster if you are not waiting on a standard, which is what Steve was saying," he said. "As long as you are able to still deliver the service that is the best way to go. You have to sometimes go for a standard where it's impossible to deliver the service without connectivity or standard."
There's good reason for Tarazi to agree. Equinix is stitching together its own Cloud Exchange, announced in late April, with the goal of providing interconnection among multiple cloud providers. In addition to Microsoft, which has started rolling out interconnectivity to some Azure datacenters (with all planned by year's end), Cloud Exchange also connects to Amazon Web Services through its Direct Connect dedicated links.
Others announced include telecommunications providers Orange, Level 3 Communications and TW Telecom (which Level 3 agreed to acquire last week). Tarazi said the company is in discussions with all of the players that have operations in its datacenters. "We have 970-plus networks inside our datacenters," he said. "All of those connect to Microsoft in one way or another."
Though he agreed with Martin that there's a time for standards, Tarazi apparently believes that time has come not only for Open Compute but also for the OpenStack platform. "If you want to move workloads between them, we're going to make that very simple," Tarazi said. "Or if you want to have a single connection and get to all of them, that's really doable as well."
Tarazi said Equinix also plans to support the IETF's Network Configuration Protocol (NETCONF) and the YANG modeling language to ease device and network configuration.
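NETCONF configures devices by exchanging XML-encoded RPCs whose payloads are shaped by YANG models. As a rough, hedged illustration of what that looks like on the wire, the sketch below builds a minimal `<edit-config>` RPC with Python's standard library; the "interface" payload is a hypothetical YANG-modeled node, and in practice a client library such as ncclient would send this to a device over SSH.

```python
import xml.etree.ElementTree as ET

# NETCONF base namespace defined in RFC 6241.
NC = "urn:ietf:params:xml:ns:netconf:base:1.0"

# Build the <rpc><edit-config> envelope targeting the candidate datastore.
rpc = ET.Element(f"{{{NC}}}rpc", {"message-id": "101"})
edit = ET.SubElement(rpc, f"{{{NC}}}edit-config")
target = ET.SubElement(edit, f"{{{NC}}}target")
ET.SubElement(target, f"{{{NC}}}candidate")
config = ET.SubElement(edit, f"{{{NC}}}config")

# Hypothetical configuration payload, shaped by an assumed YANG module
# for interfaces (not an actual Equinix or IETF model).
iface = ET.SubElement(config, "interface")
ET.SubElement(iface, "name").text = "eth0"
ET.SubElement(iface, "mtu").text = "1500"

payload = ET.tostring(rpc, encoding="unicode")
print(payload)
```

The point of the protocol is exactly this kind of structured, model-driven configuration in place of device-specific CLI scripting.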
Posted by Jeffrey Schwartz on 06/25/2014 at 3:06 PM
I don't make a habit of attending grand opening ceremonies but when Microsoft opened its second retail store in my backyard Saturday, I decided to accept the company's invitation to check it out. Microsoft opened one of the largest stores to date at the Roosevelt Field mall in Garden City, N.Y. on Long Island (right outside New York City). It's the fifth store in New York and arrives less than two years after opening area locations in Huntington, N.Y. (also on Long Island) and in White Plains, N.Y. in Westchester County. Roosevelt Field is the largest shopping mall in the New York metro area and the ninth largest in the U.S., according to Wikipedia.
The store that opened this weekend is one of the company's largest at 3,775 square feet and 41 employees. It coincidentally opened a day after Friday's Surface Pro 3 launch. "It just worked out that way," said Fazal Din, the store manager, when asked if the opening was timed in coordination with the launch. "But it's a great way to open the store."
While Microsoft's retail stores are primarily intended to draw consumers and are often strategically located near Apple Stores (as this one is), the stores are also targeting business customers, Din said. "We want this store to be the IT department for small businesses," Din said. The store is also reaching out to local partners, he added.
Microsoft corporate VP Panos Panay was on hand for the ribbon cutting ceremony Saturday, where a number of customers later asked him to autograph their new Surface Pro 3s. "This is not only the 98th store, but it's also the 12th store in the tri-state area [New York, New Jersey and Connecticut]. It's kind of a big deal," Panay said. "This is a great area for Microsoft to show its technologies."
Hundreds, if not a few thousand, teenagers camped outside the store in the enclosed mall to score tickets for a free Demi Lovato concert Microsoft arranged in the outside parking lot. The company also gave $1 million in donations to the local divisions of several charities including Autism Speaks, United Way, Variety Child Learning Center and the Girl Scouts of Nassau County.
Nine additional stores are in the pipeline, one of which opens this Thursday in The Woodlands, Texas. Most of the stores are in the U.S., with a few in Canada and Puerto Rico. By comparison, Apple, which started opening stores years before Microsoft, has an estimated 424 stores worldwide, 225 of them in the U.S. With retail sales of over $20 billion, Apple's stores represent 12 percent of the company's revenues. Like Apple and Samsung, Microsoft also has its own specialty departments in Best Buy stores.
Though Microsoft is touting the 98th store, by my count only 59 are full retail stores. The rest are smaller specialty stores. It appears Microsoft is largely opening retail stores in the suburbs of large cities rather than in urban locations. For example the only location in New York City is a specialty store in the Time Warner Center in Columbus Circle.
Posted by Jeffrey Schwartz on 06/23/2014 at 1:28 PM
The preview of Microsoft's Azure Site Recovery is now available, the company said on Thursday. Among numerous offerings announced at last month's TechEd conference in Houston, Azure Site Recovery is the company's renamed Hyper-V Recovery Manager for disaster recovery.
But as I reported, Azure Site Recovery is more than just a name change. It represents Microsoft's effort to make Azure a hot site for data recovery. While Hyper-V Recovery Manager, released in January, provides point-to-point replication, Microsoft says Azure Site Recovery aims to eliminate the need to have a secondary datacenter or hot site just for backup and recovery.
"What if you don't have a secondary location?" Matt McSpirit, a Microsoft technical product manager, asked that question during the TechEd opening keynote. "Microsoft Azure Site Recovery, [provides] replication and recovery of your on-premises private clouds into the Microsoft Azure datacenters."
The original Hyper-V Recovery Manager required a secondary datacenter. "When first released, the service provided for replication and orchestrated recovery between two of your sites, or from your site to a supporting hoster's site," the company said in a blog post Thursday. "But now you can avoid the expense and complexity of building and managing your own secondary site for DR. You can replicate running virtual machines to Azure and recover there when needed."
Microsoft says both services offer automated protection, continuous health monitoring and orchestrated recovery of applications. Azure Site Recovery also protects Microsoft System Center Virtual Machine Manager clouds by setting up automated, policy-based replication of the VMs. It integrates with Microsoft's Hyper-V Replica and the new SQL Server AlwaysOn feature.
The service monitors SCVMM clouds remotely and continuously, according to Microsoft. All links with Azure are encrypted in transit, with the option of encrypting replicated data at rest. Microsoft also said administrators can recover VMs in an orchestrated manner to enable quick recoveries, even in the case of multi-tier workloads.
Customers can test it in the Azure Preview Portal.
Posted by Jeffrey Schwartz on 06/20/2014 at 12:08 PM
A month after introducing the new Surface Pro 3 -- which Microsoft advertises as the tablet designed to replace your laptop -- the device is now available for purchase at select retail locations. But the first batch of units will require a quick firmware update to address an issue in which the Surface Pro 3 would occasionally fail to boot even when fully charged.
After spending a month with the Surface Pro 3, I can say the device is a really impressive improvement over the first two versions. It's bigger yet still portable, weighing 1.76 pounds in a much thinner form factor. And it has a much more usable keyboard. See my take, which appears in the forthcoming July issue of Redmond magazine.
I didn't mention the boot problem because I hadn't experienced it when my review went to press. In recent weeks, I have experienced the bug quite regularly. When the problem occurs, typically after Windows goes into sleep mode, I have eventually managed to boot the device, though it has taken anywhere from 15 to 30 minutes. Microsoft last week shared a tip on how to do it faster: a strange combination of pressing the volume button in the up position and holding the power button for 10 seconds with the power adapter plugged in. I initially thought I had a flawed unit, but Microsoft said it was a common problem and that the firmware upgrade now available aims to fix it.
Microsoft says the firmware update fixes the power problem, and it's easy enough to install. Just go to the Settings charm, touch or click Update and Recovery, and then check for a Windows Update. I attempted to run it last night but received an error message saying to make sure the system is fully charged and try again. I did so this morning without incident, though it's too early to say whether the patch worked for me.
Microsoft also issued an update that lets Surface Pen users double-click the pen to capture and save screen grabs, which should be welcome since there's no Print Screen button on the keyboard. This requires installing the June 10 Windows and OneNote updates. Users can also use the included Surface Pen to wake the machine right into OneNote and start taking notes.
In my first look article, I noted occasional problems with the system failing to find a network connection; Device Manager would sometimes indicate there was no network adapter at all. It wasn't clear whether this problem was unique to my test unit or a universal one -- it turns out others have reported the issue as well, Microsoft confirmed. The workaround is to reboot, but a spokeswoman for Microsoft said a patch for that problem is forthcoming.
Units with the Intel Core i5 processors are available at Best Buy stores and the Microsoft Store (both in retail locations and online). Versions with i3 and i7 processors will ship in August, with preorders open now. The i7 model is good if you'll be using the Adobe Creative Cloud suite, part of which the company this week optimized for photographers using the Surface Pro 3. The i5 will appeal to most mainstream workers who don't want or need to use it for any complex photo or video editing or computer-aided design (CAD) work.
If you get to a Best Buy or Microsoft Store near you, check it out and share your thoughts.
Posted by Jeffrey Schwartz on 06/20/2014 at 10:32 AM
When Microsoft said it was targeting MacBook users with its new Surface Pro 3 last month, the company demonstrated how much lighter its latest device is by putting the two on a balancing scale. But to really tip the scales for the new tablet PC, Microsoft also talked up its new partnership with Adobe to enhance Photoshop and the rest of the Creative Cloud suite for the new Surface.
Adobe today delivered on that promise with the launch of its new Creative Cloud Photography plan, offered at a starting price of $9.99 per month. The plan includes the new Photoshop CC 2014 release, which is now optimized for Windows 8.1 for those who want to use an electronic stylus or touch.
"We're offering 200 percent DPI support, as well as touch gestures," said Zorana Gee, Adobe's senior product manager for Photoshop during a media briefing. "All of the touch gestures you were able to experience [on traditional PCs and Macs] -- pan, zoom, scale, etc., will now be supported on the new Windows 8 platform."
The optimization for the Surface Pro 3 is available in Photoshop 2014, though the company indicated it was looking to optimize other apps in the suite over time as well.
Adobe last year took the bold step of saying it would only offer its entire Creative Suite of apps, which include Photoshop, Dreamweaver, Illustrator and InDesign, as a cloud-based software as a service. At the Surface launch last month, Adobe was among a number of software vendors including Autodesk and SAP that said they're optimizing their apps for the touch and gesture features in Windows 8.x.
"It's really, really easy to interact with the screen," said Michael Gough, Adobe's VP of Experience Design, when demonstrating the new Windows 8.1-enabled Photoshop at the Surface Pro 3 launch. "The pen input is natural. The performance is great -- both the performance of the software and the hardware working together."
While Photoshop is bundled with the new plan and is optimized for Surface, the subscription also includes Lightroom and Lightroom mobile, which Adobe has designed for use with Apple's iPhone and iPad.
The new Photoshop release also has a number of other new features including content-aware color adaption improvements, Blur gallery motion effects, Perspective Warp and Focus Mask. The latter automatically selects areas in an image that are in focus. It's suited for headshots and other images with shallow depth of field.
Posted by Jeffrey Schwartz on 06/18/2014 at 8:34 AM
If you're wondering where Microsoft stands with cloud standardization efforts such as OpenStack and Cloud Foundry, the general manager for Microsoft Azure gave his take, saying providers should innovate first. In the keynote address at this week's Cloud Computing Expo Conference in New York, Microsoft's Steven Martin questioned providers that have emphasized the development of cloud standards.
"I think we can agree, you have to innovate, then commoditize and then you standardize," Martin said. "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."
The remarks are clearly a veiled reference to major IT providers offering public cloud services, such as IBM, Hewlett-Packard and Rackspace, along with VMware, which is pushing its Cloud Foundry effort. Amazon, Microsoft and Google have the largest cloud infrastructures. According to a report in The New York Times Thursday, Amazon and Google each have 10 million servers in their public clouds, while Microsoft Azure has 1 million in 12 datacenters today, with 16 planned to be operational by year's end. Despite the large footprints, none of the big three has pushed running a standard cloud platform stack.
Martin's statements about standards were rather telling, considering Microsoft has always had little to say publicly about OpenStack and Cloud Foundry. While Microsoft has participated in OpenStack working groups and has made Hyper-V compatible with OpenStack clouds, the company has never indicated either way whether it sees its Azure cloud ever gaining some OpenStack compatibility, either natively or through some sort of interface.
"The best thing about cloud technology is in addition to the data, in addition to the access, is the market gets to decide," he said. "The market will pick winners and losers in this space, and we will continue to innovate." Asked after his session if he sees Microsoft ever supporting OpenStack, Martin reiterated that "we'll let the market decide."
Not long ago, one might have construed that as Microsoft talking up its proprietary platforms. However, Martin was quick to point out that Microsoft Azure treats Linux like a first-class citizen. "Microsoft will use the technologies that make sense and contribute them back to the public," he said. "What will matter is going to be the value that people generate, and how strong and robust the systems are, and the service level agreements you can get."
Posted by Jeffrey Schwartz on 06/13/2014 at 11:14 AM
As I reported the other day, Microsoft is getting tougher on surveillance reform. Later that day, Microsoft stepped up its battle against government overreach by releasing a court filing seeking to overturn an order to turn over e-mail stored in its Dublin datacenter. In the appeal, released Monday, Microsoft argues the search warrant is in violation of international law.
It's believed Microsoft's move is the first time a major company has challenged a domestic search warrant for digital information overseas, The New York Times reported today. Privacy groups and other IT providers are concerned over the outcome of this case, according to the report, noting it has international repercussions. Foreign governments are already concerned their people's data are not adequately protected.
Microsoft filed its objection in the U.S. District Court for the Southern District of New York last Friday, saying that if the warrant to turn over the e-mail stored abroad is upheld, it "would violate international law and treaties, and reduce the privacy protection of everyone on the planet." The case involves a criminal inquiry in which a federal judge granted a search warrant in New York back in December. The customer's identity and country of origin are not known.
Search warrants seeking data overseas are rare, according to The Times report, but granting one could pave the way for further cases and international conflicts at a time when foreign governments are already unnerved by U.S. surveillance activities. In its latest filing, Microsoft is seeking a reversal of the warrant. The report said the case could go on for some time, with oral arguments scheduled for July 31.
The case could also increase pressure to revise the Electronic Communications Privacy Act of 1986, which was written before international electronic communications over the Internet were common.
Posted by Jeffrey Schwartz on 06/11/2014 at 11:54 AM
A year after Edward Snowden stunned the world with revelations that the National Security Agency (NSA) had a widespread digital surveillance effort, including the covert PRISM eavesdropping and data-mining program, Microsoft marked the anniversary last week by saying it had unfinished business in the quest for government reforms.
While most cynics presumed intelligence agencies intercepted some communications, Snowden exposed what he and many others believe was broad overreach by the government. Many of the revelations that kicked off a year ago last Thursday even put into question the legality of the NSA's activities and the restrictions imposed on the telecommunications and IT industries by the Foreign Intelligence Surveillance Act (FISA).
The leaked documents implicated the leading telcos, along with Microsoft, Google, Facebook, Yahoo and many others, saying they were giving the feds broader access to e-mail and communications of suspected terrorists than previously thought. While the government insisted it was only acting when it believed it had probable cause and on court orders, the NSA's broad activities and the compliance of Microsoft and others called into question how private our data is when shared over the Internet, even when stored in cloud services.
Whether you think Snowden is a hero for risking his life and liberty to expose what he believed defied core American freedoms, or you feel he committed treason, as Netscape cofounder and Silicon Valley heavyweight Marc Andreessen believes, the way the world views and treats its data and communications is forever changed.
The revelations were a setback for Microsoft's efforts to move forward with its "cloud-first" transformation because the leaked NSA documents showed the company was among those that often had to comply with court orders without the knowledge of those under suspicion. To his credit, Microsoft General Counsel Brad Smith used the revelations to help push for a stop to the objectionable activities.
Both Microsoft and Google last week marked the anniversary by showing the progress both companies have made. Google used the occasion to announce its new end-to-end encryption plugin for the Google Chrome browser and a new section in its Transparency Report that tracks e-mail encryption by service providers. Google announced it is using the Transport Layer Security (TLS) protocol to encrypt e-mail across its Gmail service. Its reason for issuing the Transparency Report was to point out that a chain is only as strong as its weakest link, hoping the data would pressure all e-mail providers to follow suit. Last week the report showed Hotmail and Outlook.com implementing TLS only 50 percent of the time.
Microsoft has lately emphasized that it is stepping up its encryption efforts this year. Encryption for Office 365 is coming, Microsoft said last month. The company will offer 2048-bit keys with Perfect Forward Secrecy as the default for encryption across Office 365, Azure, Outlook.com and OneDrive. Next month Microsoft will also roll out new technology for SharePoint Online and OneDrive for Business that moves from a single encryption key per disk to a unique encryption key for each file.
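The per-file key model limits the blast radius of a compromise: losing one file's key exposes one file, not the whole disk. The sketch below illustrates the concept only, not Microsoft's actual implementation; the `file_key` derivation scheme and master-key arrangement are illustrative assumptions.

```python
import hashlib
import hmac
import os

# Stand-in for a service-held master secret; a real service would keep
# this in a key management system, not in process memory like this.
master_key = os.urandom(32)

def file_key(master: bytes, file_id: str) -> bytes:
    """Derive a unique per-file key from the master key and a file
    identifier, rather than using one key for an entire disk."""
    return hmac.new(master, file_id.encode(), hashlib.sha256).digest()

key_a = file_key(master_key, "reports/q1.docx")
key_b = file_key(master_key, "reports/q2.docx")

# Each file gets its own key, and re-deriving for the same file is stable,
# so the service never has to store millions of keys directly.
assert key_a != key_b
assert key_a == file_key(master_key, "reports/q1.docx")
print(len(key_a) * 8, "bit key per file")
```

A compromised `key_a` here says nothing about `key_b`, which is the security property the move away from per-disk keys is after.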
Shortly after the Snowden revelations, Microsoft, Google and others filed a lawsuit challenging the Foreign Intelligence Surveillance Act's stipulation that made it illegal for the companies to be more transparent. In exchange for dropping that lawsuit, Microsoft and others were able to make some limited disclosures. But in his blog post last week, Smith said providers should be permitted to provide more details, arguing doing so wouldn't compromise national security.
The unfinished business Smith would like to see resolved includes, in summary:
- Recognize that U.S. search warrants end at U.S. borders: The U.S. government should stop trying to force tech companies to circumvent treaties by turning over data in other countries. Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private. We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. That's why we recently went to court to challenge a search warrant seeking content held in our data center in Ireland. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed.
- End bulk collection: While Microsoft has never received an order related to bulk collection of Internet data, we believe the USA Freedom Act should be strengthened to prohibit more clearly any such orders in the future.
- Reform the FISA Court: We need to increase the transparency of the FISA Court's proceedings and rulings, and introduce the adversarial process that is the hallmark of a fair judicial system.
- Commit not to hack data centers or cables: We believe our efforts to expand encryption across our services make it much harder for any government to successfully hack data in transit or at rest. Yet more than seven months after the Washington Post first reported that the National Security Agency hacked systems outside the U.S. to access data held by Yahoo and Google, the Executive Branch remains silent about its views of this practice.
- Continue to increase transparency: Earlier this year, we won the right to publish important data on the number of national security related demands that we receive. This helped to provide a broader understanding of the overall volume of government orders. It was a good step, but we believe even more detail can be provided without undermining national security.
President Obama has put forth some recommendations, though some believe they don't go far enough and have yet to effect any major changes. If you saw the interview NBC's Brian Williams conducted with Snowden in Moscow, it's clear that, regardless of the legality of the leaks, this debate is far from over. But if you're concerned about your privacy, encrypting your data at rest and in transit is an important step moving forward.
Posted by Jeffrey Schwartz on 06/09/2014 at 1:06 PM
Asus, Dell and Hewlett-Packard are among the PC suppliers extending the boundaries of Microsoft's Windows 8.1 operating system with several new enterprise-grade hybrid PC-tablets revealed at this week's annual Computex trade show in Taipei.
Some of the devices could even offer an alternative to Microsoft's new Surface Pro 3, a device the company believes is finally suited to combine all the functions of a commercial-grade laptop and a tablet. If the new PC-tablets challenge the Surface Pro 3, that's a good thing for Microsoft and the advancement of Windows. "Surface is a reference design for Microsoft's OEM partners," said David Willis, Gartner's chief of research for mobility and communications, when I caught up with him yesterday at the Good Technology Xchange user conference in New York.
For example, the new HP Pro x2 612, launched today, has a 12.5-inch full high-definition (FHD) display that's just slightly larger than the Surface Pro 3's. HP's detachable tablet is available with either an Intel Core i3 or i5 processor with vPro hardware-based security, solid-state drives and two USB 3.0 ports. It is also available with HP's UltraSlim dock. While the Surface Pro 3 is also available with a Core i7 processor, the i3 and i5 CPUs should serve the needs of most mainstream business users, and there's nothing to say HP won't offer an i7-equipped model down the road.
The HP Pro x2 612 will get 8.25 hours of battery life, though an optional power keyboard extends that to 14 hours, the company said. (The Surface Pro 2 is available with a power keyboard, but Microsoft hasn't yet announced one for the new Surface Pro 3.) In addition to hardware-based security with vPro, HP added other security features to the new device, including HP BIOS, HP Client Security, a smart card reader, HP TPM and an optional fingerprint scanner for authentication.
HP also announced a smaller version, the HP Pro x2 410, with an 11.6-inch display and a starting price of $849 for a unit with an i3 processor, 128GB of storage and 4GB of RAM. HP didn't announce pricing for the larger HP Pro x2 612, which ships in September.
Meanwhile, Asus rolled out several new Windows devices, including the new Zenbook NX500, available with an i7 quad-core processor. It supports optional NVIDIA GeForce GTX 850M graphics with 2GB of GDDR5 video memory. The new system also includes Broadcom three-stream 802.11ac Wi-Fi and SATA 3 RAID 0 or PCIe x4 SSD storage.
Asus said the new NX500 is its first laptop with a 4K/UHD display and VisualMaster technology; the 15.6-inch IPS panel offers 3840x2160 resolution. The company did not disclose pricing or availability.
And complementing its Venue Pro 8 tablets, Dell also launched several Inspiron models including the 7000 Series 2-in-1. Due to ship in September, it also is powered by Intel's latest Core processors and comes with a 13.3-inch capacitive touchscreen display. A lower-end 11.6-inch model, the 3000 Series, is also available with a starting price of $449.
In all, Microsoft showcased 40 new Windows PCs, tablets and phones at Computex, according to OEM Corporate VP Nick Parker, who gave a keynote address earlier today. "We're delivering on our vision today with rapid delivery of enhancements in Windows, new licensing and programs for an expanded set of partners," Parker said in a blog post.
Of course, it wasn't all Windows at Computex. Intel said more than a dozen Android and Windows tablets debuted at the conference, with 130 on its radar for this year overall. And Dell revealed it will offer the Ubuntu 14.04 LTS version of Linux as an option on its new Inspiron 2-in-1 laptop-tablets.
Posted by Jeffrey Schwartz on 06/04/2014 at 2:33 PM
While Amazon Web Services (AWS) remains by far the most widely used cloud provider by enterprises, it appears Microsoft's Azure cloud service has gained significant ground over the past year since releasing its Infrastructure as a Service (IaaS) offering.
Azure was the No. 2 cloud service behind Amazon last year, according to a Redmond magazine reader survey, and that finding remained consistent this year as well. But given that Redmond readers are predisposed to using Microsoft technology, it has always remained a mystery which cloud provider is the leading alternative to Amazon in the broader IT universe.
Every major IT vendor -- including Google, IBM, Hewlett-Packard, Oracle and VMware -- and the telecommunication service providers offer enterprise public cloud services and want to expand their footprints. Many of them, notably Rackspace, AT&T, IBM and HP, are betting on OpenStack infrastructures, which, besides Amazon, is the most formidable alternative to Azure.
In the latest sign Azure is gaining ground, Gartner last week released its Magic Quadrant for IaaS providers, where only Amazon and Microsoft made the cut as leaders (a first for Microsoft in that category). Gartner published a measured assessment of Azure IaaS and all of the major cloud service providers.
"Microsoft has a vision of infrastructure and platform services that are not only leading stand-alone offerings, but that also seamlessly extend and interoperate with on-premises Microsoft infrastructure (rooted in Hyper-V, Windows Server, Active Directory and System Center) and applications, as well as Microsoft's SaaS offerings," according to Gartner's report.
"Its vision is global, and it is aggressively expanding into multiple international markets. It is second in terms of cloud IaaS market share -- albeit a distant second -- but far ahead of its smaller competitors. Microsoft has pledged to maintain AWS-comparable pricing for the general public, and Microsoft customers who sign a contract can receive their enterprise discount on the service, making it highly cost-competitive. Microsoft is also extending special pricing to Microsoft Developer Network (MSDN) subscribers."
Azure's wide variety of Platform as a Service (PaaS) features also provides a significant complement to its IaaS. Microsoft was one of two vendors described as leaders in Gartner's application PaaS (which it calls aPaaS) Magic Quadrant back in January, bested only by Salesforce.com, now a Microsoft partner.
"The IaaS and PaaS components within Microsoft Azure feel and operate like part of a unified whole, and Microsoft is making an effort to integrate them with Visual Studio, Team Foundation Server, Active Directory, System Center and PowerShell. Conversely, Windows Azure Pack offers an Azure-like user experience for on-premises infrastructure," according to Gartner. "Microsoft has built an attractive, modern, easy-to-use UI that will appeal to Windows administrators and developers. The integration with existing Microsoft tools is particularly attractive to customers who want hybrid cloud solutions."
That's a pretty glowing assessment of Azure, but Gartner also issued some warnings to customers considering Microsoft's cloud service. Notably, Gartner cautioned that Microsoft's infrastructure services are still relatively new -- just over a year old -- while Amazon has offered IaaS since 2006.
"Customers who intend to adopt Azure strategically and migrate applications over a period of two years or more (finishing in 2016 or later) can begin to deploy some workloads now, but those with a broad range of immediate enterprise needs are likely to encounter challenges," according to the Gartner report.
Gartner also warned that Microsoft faces the challenge of operating and managing Azure at cloud scale and enabling enterprises to automate their infrastructures. In addition, Microsoft is still in the early stages of building out its partner ecosystem and doesn't yet offer a software licensing marketplace, it pointed out. Despite offering some Linux services, Gartner believes Azure is still "Microsoft-centric," appealing primarily to .NET developers. That's an image Microsoft has begun working in earnest to shake. For example, Microsoft has open-sourced some of its own .NET offerings, while making Java a first-class citizen on Azure.
Microsoft has 12 datacenters worldwide supporting Azure and that number will reach at least 16 by year's end, the company said. Azure is a key component of Microsoft's hybrid cloud strategy, called Cloud OS, which is based on running multitenant instances using Windows Server, System Center, the Azure Pack (for running Azure-like operations in a private datacenter) and the public cloud.
Azure took center stage at last month's TechEd conference in Houston. It was evident in the keynote, but also in talking with folks on the show floor. "I'm seeing more rapid adoption of Azure overall," said Randy DeMeno, CommVault's chief technologist for Windows.
And speaking during a TechEd session, BlueStripe CTO Vic Nyman noted the benefits of using Azure to scale on demand. "Using Azure, and particularly Platform as a Service and Infrastructure as a Service, is a simple, elegant solution whose presentation layers, turning up and down, is an interesting trend we see."
Are you looking at Azure to scale your infrastructure?
Posted by Jeffrey Schwartz on 06/02/2014 at 12:10 PM
As Google targets everything from serving ads in your thermostat to driverless cars, machine learning and now broadband communications with its reportedly planned $1 billion investment in satellite technology, the search giant is also stepping up its less glamorous effort of developing an alternative to the everyday enterprise services offered by Microsoft.
Google has won its share of big conversions from Lotus Notes and Microsoft Exchange, but experts say the majority of enterprises moving their messaging and collaboration efforts to the cloud are going with Office 365. Now Google is looking to make the switch easier. Last week, Google said enterprises can migrate from Exchange Server to Google Apps with its cloud-based data migration service, run directly from the Admin console to the Gmail servers.
The direct migration offering replaces the Google Apps Migration for Microsoft Exchange tool, which customers had to install on their local mail servers. The new migration service also lets administrators monitor the progress of a migration. For now it works only for e-mail, with calendar migration under development. Google is rolling out the new e-mail migration service on its Gmail servers over the next two weeks.
Google said the migration service currently is suitable for the following:
- Microsoft Exchange servers that support Exchange Web Services (EWS), specifically Office 365 and Exchange Server 2007 SP1 or higher.
- IMAP servers, including Gmail, Exchange 2003 or lower, and ISPs like GoDaddy.
Google last month also made it easier to manage retention of mail and documents on Google Apps via its Google Vault service. "The options for setting or modifying a retention period -- the length of time your company's messages are archived in Google Vault -- are now more and we've added safeguards when setting a retention period for a specified number of days," Google said in a blog post last month.
Organizations using Microsoft Outlook with Google Apps can now add, manage and join Hangout video calls by downloading a plug-in to Outlook.
Posted by Jeffrey Schwartz on 06/02/2014 at 8:51 AM
Would VMware and its parent EMC be better off as one company? A report last week by two Wells Fargo analysts suggesting the two should combine was rejected by VMware CEO Pat Gelsinger. The analysts argued that combining the companies EMC controls, which, in addition to VMware, include RSA and the recently spun-out Pivotal, would make more business sense and offer more shareholder value than the plan to offer federated solutions among them.
At its annual EMC World conference earlier this month, the company launched what it calls EMC II, an effort to federate the four companies to offer software-defined datacenter solutions. Despite this new federated business model, EMC said it remains committed to letting customers choose best-of-breed solutions. Wells Fargo analysts Maynard Um and Jason Maynard issued a note suggesting that could be better accomplished by combining EMC and VMware into one company. EMC spun VMware off in 2007.
"What EMC and VMware call federated solutions is, to us, taking the next step in addressing a key trend in the market today of converged solutions," they wrote, as reported by Barron's Tiernan Ray. "Over the past few years, large OEMs such as IBM, HP, Oracle and Dell have built up or acquired a broader capability across the stack and are offering complete converged solutions rather than point products. Cooperation turned into coopetition and will likely become full-on competition -- to us, the friction is fairly evident and we expect this to continue to grow."
Pressed on the matter in an interview on CNBC's Fast Money program Tuesday during the grand opening of VMware's expanded campus in Palo Alto, Calif., Gelsinger said there are no plans to combine the two organizations.
"Simple answer, no," Gelsinger said. "It is working so well. We have this federated model where each company has their own strategic role. We're independent, we're loosely coupled and we're executing like crazy. And it's working for shareholders, our ecosystems, our employees on this beautiful campus here. This has worked and we're going to stay on this model because it's been completely successful."
Speaking at the Sanford Bernstein conference yesterday, EMC chairman and CEO Joe Tucci reiterated the strategy. "In each of these companies the missions are aligned," Tucci said, according to the Seeking Alpha transcript of his remarks. "One depends on the other, built on the other. But, again, you can take these and you can use them as a card giving customers choice, which I think is going to help to find a winner in the third platform. We're not forcing you to use our technologies. You can use Pivotal without using VMware. You can use VMware without using EMC, but when they all work together you get a special layer of magic."
Even though they're separately traded companies, EMC holds an 80 percent stake in VMware and has 97 percent control of voting. Longtime storage industry analyst John Webster wrote the companies will have to deliver the so-called "third platform" it evangelizes more affordably for the EMC II federated strategy to be successful. "EMC will have to deliver on all three aspects of its redefined journey -- inclusion, value and affordability -- if its new Federated EMC strategy is to work as promised at EMC World," Webster noted.
For VMware, it's caught in a tough spot. Microsoft's Hyper-V is gaining ground on the dominant VMware virtualization platform, and the Microsoft Azure public cloud also appears to have a strong head start over the VMware hybrid cloud service. A recent survey of Redmond magazine readers found that 21 percent of those who now use VMware as their primary virtualization platform plan to migrate to Hyper-V.
"I wouldn't want to be on EMC's board," one partner of both companies told me during a conversation at this month's Microsoft TechEd conference in Houston. The only way it appears VMware can stem the migrations to Hyper-V is by lowering its cost, experts say. "The problem is it's their 'Office,'" said one Microsoft exec during an informal discussion at TechEd.
It will be interesting to see how VMware pushes forward in the next few months leading up to its VMworld 2014 conference in late August.
Posted by Jeffrey Schwartz on 05/30/2014 at 11:41 AM
One thing that was apparent at this month's TechEd conference in Houston is that seemingly everyone is joining the Hyper-V parade. While VMware still offers the dominant virtualization platform, Hyper-V has steadily gained share in recent years, and as a result quite a few tools have appeared that offer improved support for Microsoft's hypervisor.
Among those talking up their extended Hyper-V support at TechEd were Savision, Veeam and Vision Solutions. Last year's release of Windows Server 2012 R2 and the latest release of System Center included major revisions of Hyper-V, which critics said made it suitable for large deployments. Hyper-V is also the underlying virtualization technology in the Microsoft Azure public cloud.
When VMware's ESX hypervisor first emerged, customers were willing to pay the company's licensing fees because of the savings they achieved by eliminating physical servers in favor of virtual environments. But because Hyper-V is free with Windows Server and it's now up to snuff, many IT decision makers are making the switch. Or at the very least, they are adding it to new server deployments.
"I think 2012 R2 release gave Hyper-V the momentum it needed," said Doug Hazelman, vice president of product strategy at Veeam and one of the hypervisor's early supporters. "Hyper-V is the fastest segment of our backup and recovery business." The company has added Hyper-V support in its new Veeam Management Pack v7 for System Center.
The new management pack, which already supported VMware vSphere, can now run in Microsoft's System Center Operations Manager (SCOM) to provide improved management and monitoring of the Veeam Backup and Replication platform. Hazelman said administrators can use the Veeam Management Pack for both their VMware and Hyper-V environments, and with the new release they can access it right from the SCOM console.
For its part, Vision Solutions talked up its partnership with Microsoft (inked last fall) to help organizations migrate from VMware to Hyper-V. "We definitely have seen a pretty significant update with folks getting off of VMware and moving over to Hyper-V for multi-production servers," said Tim Laplante, director of product management at Irvine, Calif.-based Vision Solutions. "That's especially true when their VMware maintenance is coming due."
The company's Double-Take Move migration tool got a good share of attention at this year's TechEd, even though Vision Solutions has offered it for a while. It's a viable alternative to Microsoft's own recently upgraded Virtual Machine Converter.
At TechEd, Savision previewed a new release of its Cloud Reporter, which will generate reports on both Hyper-V and VMware infrastructure, said lead developer Steven Dwyer. "It's capacity planning, virtual machine rightsizing for Hyper-V," Dwyer said of the new Cloud Reporter 1.7 release.
Cloud Reporter 1.7 will generate reports that show the capacity of both VMware and Hyper-V environments together, Dwyer explained. In addition, it will offer predictive analysis, which administrators can use for planning and budgeting.
Posted by Jeffrey Schwartz on 05/30/2014 at 10:38 AM
Microsoft set the bar for its new Surface Pro 3 last week when it compared the new device, designed to combine the functions of a tablet and a full-powered computer, to a MacBook Pro. At the launch event in New York last week, Panos Panay, the corporate VP for the Microsoft Surface group, put the two devices on a scale to show how the MacBook Pro weighs more. At the same time, Panay emphasized the optional Intel Core i7 processor with 8GB of RAM in the new Surface Pro 3 makes it powerful enough to run the Adobe Creative Suite, including Photoshop.
But the Surface Pro 3 also shares one of the most undesirable features of the latest crop of high-end MacBook Pros -- a factory sealed battery that isn't user replaceable. To change out the battery once the extended warranties expire, the cost is $200 for both systems. That was one of the top topics during a Reddit IAmA (Ask Me Anything) discussion Tuesday moderated by Panay.
Panay and his team fielded numerous questions about the new Surface Pro 3, a device that breaks quite a bit of ground in the full-featured laptop-tablet field. In addition to the battery-related questions, participants wanted to know when a Surface Mini is coming, how the different processor versions handle power and if Microsoft is walking away from Windows RT.
The battery issue appeared to raise some eyebrows among several participants who were seemingly sold on the device until Reddit member "Caliber" asked why it would cost $450 to replace the battery on existing Surface models. If your system is under warranty, it won't cost anything to have the battery replaced, but Panay (or someone on his team) said replacing the battery on the Surface Pro 3 after the warranty expires will cost $200.
That's still quite expensive; in fact, it's the same amount it costs to replace the battery in Apple's Retina MacBook Pro. Traditional MacBook Airs and Pros have user-replaceable batteries, but two years ago with the MacBook Retina, Apple upped the cost of the new battery, which requires a technician to replace.
So now the Surface and high-end MacBooks share that unpleasant cost, though users hopefully shouldn't need to replace them very often.
Microsoft's $200 price tag to replace the Surface Pro 3 battery should be more palatable than $450 for earlier models. Wondering if Caliber was given bad information by the Microsoft support rep, I called the Microsoft Store myself. While the rep didn't give me a price, she said it would be cheaper to replace the Surface than to send it back for a new battery (likewise with replacing the display if it cracks, she said). Most would agree $450 for a battery is off the charts, and while $200 is pretty high as well, I don't think it's a deal breaker.
Also, it's possible the cost of replacement batteries could come down in the next four-and-a-half years -- the amount of time Microsoft said it would take before anyone should notice deterioration of the battery. If the device is charged five times per day over that period, the Surface Pro 3 should still maintain 80 percent of its capacity by that point. If indeed this turns out to be the case, replacing a battery after about five years isn't too bad. At that point, many may be ready for a new system, anyway.
Fans of the Surface were also wondering when a Surface Mini will arrive, a question shared by many who expected Microsoft to launch one last week. "Please for the love of God give us some more concrete info on Surface Mini," wrote Reddit member "swanlee597." "I was really disappointed in the lack of a reveal. Is it real? Is it coming out this year? Should I just buy an OEM 8-inch Win 8 tablet instead of waiting for Surface Mini?"
As for the future of Windows RT, offered on the Surface, Surface 2 and some Nokia tablets, the Surface team said that "Windows on ARM continues to be an important part of the Windows strategy." Responding to questions about the difference in battery life among systems with the Intel Core i3, i5 and i7 processors, the company said they will offer the same battery performance. When it comes to compute performance, however, "the i7 will see benchmark scores approximately 15 to 20 percent better than the [Surface Pro 3] i5."
One other question that struck a nerve: Why isn't there an i5-based Surface Pro 3 with a 128GB SSD and 8GB of RAM? Microsoft launched one with a 128GB drive and 4GB of RAM for $999 but the next step up is $300 more for a 256GB unit with 8GB of RAM. That question remained unanswered.
Posted by Jeffrey Schwartz on 05/28/2014 at 12:06 PM
While Microsoft CEO Satya Nadella is willing to make acquisitions, he emphasized he's more focused on organic growth than making a big deal.
Taking questions at the Code Conference Tuesday, organized by the operators of the new Re/code site, Nadella was among several CEOs on the roster including Google's Sergey Brin, Intel's Brian Krzanich, Salesforce.com's Marc Benioff and Reed Hastings from Netflix. When asked what companies Nadella would like Microsoft to buy, he didn't tip his hand.
"I think we have to build something big," Nadella said. "If along the way we have to buy things, that's fine. But we have to build something big. We've built three big things, three and a half if [we] add Xbox into it. It's time for us to build the next big thing." The focus is on building new platforms and software for productivity, he said.
In a preview of one major effort along those lines, Nadella, joined on stage by Corporate VP Gurdeep Pall, demonstrated the new Skype Translator, which aims to provide real-time language translation. Pall, who leads the Lync and Skype organization, showed how the Skype Translator can enable him to have a conversation with a colleague who only speaks German.
"It's brain-like in the sense of its capabilities," Nadella said. "It's going to make sure you can communicate to anybody without language barriers." In a blog post Tuesday, Pall said Skype Translator is the result of decades of work and joint development by the Skype and Microsoft Translator teams. The demonstration showed near-real-time voice translation from English to German and vice versa. Pall said it combines Skype and instant messaging technology with Microsoft Translator and neural network-based speech recognition.
"We've invested in speech recognition, automatic translation and machine learning technologies for more than a decade, and now they're emerging as important components in this more personal computing era," Pall noted, adding Microsoft will make Skype Translator available as a Windows 8 beta app by year's end. Microsoft also released this post on the research initiative.
Of course while the Skype Translator may represent years of development, Microsoft did acquire Skype for $8.5 billion. Presumably that's what Nadella meant when he said the company may have to buy things along the way. Fortunately, the company has plenty of cash if it needs to fill in where needed.
Posted by Jeffrey Schwartz on 05/28/2014 at 2:10 PM
With the slew of announcements at TechEd last week, Microsoft's new RemoteApp was perhaps one of the most noteworthy. It certainly is something IT managers looking to offer secure remote applications or remote desktop services should consider.
Microsoft put a decisive stake in the ground with the preview of Azure RemoteApp, which uses the company's huge global cloud service to project data and applications to most major device types and Windows PCs, while keeping the app and the data in the cloud. Or, if you prefer, a hybrid version lets organizations run the apps and data on-premises and use RemoteApp to distribute them and provide compute services.
I say Microsoft is taking a different approach in that Azure RemoteApp is not a desktop-as-a-service (DaaS) offering similar to Amazon WorkSpaces or VMware's Horizon. "We definitely see value in providing full desktop but at this point in time we went after the remote application model because that's what a lot of customers said they really wanted once they started working with this," said Klaas Langhout, principal director of program management for Microsoft's remote desktop team. Langhout demonstrated and let a handful of tech journalists test Azure RemoteApp at a workshop in Redmond earlier this month (just days in advance of the TechEd unveiling).
Microsoft released the preview of Azure RemoteApp last week. Organizations are currently permitted to use it with 20 users but it's scalable for much larger implementations, Langhout said. The preview I got to play with had the Microsoft Office apps but Microsoft said the complete service will support any application that can run on Windows Server 2012 R2. Langhout said other versions of Windows Server are under consideration, but the decision to only support the latest version was because "we need to look at this from an application compatibility standpoint," he said.
Azure RemoteApp is intended as an alternative to providing Remote Desktop Services (RDS) in an enterprise datacenter, which requires hardware, storage and network infrastructure to quickly onboard employees who need a set of applications and access to data. It's also an alternative to Microsoft's App-V and VDI services. Microsoft may incorporate App-V in future offerings or Azure RemoteApp, though that's only under consideration for now.
"We want to provide these applications on any device anywhere while serving it from a multi-datacenter, highly scaled elastic cloud, which allows a very resilient compute fabric to provide these applications no matter where the end user is, and this is extremely fault tolerant," Langhout explained when describing the goal of Azure RemoteApp. "We also want the customer deploying this to be able to set this up without a large capital expense, no purchase order for a lot of servers to be deployed, no setup required for the management side of the infrastructure."
The management burden is removed from the perspective of not having to manage the infrastructure, discrete role services, licenses or RDS. As long as the remote or mobile user has a network connection, the application is projected via RDS to the endpoint device, which can include Windows 7, Windows 8, iOS, Mac OS X and Android. Windows RT support will be added to the preview in a month or so.
RemoteApp should also appeal to organizations concerned about protecting data, since the applications and data are never persistent on the device. In addition to protecting against data loss from the user perspective, Langhout said it also protects against denial-of-service and other attacks. It doesn't require any existing infrastructure, including Active Directory, though a user needs either a Windows Live ID or an Azure Active Directory account. A user logs into the Azure portal and selects the RemoteApp service, and then the administrator can select from the gallery image which applications to deploy for users. A hybrid deployment of Azure RemoteApp does require Active Directory on-premises as well as a virtual network.
Microsoft's deliberate move to emphasize a remote application service versus DaaS suggests the company is not concerned about Amazon WorkSpaces or VMware Horizon. That's because Microsoft believes Azure RemoteApp is a better approach to desktop virtualization. "For a lot of scenarios, especially BYOD, they really don't want the Windows shell impeding with the usage of the application," Langhout said. "On iPad, I don't want to go to the Start menu, I just want to get to the application. As long as you can make it seamless to get to the applications, the Windows shell is not as necessary."
The company doesn't have an official delivery date for the service but Langhout indicated his group is shooting for the second half of this year. Microsoft hasn't determined specifics such as the pricing and subscription model.
If you have tested the preview over the past two weeks or you have comments on Azure RemoteApp, feel free to comment below or drop me an email at firstname.lastname@example.org. Is it a better alternative to DaaS?
Posted by Jeffrey Schwartz on 05/22/2014 at 1:18 PM
A few weeks ago, Microsoft sent out a press invite for "a small gathering" for news from the Surface team. It wasn't a stretch to presume Microsoft was planning to roll out a Surface "mini" to compete with the slew of 8-inch tablets based on Android, Windows 8.1 and of course the iPad Mini.
The lack of a Surface in that form factor represents a key gap in Microsoft's effort to make Windows a mainstream tablet platform. Analysts say small tablets account for half of all tablets sold. As we now know, there was no Surface "mini." Instead, Microsoft took the wraps off the Surface Pro 3. For IT pros and everyday workers who use both a tablet and PC, Microsoft may have broken new ground with the new Surface Pro because it promises to combine the two, as I told New York Post reporter Kaja Whitehouse yesterday.
Mobile industry analyst Jack Gold said in a research note yesterday that Microsoft made the right move in putting the mini it reportedly had in the queue on hold. "Microsoft finally seems to understand it cannot go head to head with Apple's iPad, and must offer a superior business device leveraging its installed base of infrastructure and applications, in particular the full Office suite," he wrote.
So was Microsoft's invitation a ruse? The prevailing thinking is it wasn't. It appears that when the invites went out two weeks ago, Microsoft was planning a mini, but Microsoft CEO Satya Nadella decided the device wasn't distinct enough from other small tablets, according to a Bloomberg report. Microsoft would not comment, but according to Bloomberg, the company had a mini planned that was powered by an ARM-based Qualcomm processor running the Windows RT OS.
Another possible reason is that Microsoft realized a mini without the touch-enhanced "Gemini" Office apps in development wouldn't make sense, noted All About Microsoft's Mary Jo Foley. Microsoft doesn't appear to have abandoned a mini, and it would be a lost opportunity if it can't get one into the market before this year's holiday season. The only question is whether it will be Windows RT-based, Windows 8.1-based (like the Surface Pro) or both.
Posted by Jeffrey Schwartz on 05/21/2014 at 10:54 AM
If you were hoping that Microsoft was planning on launching the rumored Surface "mini" today, you'll have to wait another day. Instead, the company announced the Surface Pro 3, which appears to address key issues of the previous two versions. Microsoft debuted the Surface Pro 3 today at a press event in New York.
If you buy the notion that Microsoft's third attempt at a new product typically is when it lands a hit, the Surface Pro 3 at first glance should be in the queue to maintain that track record. The Surface Pro 3 is remarkably thinner and lighter than the Surface Pro 2 at just 0.36 inches and weighing 1.76 pounds. While the Surface Pro 2 only comes with an Intel Core i5 processor, the Surface Pro 3 is available with different Intel Core processors: i3, i5 and i7.
The new 12-inch device is slightly bigger than its 10.6-inch predecessor but at the same time has the feel of a full-sized laptop. In fact Panos Panay, Microsoft's corporate VP for Surface, described the Surface Pro 3 as "the tablet that can replace your laptop," during today's introduction.
I'll give you my take after I spend more time with the system, but at first glance it feels marginally heavier than the Surface 2 -- which runs Windows RT 8.1 -- but significantly lighter than the Surface Pro 2, and it is much easier on the eyes. At any rate, Microsoft positioned today's launch as a major event. "Our goal is to create new categories and spark new demand for our ecosystem," said Satya Nadella, who made brief introductory remarks at today's event. "Today is a major milestone in that journey."
The 12-inch ClearType HD display offers a much higher resolution: 2160x1440 and a 3:2 aspect ratio. It supports up to 8GB of RAM and the company said it can get up to nine hours of battery life. While offering a true tablet that could replace a laptop has been Microsoft's goal from the outset for the Surface team, there are a number of reasons that may now be the case, or at least why the company has gotten much closer, with the new model.
The systems are enterprise-ready in that a power user can now get a configuration with a powerful i7 Core processor, up to 8GB of RAM and a 512GB solid state drive. For those with more moderate needs, an Intel Core i3 processor is also available with as little as 64GB of storage. Price will of course vary on configuration but it starts at $799.
Microsoft also addressed a couple of key pet peeves of many Surface users, notably the difficulty of using it on your lap. The new keyboard that clicks into the Surface Pro 3 was designed so it will not wobble on your lap, its track pad is improved, and the keyboard can be adjusted to any position.
Like the Surface Pro 2, the new unit is available with an optional docking station and a new aluminum Surface Pro Pen designed specifically for this device. The Surface Pro Pen offers 256 levels of pressure sensitivity.
Emphasizing that the Surface Pro 3 is targeted at commercial use, the company said it is working with some key ISVs, including SAP, Dassault Systèmes and Adobe. Panay invited Michael Gough, Adobe's VP of Experience Design, on stage to reveal plans to optimize Adobe Photoshop CC for touch on the new Surface Pro 3.
"It's a creator's dream come true," Gough said. "It's really, really easy to interact with the screen, the pen input is natural, the performance is great."
One key thing missing from the accessories lineup was a power keyboard like the one offered with the Surface Pro 2, and company officials wouldn't say if one is coming. Also, while Microsoft has made key strides in getting popular software vendors to offer their wares in the Windows Store, it still lags behind the Apple iTunes App Store and Google Play.
Microsoft will start accepting preorders tomorrow and the devices are slated to ship June 20.
Posted by Jeffrey Schwartz on 05/20/2014 at 1:44 PM
It wasn't the most prominent topic at this week's TechEd conference in Houston, but PowerShell certainly wasn't left in the dust either. I caught up with Don Jones, author of Redmond magazine's Decision Maker column, after his TechEd session: "A Practical Overview of Desired State Configuration."
We met up at The Scripting Guys booth, where Don was signing copies of the book he coauthored with Jeffery Hicks: "Learn Windows PowerShell in a Month of Lunches" (Manning 2012). The book sold out at TechEd and Don was inundated with questions ranging from specific scripting practices to IT management issues.
One such question was about how to deal with individuals who aren't on board with the new techniques PowerShell 4.0 introduced, such as Desired State Configuration (DSC). Don explained in one of his Decision Maker columns last year that DSC can be disruptive.
"With DSC, you use PowerShell to write very simple (or complex, if you like) declarative scripts," he explained in the column. "Script isn't really the proper word, as there's no actual programming. You're really just building an exaggerated INI-esque file, where you specify configuration items that must or must not be present on a computer."
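To make that concrete, here's a minimal sketch of the kind of declarative DSC configuration Jones describes. The node name, feature and file contents below are hypothetical, but the `Configuration`/`Node` structure and resource syntax follow PowerShell 4.0's DSC conventions:

```powershell
# Hypothetical DSC configuration: you declare what must be present
# on a node rather than scripting the steps to get there.
Configuration WebBaseline {
    Node 'SERVER01' {
        # The IIS role must be installed
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
        # A default page must exist with these contents
        File DefaultPage {
            Ensure          = 'Present'
            DestinationPath = 'C:\inetpub\wwwroot\index.htm'
            Contents        = '<h1>Configured by DSC</h1>'
        }
    }
}

# Running the configuration compiles a MOF file, which the
# Local Configuration Manager then enacts on the target node:
WebBaseline -OutputPath 'C:\DSC'
# Start-DscConfiguration -Path 'C:\DSC' -Wait -Verbose
```

As the column notes, there's no real programming here; the configuration reads like an exaggerated INI file of items that must or must not be present.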
If you weren't at TechEd, you can view his session on Microsoft's Channel 9. If you want to spend some quality time with Don on the topic, he'll be giving a crash course on PowerShell at our TechMentor conference up in Redmond on the Microsoft campus. Also at TechMentor, which like Redmond magazine is produced by 1105 Media, Don will give a talk on DSC. The conference takes place Aug. 11-15.
Posted by Jeffrey Schwartz on 05/16/2014 at 12:36 PM
Rackspace, the largest independent cloud hosting provider, said that multiple bidders have expressed interest in acquiring it. As a result, Rackspace has made it official that it's looking to sell. The company has hired the investment bank Morgan Stanley to evaluate proposals and the company's options.
In a filing late Thursday with the SEC, Rackspace revealed the move, saying it has "been approached by multiple parties who have expressed interest in exploring a strategic relationship with Rackspace, ranging from partnership to acquisition." Rackspace's future has been in question since CEO Lanham Napier stepped down in February. At the time, I wondered if Rackspace would put itself on the market. The board said it is looking for a new CEO but the company has yet to name one.
While Rackspace is profitable, it's being squeezed by larger players such as Amazon Web Services, Microsoft and Google. Rackspace these days is best known for stewarding OpenStack, the open source cloud compute, storage and networking platform -- a move aimed at providing an interoperable cloud and an alternative to market leader Amazon.
Many key cloud providers support OpenStack including IBM, Hewlett Packard and AT&T, as well as many smaller providers. OpenStack is also working its way into Linux servers, making it the cloud operating system of choice for those users.
But the three largest cloud providers -- Amazon, Microsoft and Google -- don't support OpenStack, though orchestration tools such as Puppet, Chef and numerous other third-party offerings enable some levels of interoperability.
The Rackspace cloud servers and storage are, of course, now OpenStack-based. Rackspace made a big strategic bet when it teamed up with NASA over four years ago to codevelop the OpenStack code with the open source community. The transition was costly, and the company's stock is down 50 percent over the past year -- though shares jumped 20 percent this morning on the news. It's ironic that the company made the filing just as the semi-annual OpenStack Summit took place in Atlanta this week.
Rackspace also has a formidable SharePoint and Exchange hosting service. I can't help but ponder if Microsoft is one of the interested parties, as outlandish as that may sound to some. There's good reason to laugh off Microsoft having any interest in Rackspace. The Microsoft Azure cloud service already has 12 global datacenters online and has four more in the queue for this year. Microsoft Azure is part of the Cloud OS, largely based on Windows Server and Hyper-V.
Rackspace, by comparison, runs an open source infrastructure. And even though it supports Windows Server and Hyper-V, it's a whole different platform. On the other hand, several people in the OpenStack Foundation have lauded Microsoft for making meaningful contributions and participating in activities. But if Microsoft wanted to have a companion network of datacenters based on OpenStack, Rackspace would be an interesting play.
To be sure, this would be a surprising and likely disruptive move. Based on what I saw at TechEd this week, Microsoft has made it clear it's going to put all of its resources into Azure. Unless those inside the company see OpenStack as a viable threat to Azure's future as a dominant enterprise public cloud, buying anything but the company's SharePoint and Exchange hosting service would be a major departure for Microsoft.
There are likely other interested parties. IBM reportedly was once seriously interested in Rackspace before acquiring SoftLayer for $2 billion. Perhaps IBM has renewed its interest in Rackspace, though a counterargument is that Big Blue is emphasizing higher margin services-based offerings. Though I have no insights as to which companies have expressed interest, here are some possibilities, other than IBM:
- Hewlett Packard: Like Rackspace, HP has made a major commitment to OpenStack for both its public and private cloud offerings. The company certainly has the resources to build out its own global footprint since it's a major provider of server, storage and network gear. In other words, it doesn't need Rackspace for its footprint but rather its brand and customer base.
- Cisco: The networking giant recently announced its $1 billion "Intercloud" effort. It is also a significant contributor to the OpenStack community, and perhaps Rackspace could provide the glue for its Intercloud. This doesn't sound like a move CEO John Chambers would make, as he's trying to divest groups that aren't core. Given mixed results with WebEx, another service it acquired, picking up Rackspace may not be a natural fit for Cisco.
- Google: Another unlikely player, since it has shown no interest in OpenStack, but Google has lots of money to spend and has made more surprising moves in the past. Also, if it had misgivings about passing on OpenStack, this would be an easy way to get on board.
- AT&T: Perhaps the telecommunications giant wants to follow in Verizon's footsteps (Verizon bought Terremark a few years ago).
- Verizon: Even though Verizon has Terremark, like Google, it hasn't jumped on the OpenStack bandwagon.
- VMware/EMC: VMware has not totally given OpenStack a pass (having bought Nicira), but the VMware Hybrid Cloud service is targeting shops running its own virtualization infrastructure.
- Red Hat: Could the open source software company, which claims to be the largest OpenStack contributor, decide to become a service provider too?
- Other possibilities: One can't rule out some other companies with deep pockets (or access to capital) such as SAP, Salesforce.com, Dell and Oracle. But I'd say these are all likely longshots.
To be sure, Rackspace said in its filing that it could also go the partnership route or other alternatives. Rackspace has given no timetable for making any type of move, indicating it was just exploring its options. That said, we all know how these things usually work out.
Posted by Jeffrey Schwartz on 05/16/2014 at 11:52 AM
Not long after Microsoft released Hyper-V Recovery Manager, its tool for disaster recovery, the company is now giving it a new name: Microsoft Azure Site Recovery. But this is much more than a cosmetic change. Microsoft is stepping up its effort to make Azure your hot site for data recovery.
Released just a few months ago, Hyper-V Recovery Manager is designed to protect important workloads and applications by replicating them and making them available for recovery. The company announced the rebranding Monday at its annual TechEd conference in Houston. Microsoft Azure Site Recovery, available next month, extends the notion of using a secondary data center to replicate your site to using its Azure public cloud service.
"What if you don't have a secondary location?" asked Matt McSpirit, a Microsoft technical product manager, during Monday's opening keynote. "Microsoft Azure Site Recovery, [provides] replication and recovery of your on-premises private clouds into the Microsoft Azure data centers."
As noted Monday, Microsoft also announced plans to release Azure Files, which will let organizations moving their virtual machines to Azure use an SMB storage head as a shared store. Microsoft describes Azure Files as file sharing as a service. It's a platform-as-a-service offering in which administrators can configure their apps in Azure to access shared files without the shares having to be managed explicitly.
Posted by Jeffrey Schwartz on 05/15/2014 at 12:51 PM
If you're attending Microsoft's annual TechEd conference in Houston or watching the keynote and sessions on Channel 9, it's hard to escape hearing some of the many upgrades to the Azure cloud service that were made this week. Microsoft is emphasizing the newly added resiliency of its infrastructure, which now has 12 global datacenters in service with four more scheduled to go online by year's end.
As I noted in passing on Monday, Microsoft announced the general availability of ExpressRoute, which lets organizations connect their datacenters directly to Azure without using a public Internet connection. The service is based on MPLS connectivity from carriers and colocation operators including AT&T, BT, Equinix, Level 3, SingTel, TelecityGroup, Verizon and Zadara Storage.
"You can just add us to your existing MPLS contract or your MPLS WAN and we're also redundant," said Brad Anderson, corporate VP for Microsoft's server and tools business. "Literally, we provision two circuits with every single connection so you have that redundancy for this high connection, dedicated pipe that you have."
Letting organizations use carrier-grade connections via ExpressRoute is likely to make Azure more appealing to enterprise users who don't want to use slower and less reliable Internet connections, experts say. That's especially the case when reliability is critical and for organizations moving large amounts of data that require high bandwidth links.
Microsoft's new Azure Import/Export service, which the company this week announced is generally available, lets organizations move extensive amounts of data in and out of storage blobs. The service can leverage ExpressRoute. Enterprises building hybrid clouds may also require more extensive network connections. Microsoft released several new network features to support those requirements.
Azure Virtual Network now supports multiple site-to-site VPN connections, Microsoft announced; until now it allowed only a single connection. The new vnet-to-vnet option lets organizations connect multiple virtual networks. Microsoft said this is well-suited for disaster recovery, notably when using SQL Server 2014's new AlwaysOn feature.
Azure users can now reserve public IP addresses and use them as virtual IP addresses with Microsoft's new IP Reservation option. Microsoft noted this is important for applications requiring static public Internet IP addresses, or for swapping reserved IP addresses to update apps.
Microsoft's new Azure Traffic Manager is also now generally available. It supports both Azure and external endpoints for applications requiring high availability. And Microsoft said that two new compute-intensive virtual machine instances -- A8 and A9 -- are now available to support faster processors and links, more virtual cores and larger memory.
Posted by Jeffrey Schwartz on 05/14/2014 at 11:56 AM
Microsoft has extended security for its Azure cloud service, will launch a new antimalware agent and will add encryption for its Office 365 service. The company talked up its enhanced cloud security offerings at this week's TechEd conference in Houston.
The new antimalware agent, released to preview, is available for both Microsoft's cloud services and virtual machines. Microsoft also announced partnerships with Symantec and Trend Micro, whose antimalware offerings will also be available in Azure.
"You can use these antimalware capabilities to protect your VMs as well as protect the Azure applications that you're building," said Brad Anderson, corporate VP for Microsoft's server and tools business, during Monday's keynote address. Anderson also said Microsoft would offer encrypted storage for Office 365, SharePoint Online storage and OneDrive for Business.
"What this provides is the ability to actually have every single file that is stored in OneDrive for business encrypted with its own key," Anderson said.
Trend Micro said its Deep Security and SecureCloud offerings will provide threat- and data-protection security controls for virtual machines deployed in Microsoft Azure. The controls include antimalware, intrusion detection, threat prevention and encryption, along with centralized, automated policy management.
In addition, Trend Micro said it will offer its PortalProtect data protection solution for organizations migrating or sharing SharePoint workloads with Azure, as well as a Microsoft agent extension: customers can choose Deep Security as a security extension when configuring a VM in Azure. Trend Micro also said customers can use PowerShell extensions when implementing Deep Security, SecureCloud and PortalProtect for Azure VMs and SharePoint workloads.
Posted by Jeffrey Schwartz on 05/13/2014 at 10:55 AM
Microsoft kicked off its annual TechEd conference today and underscored its "cloud first, mobile first" mantra by debuting key new wares aimed at advancing the company's effort to deliver access to data and applications to users anywhere and on any device.
Though Microsoft didn't reveal plans for new releases of Windows Server or System Center, nor was it expected to, the company is using this week's event in Houston to emphasize the role its Azure cloud service and Active Directory can play to deliver secure enterprise infrastructure to all forms of mobile devices. Microsoft officials emphasized that these new tools for IT pros and developers will let organizations house their data on-premises, in the public cloud or in a hybrid scenario that will combine the two.
The company's announcements today included Azure RemoteApp and the general availability of ExpressRoute and Azure Files, among a slew of others. I'll be drilling into many of the new offerings in the coming days and weeks.
Corporate VP Brad Anderson kicked off TechEd with the opening keynote today, saying cloud and mobile go hand in hand. "You cannot have a cloud without a connected device," Anderson said. "As you think about the connected devices, without that cloud, all you have is potential that goes with it." He added: "The amount of information that will be at our fingertips will be amazing."
Key to putting that information at users' fingertips will be empowering IT to protect enterprise data, while at the same time giving users the access to information from any device. Microsoft's new Azure RemoteApp will let IT deliver Windows applications to almost any device including Windows tablets, Macs, iPads and Android tablets.
The preview of Microsoft Azure RemoteApp is now available to organizations that want to let up to 20 users test the app. Because this is a preview, the company has not determined its pricing and subscription model. Azure RemoteApp will deliver applications, initially Office in the preview, via Microsoft's Remote Desktop Services. The service is designed to let organizations keep data centrally located and will support up to 1,300 Windows-based applications.
Microsoft hasn't committed to when it will release the offering, but the company is targeting the end of the year. "Every organization I talk to has a very large inventory of Windows applications they're looking to deliver to mobile devices," Anderson said. "With Azure RemoteApp, users can scale up and down, so their capital expenditures go down dramatically."
Anderson also made clear that Microsoft intends to be aggressive on the mobile device management front. Microsoft's recently announced Enterprise Mobility Suite will cost $4 per user per month regardless of the number of devices supported, Anderson announced.
The company also announced that administrators will be able to manage Office for iPad, iPhone and Android devices. This Windows Intune component of the Enterprise Mobility Suite will let administrators manage line-of-business apps running on Android and iOS.
Looking to help IT organizations reduce VM sprawl, Microsoft unveiled Azure Files. Based on the standard SMB 2.1 protocol, Azure Files runs on top of Azure storage and allows for shared readers and writers. It will also work on-premises, allowing users to access their storage accounts without having to spin up a virtual machine and manage an SMB share. In short, Azure Files will let IT organizations create single file shares available from multiple virtual machines.
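Because Azure Files exposes a standard SMB endpoint, mounting a share from a Windows machine follows the familiar pattern. This is a rough sketch only; the storage account name, share name and key below are placeholders, and details may differ in the preview:

```powershell
# Placeholder account, share and key -- substitute your own values.
# Persist the storage account credentials so the mapping survives reboots:
cmdkey /add:mystorageacct.file.core.windows.net /user:mystorageacct /pass:<storage-account-key>

# Map the Azure Files share like any other SMB file share:
net use Z: \\mystorageacct.file.core.windows.net\myshare
```

Once mapped, multiple virtual machines can read and write the same share without any one of them having to host an SMB server.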
Microsoft's ExpressRoute, which lets organizations connect their datacenters directly to Azure without using a public Internet connection, is generally available. The service is based on MPLS connectivity from carriers including AT&T, Verizon and Level 3, among others, as well as through colocation provider Equinix. This will appeal to those who want reliable, faster and inherently more secure connectivity, and Microsoft talked up this capability for those who want to use Azure for disaster recovery and business continuity.
Posted by Jeffrey Schwartz on 05/12/2014 at 10:58 AM
Now that Nokia's handset business is part of Microsoft, it'll be interesting to see what compelling features come from the new devices and services group besides Cortana, the recently introduced voice-activated personal assistant. One improvement Microsoft might want to put on the fast track is its approach to encryption with Windows Phone.
The suggestion comes from a reader, who responded to my post a few weeks ago about Microsoft's then-pending, and now completed, acquisition of the Nokia handset business.
The reader had recently switched from an iPhone to a Nokia Lumia 521, which he described as a "very capable utility smartphone." However, he quickly discovered that, unlike on the iPhone, Windows Phone 8.1 BitLocker encryption is not automatically enabled on an unmanaged device when a screen-lock passcode is created.
According to a Microsoft Channel 9 video (about 10 minutes in), Windows Phone 8 devices aren't encrypted at all until Exchange ActiveSync (EAS) is activated. The reader asked how to activate the built-in BitLocker encryption on a WP8 handset, and how to create arbitrary-length alphanumeric passcodes, without having to use EAS or mobile device management (MDM). In short, he can't -- at least not now.
That was something he concluded after seeing the Channel 9 video and reading the Microsoft documentation regarding the BitLocker encryption and how it's built into every Windows Phone. The problem, he argues, is that Microsoft is avoiding the issue. He pointed out that his iPhone offered "on-the-fly device and file encryption as soon as one creates a screen lock password." This is also confirmed by Apple in its documentation (see pages 8-13).
Wondering if there's perhaps some undocumented workaround or if this will be addressed at a later date, I shared the reader's criticism with Microsoft. A company spokeswoman said the behavior observed by the customer is consistent with the design of Windows Phone 8/8.1. "Device encryption can only be invoked on devices using remotely provisioned management policy (via EAS or a MDM)," a Microsoft spokeswoman confirmed.
To protect personal information on a Windows Phone, Microsoft said users should set up a numeric PIN code. If the phone is lost, stolen or a malicious user attempts to brute force their way into the device, the device will automatically be wiped. To prevent attacks on the Windows Phone storage, Microsoft said it offers a few different solutions. First, when the phone is attached to a PC using USB, access to the data is gated based on successful entry of the user's PIN. Second, Microsoft said an offline attack affecting physical removable storage is addressed by fixing storage media to the device itself. Finally, users can register their Windows Phone devices which will enable them to locate, ring, lock or even erase the device when the phone is lost or stolen, Microsoft said.
Nevertheless, Microsoft is apparently taking this reader's suggestion to heart. "We will consider providing a means to enable device encryption on unmanaged devices for a future release of Windows Phone," the spokeswoman said. "In the meantime there are a series of effective security mechanisms in place to protect your data."
Is this a showstopper for you?
Posted by Jeffrey Schwartz on 05/09/2014 at 12:11 PM
It looks like Microsoft is set to launch its long-anticipated Surface "mini" this month. Microsoft is holding what it described in an invite to media as "a small gathering" on May 20 in New York. While the invitation didn't offer much detail other than the time and place, it indicated it was a private press event regarding the Surface line of hybrid tablet PCs.
Given the subtle hint in the title of its invitation and the fact a small tablet is one of the fastest-growing device types these days, it's a reasonable assumption the company is finally filling this gaping hole in its Surface line. Also suggesting this will be a major launch, Microsoft CEO Satya Nadella will preside over the event, reported Mary Jo Foley in her All About Microsoft blog.
The big question is whether Microsoft will offer an ARM-based Surface like the Surface 2 that runs Windows 8.1 RT, or one powered by an Intel processor running Windows 8.1 Pro. Or perhaps we'll see both? The other key question is size: Will it come with a 7- or 8-inch screen? I'd bet on the latter.
If Microsoft wants its Surface line to succeed, it needs a mini in the lineup. Just ask Apple, which reluctantly released its iPad Mini nearly two years ago, not to mention all the Android-based tablets in that form factor.
Price will also be key. With a slew of sub-$300 mini Android and Windows tablets, cost will be critical. Yet I'd be surprised if it's less than the iPad Mini's $329 price tag and if other features are tacked on, it could cost more. It's also a reasonable bet that Microsoft won't undercut its OEM partners.
Posted on 05/07/2014 at 9:57 AM
Did Microsoft blink? That might be the first reaction upon learning of the company's decision to include Windows XP in its fix for one of the most prominent zero-day vulnerabilities in Internet Explorer in recent memory.
Microsoft could have stuck to its guns by saying it's no longer patching Windows XP and customers are on their own to either upgrade to a newer operating system or seek costlier assistance. The company had long stated that it would stop issuing patches and updates for Windows XP, which it did on April 8. But the fact that this vulnerability -- revealed earlier this week by security firm FireEye -- is so significant, and that some attackers have already exploited it against companies in the financial services industry, necessitated a swift decision by Microsoft.
This vulnerability affected all versions of Internet Explorer running on all releases of Windows, including those running on embedded systems, except for users who configured their browsers in Enhanced Protected Mode. The flaw let attackers take advantage of a memory corruption vulnerability in the browser. It was used to deliver a "newer version of the years-old Pirpi RAT to compromised, victim systems by taking control of their browsers, and in turn, their systems and networks," said Kurt Baumgartner, a researcher at Kaspersky Lab, in a blog post.
While Adrienne Hall, general manager of Microsoft's Trustworthy Computing group, said in a blog post that the flaw resulted in a limited number of attacks and fears were overblown, Baumgartner suggested the threat of wider attacks was real. "Once the update and code is analyzed, it can easily be delivered into waiting mass exploitation cybercrime networks," Baumgartner warned. "Run Windows Update if you are using a Windows system, and cheers to Microsoft response for delivering this patch to their massive user base quickly."
Indeed, Microsoft acted quickly and decisively, but Hall warned that Windows XP users shouldn't be lulled into complacency by yesterday's release of a patch for Internet Explorer running on Windows XP. "Just because this update is out now doesn't mean you should stop thinking about getting off Windows XP and moving to a newer version of Windows and the latest version of Internet Explorer," she warned. "Our modern operating systems provide more safety and security than ever before."
Posted by Jeffrey Schwartz on 05/02/2014 at 12:17 PM
Equinix, one of the largest datacenter colocation and hosting operators, is rolling out an exchange that will link its facilities to multiple cloud service providers.
The new Equinix Cloud Exchange, launched Wednesday, aspires to create a global cloud interconnection network much like Cisco recently announced with its $1 billion Intercloud effort. Just like Cisco, the Equinix Cloud Exchange is initially available in selected areas. The selection Equinix is starting with, nevertheless, is not trivial.
It initially connects to Amazon Web Services and Microsoft Azure in 13 markets worldwide including Silicon Valley, Washington D.C., New York, Toronto, Seattle, Los Angeles, Atlanta, Dallas, Chicago, London, Amsterdam, Frankfurt and Paris. Six additional locations will be added to the network by the end of this year.
Connections to Amazon and Azure are currently limited to Silicon Valley, Washington, D.C. and London with the rest going online later in the year. The company indicated other cloud services will be added in the future. Through a portal and a set of APIs, customers can move workloads among multiple cloud environments. The portal and APIs let administrators allocate, monitor and provision virtual circuits in near real time, Equinix said.
Equinix already offers Amazon customers connections via its AWS Direct Connect offering, and it added Microsoft to the roster last week, saying it would make Microsoft Azure ExpressRoute available in 16 markets worldwide.
Posted by Jeffrey Schwartz on 05/01/2014 at 10:48 AM
When Microsoft released its Office app for iPad users last month, the company left out one key feature: the ability to print files. The company fixed that yesterday with an updated version of the respective Office apps. But if you have an older printer, you may be out of luck. At the very least you'll have to find a workaround without AirPrint, Apple's universal print driver for iOS.
Microsoft said that adding the ability to print Office files from the iPad was the No. 1 request among the 12 million customers who have downloaded the new apps, though many wonder why the feature was left out in the first place. The update, available in Apple's iTunes App Store, lets Office 365 subscribers with at least the $6.99-per-month Personal subscription, which recently went live, print their documents. You must update each Office app (Word, Excel and PowerPoint) separately.
The update takes a few minutes if you have a good wireless connection. Once you open a document, spreadsheet or presentation, all you need to do is touch the File icon, Print and Select Printer. Upon doing so, I quickly discovered it couldn't find either of my two printers, which both support Wi-Fi. The error message read "No AirPrint Printers Found," as seen in Figure 1.
Both printers are at least five years old and when I called the manufacturer, Brother, the technician said only newer printers have firmware that support AirPrint. If you have an enterprise printer that supports firmware upgrades you might have more success. If you have a printer that can't support AirPrint firmware upgrades, printing files will be difficult (if not impossible).
Apple created AirPrint as an alternative to requiring printer vendors to develop drivers. According to a Microsoft spokeswoman, AirPrint works "with thousands of printers." To see if your printer supports AirPrint, Apple posted a list and other tips. For its part, Brother offers its own iPad printing tool called iPrint & Scan -- though you can't print Office documents from it directly either. However if you use Brother's iPrint & Scan Web interface (an app also in the Apple App Store) and log into OneDrive, you can open your file and print it. It's not the most elegant approach but it works.
The new Office apps for the iPad have some other added features, including AutoFit for Excel, which lets users adjust the height of multiple rows or the width of multiple columns simultaneously in a spreadsheet. This feature was designed to let users spruce up the appearance of their spreadsheets and ensure that no content is hidden. For PowerPoint users, Microsoft added SmartGuides, which help users align pictures, shapes and textboxes when moved around on a slide. The app update also includes some bug fixes.
Posted by Jeffrey Schwartz on 04/30/2014 at 1:06 PM
Now that Microsoft has reassured customers that it will continue to offer new releases of SharePoint for on-premises implementations, deployments of SharePoint 2013 are on the rise. That's the assessment of several experts, including Metalogix CEO Steven Murphy. "They did an excellent job of clarifying their position that there are in fact two worlds -- on prem and in the cloud," Murphy said. "They will be maintaining both and that's a huge clarification."
The uptick isn't off the charts, he acknowledged, but Metalogix says it's seeing increased demand for its migration and data protection tools for SharePoint. Metalogix this week debuted a new release of its Content Matrix migration tool. Murphy said the new Content Matrix 7 offers substantially improved performance, plus more fine-grained permissions that let IT delegate certain functions -- reorganizing content, handling bulk metadata, and copying and moving content -- to trusted business users. It also supports moving SharePoint legacy code to SharePoint 2013 and provides Office 365 support for hybrid implementations.
LogMeIn Upgrades Join.me for Enterprise IT
The popular Join.me is best known as the free screen-sharing tool that many IT administrators use to troubleshoot remote PCs, as well as for conducting online meetings. Now its parent company, LogMeIn, is offering an enterprise version, which the company said will be easier to manage, customize and deploy. The company claims Join.me is used by tens of millions of users, with over 1 million first-time users. The enterprise release will support deployments of more than 25 users; single sign-on via Active Directory Federation Services and Active Directory sync; advanced user policies and permissions for both groups and individuals; better user access controls; 100GB of managed online storage for sharing and recording meetings; and Outlook integration. Join.me enterprise subscriptions will cost $19 per user per month.
Accelerator Cards from LSI Boost Windows Flash Array
As noted in my blog earlier in the week, Windows Storage Server 2012 R2 is optimized for performance thanks to Microsoft's collaboration with Violin on the Windows Flash Array. This technology is ideal for improving SQL Server performance, proponents say.
For its part LSI yesterday said its LSI Nytro flash accelerator cards, which list for $3,495, further boost the performance of SQL Server 2014. Like Violin, LSI collaborated with Microsoft to accelerate SQL Server 2014 database transactions by improving I/O performance and thereby minimizing bottlenecks. In addition, the LSI Nytro flash cards reduce latency and boost throughput and reliability by taking advantage of the new SQL Server 2014 Buffer Pool Extension (BPE) capability, the company said. LSI says BPE functions as a level-two (L2) cache, with the main buffer pool functioning as a level-one (L1) cache. Microsoft in a statement said it tested the cards with SQL Server 2014 and endorsed LSI's claims.
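The two-tier arrangement LSI describes, with flash acting as a level-two cache behind the DRAM buffer pool, can be illustrated with a small conceptual sketch. This is a hypothetical Python model of the L1/L2 caching idea only, not SQL Server's actual BPE implementation; the class and method names are invented for illustration:

```python
from collections import OrderedDict

class TwoTierCache:
    """Conceptual sketch of a two-tier buffer: a small, fast L1
    (the DRAM buffer pool) backed by a larger L2 (the flash-based
    buffer pool extension). Reads promote pages from L2 to L1;
    pages evicted from L1 spill to L2 instead of being discarded,
    so a later read avoids the slow trip to disk."""

    def __init__(self, l1_size, l2_size):
        self.l1 = OrderedDict()  # page_id -> data, in LRU order
        self.l2 = OrderedDict()
        self.l1_size, self.l2_size = l1_size, l2_size

    def get(self, page_id, load_from_disk):
        if page_id in self.l1:            # L1 hit: cheapest path
            self.l1.move_to_end(page_id)
            return self.l1[page_id]
        if page_id in self.l2:            # L2 hit: promote to L1
            data = self.l2.pop(page_id)
        else:                             # miss: read from disk
            data = load_from_disk(page_id)
        self._put_l1(page_id, data)
        return data

    def _put_l1(self, page_id, data):
        self.l1[page_id] = data
        self.l1.move_to_end(page_id)
        if len(self.l1) > self.l1_size:   # evict LRU page to L2
            old_id, old_data = self.l1.popitem(last=False)
            self.l2[old_id] = old_data
            if len(self.l2) > self.l2_size:
                self.l2.popitem(last=False)
```

The payoff is in the L2-hit path: a page pushed out of the small L1 is still served from flash-speed L2 on the next read, rather than from spinning disk, which is the bottleneck BPE is meant to relieve.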
Posted by Jeffrey Schwartz on 04/25/2014 at 1:28 PM
In the first quarterly earnings report since taking over as the third CEO in Microsoft's 39-year history, Satya Nadella appeared on a conference call with Wall Street analysts. While it helped that Microsoft earnings beat estimates, his debut with financial analysts was significant in that he took questions from those who will play a key role in the company's stock valuation.
Microsoft's actual performance under the new CEO's reign of course will ultimately determine how investors value Microsoft. But helping analysts understand Nadella's vision is an important first step. There were no bombshells but Nadella helped guide the analysts on how he intends to monetize the company's "mobile first, cloud first" transition. Nadella's appearance on the call didn't mark a changing of the guard, per se. However, his predecessor Steve Ballmer rarely appeared on quarterly earnings calls.
It's unclear whether Nadella will show up routinely. But it wouldn't be surprising if he becomes a regular on the calls as his ability to help lift Microsoft's stock price -- virtually flat during Ballmer's 13-year reign -- will be a key measure of his success. While Microsoft's performance and growing share in existing and new markets is table stakes, Wall Street never warmed up to Ballmer despite consistent revenue and profit growth. Nadella also needs to convince analysts that he was the right choice even as Wall Street was pushing for a seasoned CEO such as Ford Chief Alan Mulally and Qualcomm COO (now its CEO) Steve Mollenkopf.
Asked whether he has any major strategic changes in the works, Nadella said the company will always be in a state of transition and ready to react to rapid shifts in market demand. "One of the things I strongly believe in is you're planning on a continuous basis, you're executing on a continuous basis," he said. "It's not episodic. The only way we're going to succeed here is by having this notion that you're planning all the time and you're also making the changes to your plans based on the changed circumstances. And I think that's the way you run a company like ours in a marketplace as dynamic as ours."
Nadella said Microsoft needs to continually build and buy new capabilities, and expressed confidence the company can do so, starting with the $7.2 billion acquisition of Nokia, which officially closed today.
Among the noteworthy questions he fielded was whether and how Microsoft can make the transition from a company that relied on one-time license fees to a subscription model -- a change facing all major software providers. Nadella, with CFO Amy Hood at his side, pointed to the growth of Office 365 as a leading indicator. Microsoft added 1.1 million Office 365 subscribers in the last three months, bringing the total to 4.4 million. The company shared an interesting data point during last week's hybrid cloud webinar: 25 percent of enterprises worldwide are using Office 365.
"We are well on our way to making that transition from pure licenses to long-term contracts as well as a subscription model," Nadella said. "This is a gold rush from being able to capitalize on the opportunity. When it comes to that, we have some of the broadest SaaS solutions and the broadest platform solutions. That combination of those assets doesn't come often."
To that point, he told the Street what it wants to hear: "What matters to me in the long run is the magnitude of the profits we generate, given a lot of categories that are going to be merged as this transition happens. We have to actively participate in it and drive profit growth."
Posted by Jeffrey Schwartz on 04/25/2014 at 10:15 AM
Flash storage is one of the fastest-growing new datacenter technologies these days, and while critics warn it can cost a lot, proponents say it can vastly improve performance and reduce operational and capital expenses.
With the release of Windows Server 2012 R2, and more specifically Windows Storage Server 2012 R2, Microsoft is testing the limits of flash storage. Violin, a rapidly growing startup that went public last year, codeveloped the new Windows Flash Array with Microsoft. It's a converged storage-server appliance that builds in every component of Windows Storage Server 2012 R2, including SMB 3.0 with SMB Direct over RDMA, and is powered by dual Intel Xeon E5-2448L processors.
The two companies spent the past 18 months developing the new 3U dual-cluster arrays that IT can use as network-attached storage (NAS), according to Eric Herzog, Violin's new CMO and senior VP of business development. Microsoft wrote custom code in Windows Server 2012 R2 and Windows Storage Server 2012 R2 that interfaces with the Violin Windows Flash Array, Herzog explained. The Windows Flash Array comes with an OEM version of Windows Storage Server.
"Customers do not need to buy Windows Storage Server, they do not need to buy blade servers, nor do they need to buy the RDMA 10-gig-embedded NICs. Those all come prepackaged in the array ready to go and we do Level 1 and Level 2 support on Windows Server 2012 R2," Herzog said.
Based on feedback from 12 beta customers (which include Microsoft), the company claims its new array delivers double the SQL Server write performance of any other array, a 54 percent improvement in SQL Server reads, a 41 percent boost with Hyper-V and 30 percent better application server utilization. It's especially well-suited for any business application using SQL Server, and it can extend the performance of Hyper-V and virtual desktop infrastructure implementations. It's designed to ensure latencies of less than 500 microseconds.
Violin is currently offering a 64-terabyte configuration with a list price of $800,000; systems with less capacity are planned for later in the year. It can scale up to four systems, which is the outer limit of Windows Storage Server 2012 R2 today. As future versions of Windows Storage Server support higher capacity, the Windows Flash Array will scale accordingly, according to Herzog. Customers do need to use third-party tiering products, he noted.
Herzog said the two companies will be giving talks on the Windows Flash Array at next month's TechEd conference in Houston. "Violin's Windows Flash Array is clearly a game changer for enterprise storage," said Scott Johnson, Microsoft's senior program manager for Windows Storage Server, in a blog post. "Given its incredible performance and other enterprise 'must-haves,' it's clear that the year and a half that Microsoft and Violin spent jointly developing it was well worth the effort."
Indeed, cost concerns certainly haven't curbed investor enthusiasm for enterprise flash. The latest round of high-profile investments goes to flash storage array supplier Pure Storage, which today bagged another round of venture funding.
T. Rowe Price and Tiger Global, along with new investor Wellington Management, added $275 million in funding to Pure Storage, which the company says gives it a valuation of $3 billion. But as reported in a recent Redmond magazine cover story, Pure Storage is in a crowded market of incumbents, including EMC, IBM and NetApp, that have jumped on the flash bandwagon, as well as quite a few newer entrants, including Flashsoft (recently acquired by SanDisk), SolidFire and Violin.
Posted by Jeffrey Schwartz on 04/23/2014 at 1:19 PM
Microsoft's deal to finalize its acquisition of the Nokia Devices and Services business is set for this Friday, April 25, with the Nokia branch rumored to be renamed "Microsoft Mobile."
According to the Web site Ubergizmo, the Nokia handset and services business will remain headquartered in Finland under the new name Microsoft Mobile Oy. As my friend Mike Elgan pointed out, Oy is the equivalent of LLC or Corp. "It's also Yiddish for 'ouch,' but it's likely Microsoft has the Finnish one in mind," Elgan noted. Microsoft isn't commenting on the report. "We have confirmed the acquisition will be completed on April 25," according to a spokeswoman for Microsoft. "At that time we will begin the work of integration."
Also, before the merger becomes official, the terms of the $7.2 billion deal, announced last summer, have been changed.
Though nothing major: 21 Nokia employees in China who were slated to remain with Nokia will now join Microsoft. Since it was China that held up the deal last month, perhaps these terms were added to appease all parties? The employees work on mobile phones. Microsoft will also now manage the Nokia.com domain and its social media sites for up to a year, and will no longer acquire Nokia's manufacturing facility in Korea.
"The completion of this acquisition follows several months of planning and will mark a key step on the journey towards integration," said Microsoft General Counsel Brad Smith in a blog post Monday. "This acquisition will help Microsoft accelerate innovation and market adoption for Windows Phones. In addition, we look forward to introducing the next billion customers to Microsoft services via Nokia mobile phones."
Microsoft has a lot riding on that integration. The deal was long championed by former CEO Steve Ballmer, who recently admitted his biggest regret was missing the mobile wave. The deal involved drawn-out negotiations which originally lacked the support of Founder Bill Gates and current CEO Satya Nadella.
It remains to be seen whether acquiring Nokia's devices and services business turns out to be the savior for Windows Phone and Microsoft's tablet ambitions or what ultimately does it in.
Posted by Jeffrey Schwartz on 04/21/2014 at 3:28 PM
If you were wondering whether the Microsoft Azure service would ever become an OpenStack cloud, it looks unlikely anytime soon, based on statements by company officials Thursday.
Perhaps you never thought that was in the cards anyway, but given Microsoft's more welcoming approach to open source, I've always wondered what the future held for OpenStack on Azure. I usually get blank stares when I raise the issue.
But Microsoft doesn't believe there are any OpenStack clouds that come near the size and scale of Azure, or the services offered by Amazon or Google, said Corporate VP Brad Anderson, answering a question during a company-presented webinar -- the first of its new Hybrid Cloud Series -- held in Redmond (see Kurt Mackie's recap of the presentation here). The hour-long talk is now available on demand here.
"I hear the conversation -- is OpenStack delivering this promise of public, hosted and private and I would argue there's not a global public cloud that's built on OpenStack today," Anderson responded. "If you look at these public cloud organizations -- us, Google and Amazon -- none of us have built on OpenStack. And we're the only one of those three that has this promise and a proven track record of taking everything that we're doing in the public cloud and then delivering it across... a hybrid model."
While Rackspace may beg to differ, IBM and Hewlett-Packard are among those whose public clouds are built on OpenStack, though both are still works in progress. At the same time, OpenStack, like Azure, is designed to run Windows Server instances and Hyper-V virtual machines. The promise of OpenStack, however, is that customers can move their workloads to other OpenStack clouds. Microsoft counters that customers can do the same with in-house Windows Server private clouds, hosting providers that support Microsoft's Cloud OS (few as those may be at this time) and Azure. It's safe to say that the OpenStack community wouldn't see that as a valid comparison.
The question came up just as the OpenStack Foundation this week released Icehouse, its semi-annual release, which has 350 new features and targets better scalability for enterprise workloads. Members of the OpenStack community from various companies have consistently described Microsoft as an active participant in committees when it comes to ensuring Hyper-V works well in OpenStack clouds.
Despite questioning the reach of OpenStack, Anderson reiterated the company's commitment to integrating with it. "OpenStack is going to be used in a number of different places so we want to also integrate with OpenStack," he said. "If an organization has made a decision that they're going to use OpenStack, it's a lot like Linux. If I go back and look at Linux 10 years ago, we embraced Linux with System Center. We've got an awful lot of Linux. We look at the number of VMs that are running inside of Azure that are Linux-based, and that's a significant number. We'll do the work on OpenStack to make sure Hyper-V in the Microsoft cloud is a first-class citizen. We will continue that work."
While Anderson was playing both sides, in an ironic sort of way, so was Canonical Cloud Product Manager Mark Baker, with whom I chatted earlier in the week about the company's release of Ubuntu Linux 14.04. Canonical is a major OpenStack participant, and Baker claims Ubuntu is a widely used Linux distribution on OpenStack clouds today. At the same time, Baker said that besides Amazon, Microsoft Azure is one of the fastest-growing alternatives when it comes to deployments of Ubuntu.
"Even though people may find it surprising, we have a great working relationship with Microsoft and the Azure team," Baker said. "We see that as one of the fastest-growing clouds, and Ubuntu is growing fast on that."
Regardless, a number of major organizations are using OpenStack clouds, including Samsung, Netflix, Time Warner, Best Buy and Comcast, according to Baker, who acknowledged that most are tech-centric enterprises today.
While Anderson didn't actually go into whether or not Azure will support OpenStack, his sizing of it didn't make it sound imminent. Do you agree with his assessment of OpenStack or is he underestimating it?
Posted by Jeffrey Schwartz on 04/18/2014 at 12:33 PM
With so many tools released for IT pros every week, many of them often go under the radar. Looking to address that, I thought it would be a good idea to offer regular roundups with the latest bits of product and technology news. Here in the Schwartz Report, we'll call it "Tech and Tools Watch." Without further ado, here's our first installment:
Riverbed Improves Branch Office Converged Infrastructure
Riverbed has turned granite into steel. The company this week relaunched its Granite Solution, a converged infrastructure appliance it introduced two years ago, with the new name SteelFusion. But the change is more than cosmetic. The new SteelFusion 3.0 offers a six-fold improvement in performance and a three-fold improvement in capacity -- up to 100 terabytes, the company said.
Long known for its branch office WAN optimization hardware, Riverbed brings the same concept to SteelFusion in the form of converged appliances for remote locations. The SteelFusion branch office appliance provides converged compute, storage, networking and virtualization, but it differs from typical converged appliances from the likes of Cisco, Dell and HP: the Riverbed offering stores data centrally at the datacenter or headquarters location and streams it to the branch office rather than storing data at each remote location.
"Data belongs in the datacenter, which is why it's called a datacenter," said Riverbed Director of Technology Rob Whitely. "When putting these in the branch, now I can run my services locally at the branch but I don't want data residing at the branch where it's subject to theft, corruption or downtime."
Riverbed said the new release also offers improved integration with EMC and NetApp SANs, with support for NetApp cluster mode and EMC VNX2 snapshots. It also offers improved backup and recovery support with a new recovery agent and an enhanced scale-out architecture, and is better suited for VDI and CAD/CAM-type implementations.
BMC Software's CLM Tool Targets Microsoft Azure
BMC Software's Cloud Lifecycle Management (CLM) software now supports migration to Microsoft Azure. The company said the new 4.0 release, due for general availability in early June, makes it simple to migrate from VMware-based clouds to the Microsoft Azure infrastructure-as-a-service (IaaS) platform. BMC said the new release of its CLM tool will let IT manage service delivery, operations, planning and compliance across different public cloud providers' infrastructures from a single management platform and interface.
Workloads designed to run in VMware environments can be easily redirected to Microsoft Azure, according to Steven Anderson, BMC's principal product manager. "You can specify the application stacks, the networking necessities, storage necessities and all those various aspects," he said. "Those parts go into the blueprints and can remain essentially the same. So all you have to do is change the blueprint and point to a different OS image, whatever the image ID is for the platform you're interested in and you can deploy the new instances of those workloads on the new platform in very little time at all."
The CLM tool can integrate through Microsoft's System Center Virtual Machine Manager. Anderson said that while Amazon is by far the most widely deployed cloud, the company is seeing increased usage of Microsoft Azure as well.
Netwrix Survey: IT Pros Admit Undocumented Changes
Netwrix this week released the results of a survey it commissioned which found that more than half (57 percent) of all IT pros surveyed admit they have made undocumented changes that no one else is aware of. These changes put organizations at risk for downtime and security breaches, according to Netwrix, which supplies the Netwrix Auditor for tracking and managing changes.
The company surveyed 577 IT pros for its "2014 State of IT Changes" report. The study shows these changes caused services to stop for 65 percent of those surveyed. It also found these undocumented changes led to daily or weekly downtime (52 percent), were the root cause of security breaches (39 percent) and 62 percent of the changes were unauditable. Only 23 percent said they have an auditing or change management solution in place.
That's good news for Netwrix, which last month made its auditing solutions available as specific modules. It now includes standalone offerings under the Auditor brand for Active Directory, file servers, Exchange, SQL Server, Windows Server and VMware.
Posted by Jeffrey Schwartz on 04/18/2014 at 1:02 PM
It's no secret that the ease of procuring various cloud-computing applications and infrastructure services and the BYOD trend have impacted IT organizations' influence. Now a survey released yesterday suggests business leaders are broadly seizing influence over IT decisions from CIOs and enterprise IT decision makers.
More than one-third of IT decisions are made by business leaders who don't report to the CIO, according to the survey released by Avanade, a joint venture of Microsoft and Accenture focused on the deployment and support of Microsoft technologies. The survey of 1,003 business and IT executives shows that 79 percent believe business leaders are better equipped to make technology decisions.
This shift means IT organizations are becoming "service brokers," according to Avanade. Under this model, IT organizations consult with the business units to determine their needs and goals. Already 35 percent of IT organizations have transitioned to this service-broker model, according to the survey.
Despite the new shift in control, the survey shows that the vast majority of business leaders (83 percent) still have confidence in IT staff interacting with key stakeholders as consultants and 66 percent plan to expand the role of technologists in becoming business advisors in the coming year. To enable this transition, business leaders are turning to IT organizations to partner with them. The survey found that 44 percent of business leaders are looking to enhance their cloud computing skills and 43 percent are looking to work with IT on systems integration.
This shift has not come without pain. "The tilting balance of control over technology decisions and budget has created a real tension between IT and the business and requires IT to rethink its approach, learn new skills and grow its influence," said Mick Slattery, Avanade executive vice president at Global Service Lines, in a statement. "Forward-looking companies are positioning their IT staff as business advisors and see IT contributing more to accomplishing objectives, and driving positive business results than ever before."
Nevertheless IT organizations are for the most part (71 percent) cooperating with this shift, according to the survey.
Are the lines of business seizing your IT budget? If so how much tension has this created in your organization?
Posted by Jeffrey Schwartz on 04/16/2014 at 11:59 AM
Nearly six months after Microsoft shipped Windows Server 2012 R2, a growing number of IT pros and third parties believe the Hyper-V hypervisor is ready for prime time -- that it's now practical to use Hyper-V for business-critical workloads.
Before the current release of Hyper-V 3.0, that wasn't the case. While Hyper-V was suitable for various workloads, most enterprises were reluctant to use it for heavy duty virtualization. And even those that were using it certainly weren't displacing existing hypervisors, especially VMware's ESX.
It's not that many shops weren't intrigued by the thought of using Hyper-V, which Microsoft has offered free of charge with Windows Server since 2008. It's just that it lacked the robustness and management capabilities offered by VMware. Many say that while VMware still has a technical edge over Hyper-V, the gap has narrowed to the point that Hyper-V is suitable for a growing number of mainstream use cases. It's even more appealing for those considering Microsoft's hybrid cloud strategy, called Cloud OS, which makes it easier to bridge Windows Server to Azure using Hyper-V.
Microsoft this week moved to make it easier to migrate VMware infrastructure to Hyper-V with the release of Virtual Machine Converter 2.0. The free tool lets IT pros migrate VMware-based virtual machines and virtual disks to Hyper-V-based VMs and virtual hard disks (VHDs).
"Virtual machine migration is an increasing priority for many customers as more and more are exploring and evaluating the Microsoft platform against their existing VMware installed base," the company said in a blog post from its server and cloud team. "Whether it's virtual to virtual (from one hypervisor to another) or physical to virtual, migration provides customers a path for consolidation of workloads and services, and the foundation for cloud."
The new VM Converter 2.0 supports vCenter and ESXi 5.5, VMware virtual hardware versions 4 through 10, and Linux guest OS migration, including CentOS, Debian, Oracle, Red Hat Enterprise Linux, SUSE Linux Enterprise and Ubuntu. Microsoft also pointed to two new features. The first is an on-premises VM to Azure VM conversion capability, which lets IT pros migrate their VMware VMs directly to Azure. The tool also now includes a PowerShell interface for scripting and automation, letting IT pros automate migration processes with workflow tools such as System Center Orchestrator, Microsoft said.
Microsoft also said MVMC 3.0, slated for this fall, will add physical-to-virtual (P2V) machine conversion for supported versions of Windows.
Do you plan to make the switch or are you sticking with VMware (or looking at KVM or other alternative hypervisors)? Share your views in the comment section below or drop me a line at email@example.com.
Posted by Jeffrey Schwartz on 04/11/2014 at 11:23 AM
Microsoft yesterday issued its final patch for Windows XP and Office 2003. The operating system, arguably the most popular version of Windows ever, is now officially dead (though it's still a long way from the grave). It still lives on millions of PCs and it is well documented that many of them will continue to run the dead OS indefinitely.
Because Microsoft issued the last patch yesterday, nothing bad is likely to happen imminently. It will take many weeks and months before it is clear what vulnerabilities are exploited and how severely they impact users.
Some expect little of consequence to happen, while others say those keeping their Windows XP-based PCs will face major problems. For example, Jason Kennedy, a business product marketing director at Intel, told me this week that he's concerned that many unsuspecting users, especially those at small and medium-sized businesses, are awaiting disaster.
"I unfortunately expect many of the bad people who are crafting malware or identity theft opportunities have been lying in wait for some time after April 8," Kennedy said. "I do believe sometime after the deadline those attacks will be unleashed. And people will suffer. I hope it's not severe but I expect there will be problems as a result of not taking the threats serious enough and not taking steps to mitigate."
Given the obvious fact that Intel has a vested interest in users moving off Windows XP since most will have to buy new PCs (with new processors) you may take that with a grain of salt. On the other hand, no one knows what vulnerabilities will surface.
While it remains to be seen if such a dire event happens, those who've decided to stick with Windows XP have made their decisions and are ready to live with the consequences. If you back up your data, chances are the worst that will happen is you'll have to buy a new PC or some other device.
Perhaps you'll give up on Windows altogether? That's what Google, VMware and even Citrix are urging business customers to do. "Many businesses are in a tough spot," Amit Singh, president of Google Enterprise, said in a blog post. "Despite 'significant' security and privacy risks, legacy software or custom-built apps have held businesses back from migrating in time for today's XP support deadline. Companies in this position now find themselves at a timely crossroads. It's time for a real change, rather than more of the same."
Google and VMware teamed up yesterday to announce they will take $200 off Google Chromebooks for Business with VMware Horizon DaaS. The two companies last month announced a pact to bundle the two offerings and this looks to sweeten the deal. Google is offering $100 off Chromebooks for each managed device purchased for a company and Citrix is offering 25 percent off its Citrix XenApp Platinum Edition, which includes Windows XP migration acceleration tool AppDNA.
Windows XP may be dead but as rivals pick at the carcass, it's a long way from being buried.
Posted by Jeffrey Schwartz on 04/09/2014 at 1:24 PM
Tomorrow represents a milestone for many PC and Exchange administrators. It's the long-dreaded day when Microsoft will issue its last patch for Windows XP, Exchange 2003 and Office 2003 (which, of course, includes Outlook). It's also an important day because Microsoft will also issue the Windows 8.1 Update.
As reported in this month's Redmond magazine cover story, 23 percent of polled readers will keep their Windows XP-based systems running indefinitely. Only 28 percent of you have completed your migrations or have no Windows XP-based machines left. Even though tomorrow is the end for Windows XP, barring any unexpected events, the day will likely come and go without incident -- though you won't be able to avoid hearing about it if you're watching the evening news or listening to the radio.
Nevertheless, Windows XP systems will be around for the foreseeable future, slowly fading over time. Until then, if you're just using Windows XP for PC apps and not connecting to the Internet, you shouldn't have any problems. For those still connected, it's advisable to remove default administrative privileges, enable memory and buffer overflow protection, and employ whitelisting for zero-day vulnerability protection, as noted by security software supplier McAfee.
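The whitelisting idea behind that last piece of advice is default-deny: rather than trying to recognize known-bad malware, the machine runs only binaries on a pre-approved list, so even a brand-new zero-day payload is blocked. Here's a toy conceptual sketch in Python, not any vendor's product, with the function names invented for illustration:

```python
import hashlib

# Toy illustration of application whitelisting: only binaries whose
# SHA-256 hashes appear on a pre-approved list may execute. Anything
# not on the list, including an unknown zero-day payload, is denied
# by default rather than checked against malware signatures.

ALLOWED_HASHES = set()

def approve(binary: bytes) -> None:
    """Record a known-good binary on the allow list."""
    ALLOWED_HASHES.add(hashlib.sha256(binary).hexdigest())

def may_execute(binary: bytes) -> bool:
    """Default-deny policy: run only if the hash was pre-approved."""
    return hashlib.sha256(binary).hexdigest() in ALLOWED_HASHES

# An administrator approves the line-of-business app in advance;
# a dropped payload was never approved, so it is refused.
approve(b"known-good-lob-app")
```

Real whitelisting products work at the OS enforcement layer, of course, but the policy logic is this same membership test.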
For many organizations, upgrading Windows XP PCs is not a simple task, especially for those with apps that can't run on newer versions of Windows. While there are many remedies -- rebuilding apps, using third-party tools or desktop virtualization/VDI -- all come with a cost and some simply don't see a need to change OSes. Others do but just are going to have to let that deadline pass and either pay extra for support or take other measures -- or perhaps just cross their fingers.
Redmond columnist Greg Shields put it best. In last month's Windows Insider column, he compared replacing Windows XP-based PCs to replacing an aging bridge. "Fixing a bridge or replacing it entirely is an inconvenient activity," he wrote. "Doing so takes time. The process often involves scheduled setbacks, cost overruns and incomprehensible activities that are tough to appreciate when you're idling in construction traffic."
While tomorrow represents the end for Windows XP, Microsoft will issue its Windows 8.1 Update that comes with a more mouse-friendly Start Screen, the ability to pin Windows Store apps to the task bar and APIs that are shared with the forthcoming Windows Phone 8.1.
Right now only a small handful of enterprises are moving to Windows 8.1. But as Microsoft makes more progress in blending the old with the new, perhaps the aversion to moving to the newest version of Windows will subside.
Posted by Jeffrey Schwartz on 04/07/2014 at 12:40 PM
Typically when I talk to experts about the public cloud, the usual refrain is that there's Amazon Web Services ... and then there's everyone else. When it comes to everyone else, Microsoft Azure is among the leading players with 12 datacenters now in operation around the globe including two launched last week in China. And with 16 additional centers planned by year's end and 300 million customers, the company has strong ambitions for its public cloud service.
At the Build conference in San Francisco this week, Microsoft showed how serious it is about advancing the appeal of Azure. Scott Guthrie, Microsoft's newly promoted executive VP for cloud and enterprise, said Azure is already used by 57 percent of the Fortune 500 companies and has 300 million users (with most of them enterprise users registered with Active Directory). Guthrie also boasted that Azure runs 250,000 public-facing Web sites, hosts 1 million SQL databases with 20 trillion objects now stored in the Azure storage system and it processes 13 billion authentications per week.
Guthrie also claimed that 1 million developers have registered with the Azure-based Visual Studio Online service since its launch in November. This would be great if the vast majority have done more than just register. While Amazon gets to tout its major corporate users, including its showcase Netflix account, Guthrie pointed to the scale of Azure, which hosts the popular Titanfall game that launched last month for the Xbox gaming platform and PCs. Titanfall kicked off with 100,000 virtual machines (VMs) on launch day, he noted.
Guthrie also brought out NBC executive Rick Cordella to talk about the hosting of the Sochi Olympic Games in February. More than 100 million people viewed the online service, with 2.1 million concurrently watching the men's United States vs. Canada hockey match, which was "a new world record for HD streaming," Guthrie said.
Cordella noted that NBC invested $1 billion in this year's games and said it represented the largest digital event ever. "We need to make sure that content is out there, that it's quality [and] that our advertisers and advertisements are being delivered to it," he told the Build audience. "There really is no going back if something goes wrong," Cordella said.
Now that Azure has achieved scale, Guthrie and his team have been working on rolling out a bevy of significant enhancements aimed at making the service appealing to developers, operations managers and administrators. As IT teams move to a more dev-ops model, Microsoft is taking that into consideration as it builds out the Azure service.
Among the Infrastructure as a Service (IaaS) improvements, Guthrie pointed to the availability of auto-scaling as a service, point-to-site VPN support, dynamic routing, subnet migration, static internal IP addressing and Traffic Manager for Web sites. "We think the combination of [these] really gives you a very flexible environment, a very open environment and lets you run pretty much any Windows or Linux workload in the cloud," Guthrie said.
Azure is a more flexible environment for those overseeing dev-ops thanks to the new support for configuring VM images using the popular Puppet and Chef configuration management and automation tools used on other services such as Amazon and OpenStack. IT can also now use Windows PowerShell and VSD tools.
"These tools enable you to avoid having to create and manage lots of separate VM images," Guthrie said. "Instead, you can define common settings and functionality using modules that can cut across every type of VM you use."
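To make Guthrie's point concrete, here is a toy sketch in plain Python (not actual Puppet or Chef syntax; the setting names and server address are hypothetical): rather than baking and maintaining a separate golden image for every VM type, a common module of settings is defined once and merged with role-specific additions when each VM is provisioned.

```python
BASE_MODULE = {  # common settings every VM gets, regardless of role
    "ntp_server": "time.example.com",  # hypothetical address
    "monitoring_agent": True,
}

ROLE_MODULES = {  # per-role additions layered on top of the base
    "web": {"packages": ["nginx"]},
    "db":  {"packages": ["postgresql"], "nightly_backup": True},
}

def desired_state(role):
    """Merge the shared module with a role-specific module."""
    state = dict(BASE_MODULE)
    state.update(ROLE_MODULES.get(role, {}))
    state["role"] = role
    return state

# One definition of the common settings covers every VM type:
for role in ("web", "db"):
    print(role, "->", desired_state(role))
```

Real Puppet manifests and Chef cookbooks work declaratively and idempotently against live machines, but the layering idea is the same: change the common module once and every VM type picks it up.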
Perhaps the most significant criticism of Azure is that it's still a proprietary platform. In a move to shake that image, Guthrie announced a number of significant open source efforts. Notably, Microsoft made its "Roslyn" compiler and other components of the Microsoft .NET Framework components open source through the aptly titled .NET Foundation.
"It's really going to be the foundation upon which we can actually contribute even more of our projects and code into open source," Guthrie said of the new .NET Foundation. "All of the Microsoft contributions have standard open source licenses, typically Apache 2, and none of them have any platform restrictions, meaning you can actually take these libraries and you can run them on any platform. We still have, obviously, lots of Microsoft engineers working on each of these projects. This now gives us the flexibility where we can actually look at suggestions and submissions from other developers as well and be able to integrate them into the mainline products."
Among some other notable announcements from Guthrie regarding Azure:
- Revamped Azure Portal: Now available in preview form, the new portal is "designed to radically speed up the software delivery process by putting cross-platform tools, technologies and services from Microsoft and its partners in a single workspace," wrote Azure General Manager Steven Martin in a blog post. "The new portal significantly simplifies resource management so you can create, manage, and analyze your entire application as a single resource group rather than through standalone resources like Azure Web Sites, Visual Studio Projects or databases. With integrated billing, a rich gallery of applications and services and built-in Visual Studio Online you can be more agile while maintaining control of your costs and application performance."
- Azure Mobile Services: Offline sync is now available. "You can now write your mobile back-end logic using ASP.NET Web API and Visual Studio, taking full advantage of Web API features, third-party Web API frameworks, and local and remote debugging," Martin noted. "With Active Directory Single Sign-on integration (for iOS, Android, Windows or Windows Phone apps) you can maximize the potential of your mobile enterprise applications without compromising on secure access."
- New Azure SDK: Microsoft released the Azure SDK 2.3, making it easier to deploy VMs and sites.
- Single Sign-on to Software as a Service (SaaS) apps via Azure Active Directory Premium, now generally available.
- Azure now includes one IP address-based SSL certificate and five SNI-based SSL certs at no additional cost for each site instance.
- The Visual Studio Online collaboration as a service is now generally available and free for up to five users in a team.
- While Azure already supports .NET, Node.js, PHP and Python, it now supports Java natively, thanks to the partnership with Oracle announced last year.
My colleague Keith Ward, editor in chief of sister site VisualStudioMagazine.com, has had trouble in the past finding developers who embraced Azure. He now believes that could change. "Driving all this integration innovation is Microsoft Azure; it's what really allows the magic to happen," he said in a blog post today. Furthermore, he tweeted: "At this point, I can't think of a single reason why a VS dev would use Amazon instead of Azure."
Are you finding Azure and the company's cloud OS hybrid platforms more appealing?
Posted by Jeffrey Schwartz on 04/04/2014 at 8:15 AM
Microsoft opened its Build conference for developers with a keynote that focused on the company's attempts at breathing new life into its struggling Windows franchise while simultaneously embracing interoperability with other platforms.
In addition to unveiling its intelligent voice assistant planned for Windows Phone 8.1 and announcing the Windows 8.1 update, Microsoft's top executives talked of progress towards unifying its operating system across PCs, tablets, phones and its Xbox gaming platform. The company has lately described this and efforts to extend to open source and competitive platforms as a "universal Windows."
Underscoring the progress Microsoft has made toward that effort, Microsoft's new CEO Satya Nadella said that 90 percent of its APIs are now common and this should remove some of the barriers to developing for the various system types. "That's fantastic to see," Nadella told the 5,000-plus attendees at the event, held in San Francisco. He said Microsoft will continue to push for a "shared library across a variety of device targets."
Those device targets won't be limited to traditional hardware. Terry Myerson, executive vice president for Microsoft's operating system group, described the company's ambitions for its current-generation operating system, which runs on only a small share of tablets and is still not favored by most PC users.
Those ambitions include not only making Windows tools and frameworks more broadly available but extending them to new types of devices -- including the so-called "Internet of Things," which can encompass anything from a piano, as demonstrated, to telemetry components equipped with Intel's x86 system-on-chip, Quark. The component is the size of an eraser, Myerson noted. Such advances will open new opportunities for Windows, he said.
Also in a bid to grow its share of the low-cost tablet and phone market, Myerson emphasized the company's efforts to expand the presence of Windows by making it free to tablet, PC and phone suppliers offering hardware with screens nine inches or smaller. That promises to take away a key advantage of the Android OS: being free.
"We really want to get this platform out there," Myerson said of Windows. "We want to remove all the friction between you and creating these devices."
While Nadella and company are taking steps to expand Windows, they also acknowledged to the company's core audience that it's not going to be a Windows-everywhere world, as evidenced by the long-awaited release of Office for the iPad last week.
In a pre-recorded question displayed during the closing of today's presentation, an Android developer asked why he should also develop for Windows. Nadella's answer: "We are the only platform that has APIs with Language bindings across both native, managed and Web. And the fact that that flexibility exists means you can build your core libraries in the language of your choice and those core libraries you can take cross platform. Obviously the Web [is] the one that's easiest to conceptualize and that's what we've done by taking WinJS and putting it into open source and making it a community effort so you can take it cross platform."
Posted by Jeffrey Schwartz on 04/02/2014 at 3:39 PM
Even though Microsoft had strong evidence that a former employee was transmitting stolen code and trade secrets via Hotmail, customers were unnerved to learn the company had snooped on the suspect's e-mail account.
As reported two weeks ago, Alex Kibkalo, a former Microsoft architect, was arrested for allegedly stealing trade secrets and leaking Windows 8 code to an unnamed French blogger while working for the company. By delving into his Hotmail account, Microsoft was able to provide evidence to the authorities. Had the suspect used an e-mail service not owned by Microsoft, the company would have needed law enforcement authorities to obtain a warrant.
But the law doesn't prohibit the owner of a service from snooping, so Microsoft was within its legal rights to search the suspect's Hotmail account. Nevertheless, Microsoft was well aware it needed to reassure customers it won't take matters into its own hands so blatantly in the future. Due to the public backlash that arose shortly after the incident came to light, Microsoft said it would turn to a former judge to determine if it had probable cause to look into a suspect's account.
Clearly that wasn't cutting it, since the judge would still be on Microsoft's payroll, and the company again said it would be making a change. Microsoft General Counsel Brad Smith on Friday said the company would turn all suspected information over to the authorities before taking further action. "Effective immediately, if we receive information indicating that someone is using our services to traffic in stolen intellectual or physical property from Microsoft, we will not inspect a customer's private content ourselves," Smith said in a blog post announcing the change. "Instead, we will refer the matter to law enforcement if further action is required."
Smith pointed out that while the law and the company's terms of service allowed it to access Kibkalo's account, doing so "raised legitimate questions about the privacy interests of our customers." As a result, the company will revise its terms of service and has reached out to the Center for Democracy and Technology (CDT) and the Electronic Frontier Foundation to further discuss the issue of ensuring security without compromising privacy.
It appears this was the right move to take. Does Microsoft's latest move make you feel more comfortable or do you just see it as lip service?
Posted by Jeffrey Schwartz on 04/01/2014 at 12:55 PM
As predictably as the sun rises, Microsoft yesterday followed Amazon's latest round of price cuts by reducing the rates for its Windows Azure -- rather, Microsoft Azure -- cloud service. (In case you missed it, Microsoft last week shed the Windows name from its cloud service. Hence Windows Azure is now Microsoft Azure.)
Microsoft is cutting the price of its compute services by 35 percent and its storage service 65 percent, the company announced yesterday afternoon. "We recognize that economics are a primary driver for some customers adopting cloud, and stand by our commitment to match prices and be best-in-class on price performance," said Steven Martin, general manager of Microsoft Azure business and operations, in a blog post. The price cuts come on the heels of the company last week expanding Microsoft Azure into China.
In addition to cutting prices, Microsoft is adding new tiers of service to Azure. On the compute side, a new tier of instances called Basic consists of virtual machine configurations similar to the current Standard tier but won't include the load balancing or auto-scaling offered in the Standard package. The existing Standard tier will now consist of a range of instances from "extra small" to "extra large." Those instances will cost as much as 27 percent less than current instances.
Martin noted that some workloads, including single instances and those using their own load balancers, don't require the Azure load-balancer. Also, batch processing, dev and test apps are better suited to the Basic tier, which will be comparable to AWS-equivalent instances, Martin said. Basic instances will be available this Thursday.
Pricing for its Memory-Intensive Instances will be cut by up to 35 percent for Linux instances and 27 percent for Windows Server instances. Microsoft said it will also offer the Basic tier for Memory-Intensive Instances in the coming months.
On the storage front, Microsoft is cutting the price of its Block Blobs by 65 percent and 44 percent for Geo Redundant Storage (GRS). Microsoft is also adding a new redundancy tier for Block Blob storage called Zone Redundant Storage (ZRS).
With the new ZRS tier, Microsoft will offer redundancy that stores the equivalent of three copies of a customer's data across multiple locations. GRS by comparison will let customers store their data in two regions that are dispersed by hundreds of miles and will store the equivalent of three copies per region. This new middle tier, which will be available in the next few months, costs 37 percent less than GRS.
Though Microsoft has committed to matching price cuts by Amazon, the company faced a two-pronged attack last week, with Amazon again slashing prices and Google both cutting its prices for the first time and finally offering Windows Server support. While Microsoft has its eyes on Amazon, it needs to look over its shoulder as Google steps up its focus on enterprise cloud computing beyond Google Apps.
One area where both Amazon and Google have a leg up on Microsoft is their respective desktop-as-a-service (DaaS) offerings. As noted last week, Amazon made generally available its WorkSpaces DaaS offering, which it announced back in November at its re:Invent customer and partner conference. And as reported last month, Google and VMware are working together to offer Chromebooks via the new VMware Horizon DaaS service. It remains to be seen how big the market is for DaaS and whether Microsoft's entrée is imminent.
Posted by Jeffrey Schwartz on 04/01/2014 at 12:54 PM
As soon as Microsoft CEO Satya Nadella announced the long-expected release of Office for the iPad last week, I downloaded it on mine. When I opened Word on the iPad, it displayed all of my documents stored in OneDrive, sorted in the order I last accessed them. Frankly, because of how they're organized, Office documents in OneDrive are easier to find and navigate on the iPad than on the Microsoft Surface or Dell Venue 8 Pro using the modern Windows 8.1 interface. Still, because it doesn't have native support for an external mouse, I won't be using Office on the iPad that often.
Even so, if you have an Office 365 subscription and an iPad, you should download it and determine how it may best suit your needs. Microsoft optimized Office for the iPad, and those who compared it to Apple's own iWork suite believe Microsoft delivered a worthy alternative. But should iPad users who don't have Office 365 subscriptions (the new personal plan costs $69 per year) sign up for it? That depends. Keep in mind if all you want to do is view documents, it's free; only if you intend to create or edit them do you need an Office 365 subscription. If you don't mind the lack of mouse support, you're set.
But having used a mouse and keyboard with Office for two decades, I've become quite accustomed to working with both, even though I'm quite open to change and doing things differently. Editors and writers tend to frequently cut and paste phrases, sentences and entire paragraphs, and using a mouse and keyboard has become second nature to me. While I'm not averse to typing with an on-screen keyboard, using touch for cut and paste doesn't work for me. But I never bought an iPad with the intent of writing or editing with it. I may pull up a document or PDF to proof, but that's about it.
I bought the iPad to use other apps, notably to read various publications, access e-mail and connect to the Internet when I'm away from my desk. But if I need to take notes, write or edit away from my office, I use a Windows tablet with its type-style keyboard and the track pad and/or an external Bluetooth mouse.
My understanding is that while iPads support external Bluetooth keyboards, Apple has decided not to support the use of an external mouse on its tablets. Both Android and Windows tablets support one. It's ironic that Apple doesn't, considering the company popularized the mouse with the Macintosh. Clearly Steve Jobs envisioned weaning users off the mouse in favor of the new touch-based world. It remains to be seen whether CEO Tim Cook has a different view.
I think touch is a great evolution of the user experience, but not when it comes to writing and editing anything more than a text, e-mail or social media post. Yes, there are ways to jailbreak iOS to enable a mouse to work with the iPad. But until iOS natively gains support for a mouse, I have no intention of using my iPad, which I otherwise enjoy, for content creation. Does this make me stodgy and stuck in my ways, or am I in good company?
Posted by Jeffrey Schwartz on 03/31/2014 at 12:26 PM
Today is World Backup Day, an independent effort created in 2011 on the eve of April Fools' Day under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. The goal is to underscore that it takes a fool not to back up their files regularly; it was founded because many people still don't understand the need to back up their data, and it's aimed at bringing awareness to the issue. While the intent is well grounded, it doesn't seem to have garnered much attention.
As far as I can tell, even some of its sponsors, including Backblaze, LaCie and Western Digital, haven't made much noise about World Backup Day this year. Could it be that most providers of data protection software assume everyone backs up their data? Certainly enterprises do, and the April issue of Redmond magazine looks at improvements to Microsoft's new System Center 2012 R2 Data Protection Manager and how third-party providers of backup and recovery software are offering new ways to replicate server data of specific apps and functions such as SharePoint, SQL Server and VMware-based VMs.
But the majority of people are still careless when it comes to protecting their data. According to the World Backup Day Web site, 30 percent of people have never backed up their data. With 113 phones lost or stolen every minute and 10 percent of computers infected with viruses each month, those 30 percent need to get on board.
If you're one of those who haven't backed up their PCs and devices lately (you know who you are), take the time today to do so. At some point, you'll be glad you did.
Posted by Kurt Mackie on 03/31/2014 at 12:30 PM
While virtual desktops represent a small niche of the enterprise client system universe, they're a reasonable option for organizations with PCs still running Microsoft's Windows XP operating system. Unless you've been hiding under a rock, you know Windows XP will shortly lose official support from Microsoft.
As I reported earlier this month, a survey of Redmond magazine readers found that 23 percent will continue to run their Windows XP-based systems after Microsoft releases the final patch for the OS on April 8. And while the survey also showed an overwhelming 85 percent will deploy Windows 7-based PCs and 35 percent will deploy systems running Windows 8 (multiple responses were permitted), 9 percent said they are looking to virtual desktops. That may include some form of VDI or desktop as a service (DaaS).
Evolve IP, a managed services provider that offers its own hosted DaaS offering based on VMware Horizon View, said its own survey showed that 63 percent of respondents will use virtual desktops for at least a portion of their employees. The VDI-as-a-service offering is hosted in Evolve IP's own cloud, where customers can also host their Active Directory instances to manage users. "It's a good mix for the IT department who needs control, but it's also good because it's not an all-in philosophy," said Scott Kinka, Evolve IP's CTO.
There are a number of solutions from the likes of AppSense, Citrix, Dell/Wyse, HP, NComputing and VMware. Of course, Microsoft's own Remote Desktop Services (RDS) and App-V solutions are all viable options as well, either via an MSP or hosted internally. Here's a look at a number of options:
- AppSense: Using DesktopNow and DataNow, IT can bring together related persona and data to centralize and stream the components to a new desktop. "We don't modify, we just lock down and migrate the settings and other things relative to the application," said Jon Rolls, AppSense vice president of product management.
- Citrix: With the company's XenDesktop, IT can virtualize Internet Explorer 6 (which can't run on newer operating systems) in a virtual desktop. Likewise, apps that cannot be updated to Windows 7 or Windows 8.1 can run in virtual Windows XP instances.
- NComputing: The supplier of virtual desktop solutions offers its Desktop and Application Virtualization platform for small- and mid-sized businesses looking for more of a turnkey offering. The company plans to further simplify the delivery of virtual solutions with the planned release of its new oneSpace client virtualization platform. The company describes it as a workspace for IT to securely deliver apps and files in BYOD scenarios to any device, including iPads and Android-based tablets. "Users are getting full-featured versions of their Windows applications but we've done our own optimization to allow those apps to be mobile- and touch-friendly," said NComputing's senior director of marketing Brian Duckering. "Instead of using the Windows Explorer experience, we integrated it and unified it so it's Dropbox-like." It's due to hit private beta this spring.
- Microsoft: Just last week Microsoft took a step toward making it easier for IT to deploy VDI scenarios based on its Remote Desktop Services. Microsoft released the preview of its Virtual Desktop Infrastructure Starter Kit 1.0. As Redmond's Kurt Mackie reported, Microsoft is billing it as something that should not be used for production environments. It's just for testing purposes. The kit "complements" the management console and wizards used with the RDS server role of Windows Server 2012 R2. It comes with apps including Calculator and WordPad for testing virtual desktop access scenarios. The finished Starter Kit product is scheduled for release in the second quarter of this year and the preview is available for download now. Organizations can also pair Microsoft's RD Gateway with Windows Server 2012 to deploy VDI, as explained in a recent article.
Desktop as a Service
This week Amazon Web Services released its WorkSpaces DaaS offering. Amazon first disclosed plans to release WorkSpaces at its re:Invent customer and partner conference in Las Vegas in November. The service will be available with one or two virtual CPUs, either 3.75 or 7.5 GB of RAM and 50 to 100 GB of storage. Per-user pricing ranges from $35 to $75 for each WorkSpace per month. Organizations can integrate the new service with Active Directory.
A wide variety of use cases were tested, from corporate desktops to engineering workstations, said AWS evangelist Jeff Barr in a blog post this week. Barr identified two early testers, Peet's Coffee & Tea and ERP supplier WorkWise. The company also added a new feature called Amazon WorkSpaces Sync. "The Sync client continuously, automatically and securely backs up the documents that you create or edit in a WorkSpace to Amazon S3," Barr said. "You can also install the Sync client on existing client computers (PC or Mac) in order to have access to your data regardless of the environment that you are using."
Google and VMware are also making a big DaaS push. As I reported earlier this month, the two companies teamed up to enable Google Chromebooks to work with VMware's Horizon View offerings.
Things could get really interesting if Microsoft offers its own DaaS service.
Posted by Jeffrey Schwartz on 03/28/2014 at 11:28 AM
Google today gave Microsoft shops a reason to consider its enterprise cloud services by adding Windows Server support and slashing the pricing of its infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) offerings.
At the company's Google Platform Live event in San Francisco, the company also stepped up its effort to extend the appeal of its IaaS and PaaS services to enterprises by introducing a new blend of the two called Managed Virtual Machines, along with an improved big data analytics offering. The company expanded its menu of server operating system instances available with the Google Compute Engine IaaS with the addition of Suse Linux, Red Hat Enterprise Linux and Windows Server.
"If you're an enterprise and you have workloads that depend upon Windows, those are now open for business on the Google Cloud Platform," said Greg DeMichillie, a director of product management at Google. "Our customers tell us they want Windows so of course we are supporting Windows." Back in December when Google's IaaS was first announced, I noted Windows Server wasn't an available option.
Now it is. However, there is a caveat. The company, at least for now, is only offering Windows Server 2008 R2 Datacenter Edition, noting it's still the most widely deployed version of Microsoft's server OS. There was no mention of whether or when Google will add newer versions of Windows Server. The "limited preview" is available now.
The new Windows Server support was a footnote to an event which emphasized lower and more predictable pricing and a new offering that allows customers to get the best of both the PaaS and IaaS worlds. Given Amazon Web Services (AWS) and Microsoft frequently slash their prices, Google had little choice but to play the same game. That's especially the case given its success so far.
"Google isn't a leader in the cloud platform space today, despite a fairly early move in platform as a service with Google App Engine and a good first effort in Infrastructure as a Service with Google Compute Engine in 2013," wrote Forrester analyst James Staten in a blog post. "But its capabilities are legitimate, if not remarkable."
Urs Hölzle, the senior vice president at Google who is overseeing the company's datacenter and cloud services, said 4.75 million active applications now run on the Google Cloud Platform, while the Google App Engine PaaS sees 28 billion requests per day with the data store processing 6.3 trillion transactions per month.
While Google launched one of the first PaaS offerings in 2008, it was one of the last major providers to add an IaaS and only recently hit the general availability status with the Google Compute Engine back in December. Meanwhile just about every major provider is trying to catch up with AWS, both in terms of services offered and market share.
"Volume and capacity advantages are weak when competing against the likes of Microsoft, AWS and Salesforce," Staten noted. "So I'm not too excited about the price cuts announced today. But there is real pain around the management of the public cloud bill." To that point, Google announced Sustained-Use discounts.
Rather than requiring customers to predict future usage when signing on for reserved instances of anywhere from one to three years, discounts of 30 percent off on-demand pricing kick in once usage exceeds 25 percent of a given month, Hölzle said.
"That means if you have a 24x7 workload that you use for an entire month like a database instance, you get a 53 percent discount over today's prices. Even better, this discount applies to all instances of a certain type. So even if you have a virtual machine that you restart frequently, as long as that virtual machine in aggregate is used more than 25 percent of the month, you get that discount."
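The arithmetic behind those figures can be sketched out. The bracket schedule below is an assumption chosen to reproduce the quoted 30 percent full-month discount (it is not Google's published price sheet), and the quoted 53 percent appears to come from stacking that sustained-use discount on the separate 32 percent cut to Compute Engine base prices:

```python
def sustained_use_multiplier(usage_fraction):
    """Average price multiplier under an assumed bracketed model.

    Each successive quarter of the month is billed at a lower rate
    once usage reaches it: 100%, 80%, 60% and 40% of the base rate.
    """
    brackets = [1.00, 0.80, 0.60, 0.40]  # assumed rate per quarter-month
    billed = 0.0
    remaining = usage_fraction
    for rate in brackets:
        portion = min(remaining, 0.25)
        billed += portion * rate
        remaining -= portion
        if remaining <= 0:
            break
    return billed / usage_fraction

# Usage at or below 25% of the month pays full price:
assert sustained_use_multiplier(0.25) == 1.0

# A 24x7 instance uses 100% of the month -> 0.70, i.e. a 30% discount:
full_month = sustained_use_multiplier(1.0)

# Stacked on the 32% base-price cut, versus the old prices:
effective = (1 - 0.32) * full_month  # ~0.476, roughly the quoted 53% off
print(f"discount vs. previous prices: {1 - effective:.0%}")
```

Under these assumed brackets the combined figure works out to about 52 percent, in the ballpark of the 53 percent Hölzle quoted.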
Overall, other pay-as-you-go price cuts range from 30 to 50 percent, depending on the service. Among the reductions:
- Compute Engine reduced by 32 percent across all sizes, regions and classes.
- App Engine pricing for instance-hours reduced by 37.5 percent, dedicated memcache by 50 percent and data store writes by 33 percent. Other services including SNI SSL and PageSpeed are now available with all applications at no added cost.
- Cloud Storage is now priced at a consistent 2.6 cents per GB, approximately 68 percent lower.
- Google BigQuery on-demand prices reduced by 85 percent.
The new Managed Virtual Machines combines the best of IaaS and PaaS, Hölzle said. "They are virtual machines that run on Compute Engine but they are managed on your behalf with all of the goodness of App Engine. This gives you a new way to think about building services," he said. "So you can start with an App Engine application and if you ever hit a point where there's a language you want to use or an open-source package that you want to use that we don't support, with just a few configuration line changes you can take part of that application and replace it with an equivalent virtual machine. Now you have control."
Google also extended the streaming capability of its BigQuery big data service, which went from being able to pull in 1,000 rows per second when launched in December to 100,000 now. "What that means is you can take massive amounts of data that you generate and as fast as you can generate them and send it to BigQuery," Hölzle said. "You can start analyzing it and drawing business conclusions from it without setting up data warehouses, without building sharding, without doing ETL, without doing copying."
Posted by Jeffrey Schwartz on 03/25/2014 at 1:00 PM
Cisco today is bringing new meaning to the old saying, "if you can't beat them, join them."
The company today said it will invest $1 billion over the next two years to offer what it argues will be the world's largest cloud. But rather than trying to beat Amazon Web Services, Microsoft, Rackspace, IBM, Hewlett Packard, Salesforce.com, VMware and other major providers that offer public cloud services, Cisco said it will "join" them together, figuratively.
Cisco said it will endeavor to build its so-called "Intercloud" -- or cloud of clouds -- aimed at letting enterprise customers move workloads between private, hybrid and public cloud services. Of course, Cisco isn't the only provider with that lofty goal, but Fabio Gori, the company's director of cloud marketing, said it's offering standards-based APIs that will help build applications that can move among clouds and virtual machines.
"This is going to be the largest Intercloud in the world," Gori said. Cisco is building out its own datacenters globally but is also tapping partners with cloud infrastructure dedicated to specific countries to support data sovereignty requirements. Gori said Cisco will help build out their infrastructures to spec and those providers will be part of the Intercloud.
Gori emphasized Intercloud will be based on OpenStack, the open source cloud infrastructure platform that many cloud providers, including Rackspace, IBM, HP and numerous others, support. But there are key players, including Amazon, Microsoft and Google, that don't support it. Gori said Cisco can work around that by using the respective providers' APIs and offering its own programming interfaces for partners to deliver application-specific offerings.
Core to this is the Intercloud fabric management software, announced in late January at the Cisco Live! conference in Milan, Italy. The software, now in trial and slated for release next quarter, is the latest component of the Cisco One cloud platform, which is designed to securely tie together multiple hybrid clouds.
Among the cloud providers now on board are Australian service provider Telstra, Canadian communications provider Allstream, European cloud provider Canopy, cloud services aggregator and distributor Ingram Micro, managed services provider Logicalis Group, BI software vendor MicroStrategy, Inc., OnX Managed Services, SunGard Availability Services and outsourcing company Wipro.
Gori insists Cisco is lining up many other partners, large and small, from around the world. It remains to be seen whether Amazon, Microsoft and Rackspace will be in the mix. Asked how Cisco's effort differs from VMware's, which is also building a public cloud and enhancing it with local partners, Gori pointed out that Cisco's service supports any hypervisor.
Cisco will announce more partners and deliverables at its Cisco Live! conference in San Francisco in May. Whether Microsoft will be one of those players remains to be seen, he said. "Microsoft is a very big player and is going to be part of this expanded Intercloud," he said. "We are going to do something specific around the portfolio."
Posted by Jeffrey Schwartz on 03/24/2014 at 1:29 PM
Just over a year ago, Symantec CEO Steve Bennett announced a plan to turn around the largest provider of security and data protection products. But as rivals continued to gain ground on the company, its board ran out of patience and showed him the door yesterday.
Bennett, a former GE executive and onetime CEO of Intuit, lasted less than two years as Symantec's chief after his predecessor, Enrique Salem, was also ousted. When Bennett presided over last year's two-hour analyst event dubbed Symantec 4.0, he positioned it as a reboot of the company. The reorganization focused on realigning R&D with its disparate product groups, integrating its technologies, removing silos and improving the company's lagging software subscription rates.
During the two-hour Webcast of the event, Bennett and his executive team talked about plans to move into new product areas like network security and to put in place functional technology sharing across its businesses. But according to reports, Bennett's efforts never took hold, though the company credited him with reorganizing operations and reducing costs. That wasn't enough to stem declining revenues, a dearth of new technology innovations and an executive exodus that included the company's CFO and several key business unit heads, according to a report in The New York Times.
Symantec said it has appointed board member Michael Brown as interim president and chief executive officer, effective immediately. Brown joined Symantec's board in 2005 following its $13.5 billion acquisition of Veritas. He had once served as chairman and CEO of Quantum. The company said it has hired an executive search firm to recruit a permanent CEO.
Shares in Symantec were off 12 percent Friday afternoon as investors wonder who will take the company forward and if there will be a Symantec 5.0.
Posted by Jeffrey Schwartz on 03/21/2014 at 12:50 PM
A former Microsoft employee was arrested in Seattle earlier this week after the company searched a blogger's Hotmail account and found evidence the employee was allegedly leaking information and code to the blogger, who ended up illegally selling pirated software.
Alex Kibkalo, a former Microsoft architect, is accused of stealing trade secrets and leaking Windows 8 code to an unnamed French blogger while working for Microsoft. Kibkalo, a Russian national who also worked for Microsoft in Lebanon, allegedly bragged about breaking into the Redmond campus and stealing the Microsoft Activation Server Software Development Kit, a proprietary solution aimed at preventing unauthorized distribution of the company's software and licenses, SeattlePI reported Thursday.
The move forced Microsoft to admit it had scanned a user account on its Hotmail service to obtain evidence. This comes at a time when many customers lack trust that Microsoft and others are taking sufficient measures to ensure the privacy of their information in these services. Revelations by Edward Snowden of National Security Agency (NSA) surveillance efforts, and accusations that Microsoft and others were cooperating with the NSA, have heightened those fears, despite efforts by the companies involved to assure customers that such cooperation is limited to rare instances where there are court orders.
In this case, Kibkalo made it quite easy for Microsoft to discover his alleged acts. One must wonder why he or the blogger would use the company's e-mail service to communicate. Putting that aside, Microsoft accessed the e-mails without a court order because, apparently, the company legally didn't need one to search its own service. But it did obtain court orders for other aspects of the investigation, said Microsoft Deputy General Counsel John Frank in a blog post published last night.
Frank justified Microsoft's decision to access the e-mails in its Hotmail service and it appears Microsoft didn't violate any laws or its own policies, though some question the wisdom of its actions. "We took extraordinary actions based on the specific circumstances," Frank said. "We received information that indicated an employee was providing stolen intellectual property [IP], including code relating to our activation process, to a third party who, in turn, had a history of trafficking for profit in this type of material. In order to protect our customers and the security and integrity of our products, we conducted an investigation over many months with law enforcement agencies in multiple countries. This included the issuance of a court order for the search of a home relating to evidence of the criminal acts involved. The investigation repeatedly identified clear evidence that the third party involved intended to sell Microsoft IP and had done so in the past."
Likely anticipating customers and privacy advocates might be unnerved by the fact that it dipped into its own servers despite the probable cause of the alleged criminal activity, Frank said Microsoft is stepping up its policies for the way it handles such discovery in the future. "While our actions were within our policies and applicable law in this previous case, we understand the concerns that people have," he said.
Moving forward, he said, Microsoft will not search customer e-mail or other services unless there's evidence of a crime that would justify a court order. In addition, Microsoft will rely on a former judge to determine whether the probable cause would justify a court order, and even in those instances, searches would be limited to information tied to the suspected activity, not other data, and would be supervised by counsel.
To ensure transparency, Microsoft will publish whatever searches it has conducted as part of its biannual transparency reports, he said. "The privacy of our customers is incredibly important to us," he said. "That is why we are building on our current practices and adding to them to further strengthen our processes and increase transparency."
Will appointing a judge to evaluate the merits of the case be enough to settle your concerns that the company won't be looking at your data? Leave your comments below or e-mail me directly.
Posted by Jeffrey Schwartz on 03/21/2014 at 3:11 PM