Let's say you own a car and you decide to take it to a new mechanic for its annual tuneup. Your new mechanic doesn't actually have any formal training in automotive repair or maintenance, but before he got the job he spent a lot of time playing Grand Theft Auto on his Xbox. He inherited some tools from the guy before him, so he opens the hood of your car and gets to work. You're later appalled to find that he's rewired your car to a small electric motor and replaced the steering wheel with a two-handed controller featuring X, Y, A, and B buttons.
Stupid, right? You'd probably never do that. You probably take your car to the dealer, which certifies its mechanics with the manufacturer, or at least go to a mechanic that's ASE-certified, meaning he's been to a specified number of classes and passed some tests.
Kinda crazy that you run your network differently than your car, isn't it?
Yet that's exactly what you do. Microsoft has finally started meeting customer demands -- mostly from large enterprise customers, to be sure -- to provide better administrative and operational automation. They're rolling out PowerShell support for nearly everything in the datacenter, giving administrators unprecedented access to, and control of, the technologies they manage.
And how many of your administrators know how to use that power? How many have you sent to training on it? I realize that Microsoft -- for reasons which I suppose I grasp, yet do not agree with -- hasn't released an official PowerShell certification, but it's certainly released classes. There are dozens of books on the market, including the ones I've written. There are self-paced training videos, too, like the ones I've done for CBT Nuggets.
But administrators clearly aren't getting that education. I know they're not, because every day I help answer forum questions from guys and gals who are trying their damnedest to do a good job, and who just haven't been given any kind of grounding in basic PowerShell skills. They're making newbie mistakes, and they're painfully aware that they're doing so -- they even apologize for being noobs when they ask the questions. They're learning from people's blog posts (if it's on the Internet, it must be accurate, right?), from Q&A forums and from magazine articles. They're desperately trying to educate themselves for you -- but this isn't like changing a tire on a car. PowerShell is a complex technology with many unobvious bits that you simply won't pick up on your own.
If your admins are like most of the ones I know, they're go-getters. They know PowerShell is the future, and it's becoming increasingly mandatory. Try doing much of anything in Azure, or Exchange Server, or even Office 365 without using PowerShell and you'll see what I mean. So they put their nose to the grindstone, flatten it out nicely and learn what they can.
And they're learning on your production network. In many cases they're adopting awful habits, and creating an infrastructure full of hard-to-maintain patterns, without even realizing it. They simply don't know any better. They're doing the best with what they've got -- which is virtually nothing.
Yeah, when it comes to a lot of Microsoft server products, you can often get by for a long time with pretty minimal training. After all, it's just point-and-click, right? But PowerShell is a form of software development. There are no wizards, and it's incredibly freeform. That means the shell offers a ton of power, a metric ton of flexibility and a metric butt-ton of room to go wrong.
And believe you me, you've got admins going wrong. I'm seeing it every day. There's a big community of folks trying to set everyone straight, but unless they find us (I'm at PowerShell.org, for what it's worth), they're just scrambling around on their own. That's not good for you.
The moral of the story, of course, is to get your people trained. Find your best and brightest, the ones who can become your Automation Makers, and send them to a quality class taught by an awesome instructor. Buy them a good book, for pity's sake, or get a subscription to a video training library. You don't need to train the entire team; the enterprise pattern is shaping up as having a relatively small number of bright, adaptable IT folks taking on the role of PowerShell Wrangler. They use their shell sk1llz to create easier-to-use units of automation for the rest of the team; they become your Toolmakers. They're the ones who attend specialized events like the PowerShell Summit to hone and deepen their skills. They're the ones you rely on to make everyone else more efficient, and to manage the PowerShell beast in your environment. Give them enough training, and you can measure the man-hours they save you. Yeah, it's return on investment time, in a big way.
But for the love of your entire IT infrastructure, don't make them do it blind. Please, get them some education, or you're going to find yourself in a sorry state in an amazingly short period of time. PowerShell isn't hard, but it is complex, and it offers a lot of opportunity to go down the wrong path.
Get your team on the right path. Get them trained. Please.
Posted by Don Jones on 10/29/2013 at 11:28 AM
Ten years ago, sets of magic three- and four-letter acronyms were the golden ticket to a promotion, a better IT job or even your first IT job. MCSE. CCNA. CNE. MCSA. OMG! Today, many organizations' HR departments still rely on these TLAs and FLAs as a sort of filter, meaning resumes lacking these special character sequences don't even end up in front of hiring managers or department heads.
It's a pity.
We all know that the certifications associated with most (if not all) of these acronyms were, at best, a pretty minimal indicator of technical expertise. "Paper MCSE" became common enough to earn a place in many urban dictionaries, spawn hundreds of online and magazine articles, and generally put a cloud of derision over that particular title.
Today, Microsoft's made a lot of attempts to restore some glamour to its top-end IT Pro title (which was, until recently, not the top end, of course; the retirement of the company's "Master" certifications brought MCSE back to the limelight). Whether they've been successful really doesn't matter, or shouldn't.
Remember that the whole point of those titles was for Microsoft to demonstrate to companies that the world housed plenty of people qualified to support Microsoft's products. Ergo, it's safe to buy Microsoft products, because you'll be able to easily find people to run them. That, of course, means Microsoft really never had a huge stake in making the MCSE rigorous -- it just needed lots of us to jump through hoops. Today, that whole business driver seems a lot less relevant. Plenty of companies already bought Microsoft software, after all, and any company that doesn't use any Microsoft software sure as heck isn't going to be swayed by the existence of a lot of MCSEs, paper or otherwise.
I'll argue, then, that HR should drop or de-emphasize acronyms on their list of low-level resume filters. Hiring managers should give those acronyms less weight. And IT pros should perhaps worry less about including them in the first place. Sure, list 'em if you've got 'em, but I've got something better.
"Hi, my name is Don, and in my last job I eliminated an average of 4,000 man-hours of manual IT labor annually." Follow that with your education history and hobbies or whatever, and you've got a compelling resume whose most important bullet point would fit on a business card: reduced manual man-hours.
Regardless of your gender, each carbon-based lifeform in an IT department represents up to 2,400 man-hours annually; less in countries and organizations with more-generous paid-vacation policies. If you're good enough at automation to reduce manual man-hours by any significant chunk of that, then you're a major asset to any IT team, regardless of the three- and four-letter designations you may or may not have.
"I can, through my powers of automation, free up or replace two human beings per year" is another way of saying, "I saved 4,000 man-hours annually." That's a huge deal for IT organizations strapped for resources and unable to expand their ranks. That's people to go work on new projects. It's also an almost ironclad guarantee that if layoffs come around, your name won't be on the list, you person-reducing person, you.
So how do you effect this change? How do you document it for your resume?
Dig into your help desk ticketing system, and do some analysis on time-to-close. Find tasks that get repeated fairly often, and figure out how many man-hours are involved in closing those tasks per year -- pretty basic information from a decent ticketing system. Those tasks, and their man-hours, are your target.
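As a rough sketch of that analysis, you could export tickets to a flat file and total up time-to-close per task category. The field names and numbers below are hypothetical, not from any particular ticketing product; the point is simply that the tasks eating the most repeated hours are your best automation targets.

```python
from collections import defaultdict

# Hypothetical ticket records: (task_category, minutes_to_close).
# In practice you'd export these from your ticketing system.
tickets = [
    ("password-reset", 15), ("password-reset", 20), ("password-reset", 10),
    ("new-user-setup", 120), ("new-user-setup", 90),
    ("disk-cleanup", 45),
]

def annual_hours_by_task(tickets):
    """Total man-hours spent per task category across the export period."""
    minutes = defaultdict(int)
    for task, mins in tickets:
        minutes[task] += mins
    return {task: m / 60 for task, m in minutes.items()}

hours = annual_hours_by_task(tickets)

# Rank tasks by hours consumed; the top entries are automation candidates.
targets = sorted(hours.items(), key=lambda kv: kv[1], reverse=True)
print(targets)
```

Whatever hours you shave off the top of that list are the numbers that go on the resume.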
Your weapons are PowerShell. System Center Orchestrator. VBScript. Batch files. C#. Whatever. It truly doesn't matter what tools you use to automate -- although certain ones will obviously be more suitable for some tasks than for others, which is why I've always cultivated in myself a fondness for many technologies. Like an action movie hero carrying knives, ninja stars, a 9mm handgun and a grenade launcher, I like to be prepared for a variety of automation situations. In the end, it's not about the tool I use -- it's about the hours I save.
When you're an automator, everything is a button, or ought to be. I go looking for tasks where my fellow admins spend hours, repeating the same sequence of actions over and over in a GUI or whatever. I then create a button -- some automation unit, like a PowerShell script or an Orchestrator runbook -- that accomplishes the same task automatically, with as little intervention as possible. Hellz yes, I'll jury-rig when I have to. The goal is saving hours. Then I'll document that task -- even if only on my resume -- as hours I've saved.
When the time comes to argue about raises, or a new job, or a promotion, or whatever -- I've got my ammunition. Don't think I need more than a 2 percent raise after I made two of my team members superfluous? Maybe I should float my sk1llz around the other companies in town and see if any of them have more value for an automator.
Remember, back in the good ol' MCSE days, when you could change jobs and get a 20 percent boost in salary? You'd be surprised how many organizations will still offer that kind of bump -- just not for four letters. For man-hours.
Posted by Don Jones on 10/14/2013 at 11:00 AM
For months now, I've been bemoaning -- to pretty much anyone who'll listen to me -- Microsoft's mobile device strategy, or seeming lack thereof.
When Windows Phone 7 was announced, I thought, "Aha! This is how Microsoft's going to compete! They'll leverage their deep relationship with business and produce a mobile phone that's cutting edge and manageable, unlike everything Apple and Google have thrown at us!" I note that recent Samsung devices are an exception; Sammy's been getting enterprise-savvy in recent months.
When Surface was announced, I thought "Aha! This is how Microsoft's going to compete! They'll leverage their deep relationship with business and produce a tablet that's cutting edge and manageable, unlike..."
Yeah, not so much. Microsoft seems so "et up" with Apple Envy that Windows Phone and Surface RT both turned out to be almost purely consumer plays (Surface Pro doesn't count; it's not a tablet, it's an Ultrabook with a removable keyboard, which is fine). Nothing in Windows RT or Windows Phone really pointed to proper enterprise manageability. No Group Policy love. No real anything love. Ugh. I keep telling people that I wish Microsoft would spend a bit less time worrying about the phone my Mom buys, and ship a phone my CIO could love. Fewer organizations would feel the need to cave to BYOD if a viable corporate-friendly alternative were available.
Or maybe BYOD is inevitable. Certainly, fewer organizations are paying for those expensive smartphones and their data plans, now that BYOD is rampant. "Heck, if users are willing to pay for these things and use them to check work e-mail... um, OK." But BYOD still has massive downsides, and Microsoft's tablet and phone folks just didn't seem to be attacking the problem.
Leave it to the System Center team, specifically the System Center Configuration Manager (SCCM, although I'm told I'm supposed to call it ConfigMgr these days) team. With SCCM (I'm old-school) 2012 R2, these folks have come up with a brilliant solution that recognizes not only the importance of MDM, but the stark reality of BYOD. They're rolling out a "Company Portal" app, which users can download from their device's respective app store, and use to enroll in their organization's infrastructure. SCCM will understand the difference between a BYOD situation and a company-owned device (you tell it which situation a device is in), and offer appropriate data and manageability. For example, company-owned devices can be more deeply managed and completely wiped; BYOD devices can have company-deployed apps and data removed, but that's all. Once a device is enrolled, you get inventory, app deployment, and even a great degree of configuration enforcement through SCCM's configuration auditing feature set. The Company Portal app, along with native device features, essentially acts as a local SCCM client.
The Company Portal app also provides an "in" for sideloading enterprise apps without going through the device's native app store. Typically, the Portal app accepts the organization's deployment certificate, which would need to be obtained from Apple or Google or whoever, which enables sideloaded apps to execute on the device. It's a lot like the TestFlight app for iOS, which allows developers to recruit their friends, and "push" app builds to them, bypassing the store during testing phases. That means organizations can offer mobile apps to users -- whether those apps were developed in-house or brought in from a vendor -- and drop the apps directly on the device, bypassing the device's store. Those apps can similarly be wiped -- along with their data -- on demand.
Note that all of the MDM features of SCCM are actually conducted through an Intune subscription; Intune does the management, and integrates with SCCM for a simpler and more centralized administrative experience. It's another clear sign that Microsoft's future consists of a Cloud+Local=Hybrid environment.
For me, this is just one more example of how Microsoft's back-end business units really "get it." Buying Azure? They're happy to have you run a LAMP stack in Azure... you're paying for Azure, and that was their goal. Standardized on iOS, or just inundated by it? SCCM is happy to manage it... 'cuz you bought SCCM, and that was the goal. It's as if Microsoft -- or at least that back-end portion of the company -- has said, "so long as we own the back-end, we don't really care what the front-end is doing, so we're going to be as embracing of the different front-end ecosystems as possible."
Of course, it's a journey. The current MDM capabilities from Intune and SCCM aren't 100 percent complete, but they're pretty darned impressive. Each new release (Intune is on a quarterly rev cycle, like many of Microsoft's cloud offerings) brings new capabilities to Windows RT, iOS, and Android, along with Windows Phone. "Do whatever you want on the front-end," Microsoft is saying. "Our business unit makes money when we're managing it all."
Posted by Don Jones on 09/12/2013 at 2:36 PM
Back in the day, I subscribed to the very first Microsoft TechNet offering, which consisted primarily of a CD-based offline Microsoft Knowledge Base. It was tremendously helpful, and only cost a couple hundred bucks. Until recently, you could pay not much more to have access to non-expiring evaluation software of Microsoft's major business products, with the Knowledge Base being better served from the Internet. Sadly, TechNet Subscription's days have come to an end, a move many see as evidence of Microsoft's increasing detachment from the "IT Pro" side of their audience. So what replaces TechNet?
First up is Microsoft's official answer: free-to-download product evaluations and "test drive" VHDs. Unfortunately, these expire, so while they could certainly be useful for evaluating software that you might purchase, they're not much good beyond that. Even as evaluation software, many have commented that more complex trials can simply take longer than the allotted time. Trying to do a comprehensive eval of the major System Center products can take months just to set up properly, before you even get into the testing and actual evaluation stage.
Beyond those free eval downloads, your options depend a bit on what TechNet was doing for you. Reading the comments in a recent article I wrote about TechNet, it's clear that my perspective comes from using TechNet in large enterprise-scale environments, but that a myriad of other uses exist.
Let's start with that perspective of mine. Unfortunately, I definitely worked with customers whose companies purchased a TechNet subscription, and then used it to set up permanent lab environments that were used by multiple IT team members. These environments weren't often used to evaluate Microsoft software, but instead were used to stand up a small-scale mirror of the production environment so the organization could test in-house software, deployment scenarios and so on. That use seems to be a pretty clear violation of at least the letter of the TechNet license, if not its spirit, and those organizations will have to pony up more money going forward to have that same lab environment.
MSDN is one option, with the cheapest "MSDN Operating Systems" package at $700 for the first year and $500/year thereafter. You don't get server products with that, though, only Windows itself; to get all the server products like Exchange and SharePoint, you'll be spending at least $6,200 for the first year and $2,600 thereafter. But the MSDN license is still for a single user -- it really isn't intended to be used in a shared lab environment. These companies could, I suppose, buy an MSDN subscription for each team member, but that's going to add up. Still, it won't add up as fast as paying for production licenses for all of those products, which is the other option. Some of the organizations I work with are looking at the idea of negotiating "lab licenses" as part of their overall licensing agreements with Microsoft, to get those server products at a lower, non-production cost.
Another perspective is that of the very small -- even one-person -- consultancy that needs to build out simulated customer environments to diagnose issues, test new techniques, and so on. That's a use explicitly allowed by MSDN, but at close to 10 times the price of TechNet, it's been a bitter pill for these small businesses to swallow.
CloudShare.com offers a possible alternative, although the financials aren't necessarily more compelling. For about $500, you can get an online "environment" that contains up to 8 GB of RAM and 10 vCPUs to be shared between a set of cloud-hosted virtual machines. Those VMs can be built from templates that include the OS and software licensing, and products like SharePoint are amongst the templates offered. VMs run for only an hour or two unattended, then they suspend automatically -- making them useful for a true lab, but not for production. Upgrade fees get you more RAM, more CPU, "always on" capability, and more -- and those add-ons add up quickly. For a small consultancy looking to build one environment per customer, it might be possible to justify the price... but it's certainly a compromise situation. Larger environments can get incredibly expensive (think $6k/year and up), though -- and it certainly isn't a stretch to think that larger environments would be needed.
What about Azure? Unfortunately, Azure's pricing model is intended for production uses, and it's an expensive lab option. Its pricing also doesn't include server software aside from the OS itself, so you're really just shifting the hypervisor to the cloud, not creating an alternative to the software that was licensed as part of TechNet. I could certainly see Microsoft positioning some kind of "Azure Lab" product, providing VM templates that include software and perhaps doing something to restrict run-time or incoming connections to tie the lab to non-production uses, but there's currently no such offering. And, as many have pointed out, cloud-based labs aren't going to meet the need in every situation.
A third perspective is that of the hobbyist, student, and self-learner: the folks who build a "basement lab" in their home, and who were in most cases paying for TechNet out of their own pocket. These folks might be able to get by with an option like CloudShare; it depends a bit on what-all they were testing. Certainly, they can't build out complex networks, which in many cases was part of the point. CloudShare and options like it don't offer the full range of MS products, either -- you won't be doing System Center experiments, for example. Like the smaller-sized consultancy, these folks are sadly screwed, and it's a shame, because they're also some of Microsoft's biggest fans. It'd be nice to see some sort of "spark" program from Microsoft that provided software licenses to these people, at a low cost, for their own self-learning purposes... but that's pretty much what TechNet was, and the company's made its interest in that audience pretty clear.
A fourth use-case is the small business that maintains a small lab environment, but that isn't engaged with Microsoft at the Enterprise Agreement (EA) level. In other words, they don't have the negotiation option where they can try to wrangle some lab-use licenses from the company as part of a larger licensing scheme. MSDN pricing for even a small team of IT folks still seems cost-prohibitive; you could end up spending as much on MSDN licensing as you do on healthcare for those same employees. Now, I do believe that labs provide a business benefit in terms of increased stability, consistency, and reduced headache, and I do believe businesses should expect to pay money for those benefits. It's called Return on Investment, right? A 10-person team could have used TechNet for about $3,500 a year; MSDN would be more than seven times as much money. It does seem as if some middle ground should exist. Again, these folks seem a bit left out in the cold, or at least left in some kind of "gap" in Microsoft's strategy.
So what could the company do?
We've actually got some glimmerings of precedent here. SQL Server, for example, allows you to run a free "hot spare" instance of the product in scenarios like log shipping. Provided the hot spare server doesn't do any production work, and is only used for failover, you don't have to buy it a separate SQL Server license. There's no enforcement of that licensing; it's something Microsoft would catch in an audit, and it by and large trusts its customers to comply with licensing terms. So let's pull that thread a bit. What if, for example, your Software Assurance (SA) agreement allowed you to mirror the covered software in a non-production "mirror" environment? In other words, what if every SA-covered piece of software included the right to use that software once in production, and once in a production-mirror lab environment? For many organizations of varying sizes, that'd solve the problem instantly. It'd add real value to the sometimes-sketchy SA proposition, too, which would be good for Microsoft. It wouldn't require any extra infrastructure on Microsoft's part. It isn't a complete solution; evaluating new Microsoft software would still be dependent upon 180-day trials, but it would solve a big chunk of the problem for organizations using SA.
Of course, that isn't a one-size-fits-all solution. Your basement hobbyist, your smaller consultancies, and the significant number of businesses who don't use SA would still be left out. Microsoft might manage to get more businesses onto SA if lab licensing was an included benefit, but it still wouldn't be everyone -- not everyone can afford the SA surcharge of 25 percent of the product price per year, and smaller businesses would be impacted the most by that surcharge.
Frankly, Microsoft's not offering a lot of answers. Most folks fully agree that maintaining a lab environment of any size should cost something; nobody's asking for free software. Microsoft's official answers so far -- free evals -- don't support a permanent lab, which organizations clearly need. Microsoft seems to be saying customers either have to pony up for MSDN on a per-IT-user basis ($$$), or outright buy full software licenses ($$$$), neither of which is exciting its customers or its community. Cloud-based options only meet a very small subset of lab needs, and it seems as if Microsoft is either unaware of, or uncaring of, the wide array of things TechNet Subscription was being used for.
Honestly, I could actually buy the idea that the company just didn't know how customers were using TechNet. As I wrote in the beginning of this article, my perspective is coming from large enterprise, and those organizations can better afford to build out labs and negotiate licensing to some degree. It's certainly possible that Microsoft didn't consider the small consultancy, small businesses, or even hobbyists and students. Of course, by this point, the hue and cry should have made it apparent that those scenarios do indeed exist, and Microsoft hasn't given any indication that it cares.
It's a frankly confounding situation. There's a lot of anger from Microsoft's community over the entire issue, in part because it just makes so little sense. The uses to which most people put their TechNet Subscription -- entirely legit uses -- benefit Microsoft in a number of ways. The company's answer -- time-bombed trials -- doesn't satisfy many of those uses. The company could have met those uses by providing things like connection-limited trials, as opposed to time-bombed trials; both would prevent production use, but connection-limited software will still support a permanent lab or learning environment. But... Microsoft has chosen not to go that route, at least so far.
Much of the anger around the TechNet Subscription is not, I feel, about the discontinuation. It's the fact that Microsoft seems to have turned a blind eye to such a large and diverse portion of its customer base, taking something away and offering nearly nothing in return. For folks who relied on TechNet, it's entirely analogous to Microsoft simply discontinuing, say, Exchange Server, and suggesting that everyone hop on Outlook.com instead. The anger comes from bafflement that the company would so completely disregard its own customers.
I'd like to say that it'll be interesting to see how this plays out -- but I suspect it won't have a satisfying conclusion. While I'll entirely admit that my own feelings about TechNet Subscription weren't strong, mainly due to the nature of the environments in which I work, I'll also admit that I'd not considered a huge number of use cases where TechNet made all kinds of sense. Microsoft's sin here is, it seems, in not making that same admission about itself. The first step in solving a problem is admitting you have one, and Microsoft doesn't, to this point, seem interested in taking that step.
Posted by Don Jones on 09/03/2013 at 11:42 AM
Probably every Microsoft IT Pro in the world knows that the company has discontinued its paid TechNet subscription program, which offered full-version, non-expiring Microsoft software for evaluation purposes only.
The vitriol over this issue is amazing. I'm impressed that IT pros are getting so involved and vocal about what they see as a major snafu.
Let me briefly quote a few passages from the now-defunct Subscription License Agreement:
"Only one person may use or access a single subscription or any subscription benefits;" if the company subscribed, then the company "must assign the subscription to only one person, and only that person may use or access the subscription or any subscription benefits." In other words, the software you got through TechNet was for one human being only to use.
"You may install and use the software on your devices only to evaluate software." Now, they give us expiring eval software for free, and I don't know why an eval would take longer than the extremely long eval period (typically 180 days), and I don't know why you'd want to pay for something that is given away for free. If you do enjoy paying for stuff that's already free, I happily accept checks for your use of Google.com.
"You may not use the software in a live operating environment, in a staging environment...." In other words, you're explicitly not allowed to use TechNet software to build out a permanent test lab. You need a test lab? You gotta pay for it. It's cheaper than production because you don't have to buy CALs, since you have no clients in the lab, but you gotta pay for the software in a permanent lab environment. From the few conversations I've had, this is the big thing people are going to miss. Because that's what they were using it for.
"You may not use the software for application development." That's what MSDN is for.
"You may not share, transfer...or assign your subscription." It's yours and yours alone. So if you set up some TechNet software and you and your colleagues worked with it... well, that's not what you paid for.
Frankly, I think Microsoft probably took this move simply because of rampant license abuse.
But we need a lab environment! Of course you do. Obvious statement. TechNet wasn't it, though. If you're a big enough organization to have a lab environment, you're probably on an Enterprise Agreement or some other contract. Negotiate free lab licenses from Microsoft. Ask for like a 1 percent overage (or whatever your number is) to populate your lab, so you can test and use software on a continuing basis.
I know. Labs take a long time to set up. But you weren't given TechNet for labs. You were explicitly not supposed to use it in labs. It was for evaluating software that you don't own, to see if you want to buy it. That was it. If you can't do that in the 180 days provided, your boss didn't plan the project out very well. And the "they don't provide evals for past products" complaint is a bit silly. Why would you buy a past product? If you want a lab to run Windows 7... it's probably because you already own Windows 7. You're not evaluating it at that point. You bought it.
Yeah, I know folks took forever to catch up to Windows 7, and now Windows 8.1 is on the horizon. I actually sympathize with that point. A lot. But I imagine Microsoft doesn't. And for that matter, guys, we're talking about Windows 7. What's a license of that going to cost you so that you can spin up an eval environment? 180 bucks? C'mon. If you know you're going to deploy it and you're looking to build a test/pilot lab... that's not evaluating. Microsoft wants you to pay for those licenses, and it is Microsoft's software. So negotiate with it. "Hey, we'll deploy this, but we want 10 free copies [or whatever] to play with."
Now, MSDN is an alternative. A mega-pricey one. Microsoft could absolutely produce a "Lab License Pack" or something for IT pros. Maybe they should. Maybe they're entitled to full pricing for their software. I honestly don't know on this point. I'd like to see MS enabling labs, because it's a smart business practice I feel they should encourage. That said, labs bring significant benefit to business in terms of stability and reliability -- and those two things typically cost money.
Azure? Maybe. It's certainly a stopping point between "free" and "full price." But not everything is available in Azure. Yeah, you could build VMs, sure, but you still have to acquire a valid license for everything but the OS. I can't imagine building a client OS deployment lab in Azure.
I don't think the problem here is IT pros themselves. Y'all aren't crazy, and neither am I -- and neither is Microsoft. This whole issue is mainly a disconnect, I think. You need long-term lab/test/pilot environments. You believe the company is taking away the resource you used to build that environment. They're not. They never gave you that resource -- TechNet was for evaluations, not testing. Not piloting. Not labs. It was for test drives, and as we all know, you don't get to keep the car when you test-drive it. You gotta buy it.
So: You continue to have an unmet need. Of course you need a lab/test/pilot environment. You need to try a ton of scenarios, and it takes longer than 180 days. You've already bought the software -- Microsoft is telling you that you have to buy it again to build your sandbox.
You want to pay nothing for the lab (me, too). Microsoft wants full price. There must be some middle ground.
What is it? Negotiating lab licensing into EA (or whatever) contracts? Cheaper versions of the software for lab use (though how would you keep those from creeping into production)? Free, non-expiring, "limited" lab editions (say, capped at 10 connections)?
What would work for you that you think you could convince Microsoft to offer?
(And please, I know this is a charged topic -- but polite, professional, realistic responses are best here -- I do plan to collect those and forward them up the chain of command I have access to within Microsoft. Help me make your argument.)
Posted by Don Jones on 08/29/2013 at 11:46 AM
There's been much ado about Microsoft's cancellation of TechNet subscriptions. Officially, the company says it's already giving you those eval installs for free, so why charge you for the service? Unofficially, we all know we're annoyed because the non-expiring TechNet subs were the basis for our persistent lab environments… even though that use was, ahem, technically against the subscription license. Er.
Pointing IT pros to MSDN isn't the answer, at least not with the current MSDN packaging and pricing. It's more than most of them need, and it's priced accordingly.
A better solution would be to see Microsoft formally embrace labs, and I think we as customers should press them to do this. It'd actually be simple.
Make a "Windows Server Lab Edition" for each version of Windows you ship. Make it the same as Datacenter, but hard-limited to some small number of inbound network connections -- thus inhibiting its use as a production server. Then have the various other product teams (SQL Server, SharePoint, Exchange, etc.) make similar "Lab Editions" that simply won't install on anything but the Lab OS. Charge a few hundred bucks a year (a la TechNet) for permission to use these non-expiring Lab Editions, and be done with it.
Helping us inexpensively build persistent lab environments helps Microsoft, because it helps us deploy new software more reliably and more quickly -- because we can test it.
Of course, for a few hundred bucks a year you can get the cheapest MSDN subscription, which gives you "non-lab," non-expiring server software for use in a test or dev environment. Maybe that's the answer, after all.
Posted by Don Jones on 08/20/2013 at 2:16 PM
By the time you read this, the wraps will be off of PowerShell 4 and its signature new feature, Desired State Configuration (DSC). The shell will ship first with Windows Server 2012 R2 and Windows 8.1, and will be part of the Windows Management Framework (WMF) 4.0. We can expect WMF 4.0 to also be available for Windows Server 2012 and Windows 8; I also expect it'll ship for Windows 7 and Windows Server 2008 R2. We should know by the beginning of July whether that expectation holds true, and whether other, older versions of Windows will be supported (my personal bet: no).
So what is DSC? In short: disruptive.
With DSC, you use PowerShell to write very simple (or complex, if you like) declarative "scripts." "Script" isn't really the proper word, as there's no actual programming. You're really just building an exaggerated INI-esque file, where you specify configuration items that must or must not be present on a computer. You also specify specific "DSC resources" -- a special kind of PowerShell module -- that correspond to your configuration items. For example, if you want to ensure that IIS is installed on a computer, then there has to be a corresponding resource that knows how to check for its existence, and how to install or uninstall it. Microsoft will ship some basic resources with WMF 4.0, and because this is now a core OS feature, we can expect other product teams to follow suit in upcoming cycles.
PowerShell compiles your "script" to a MOF file, and you then ship that off to your computers. You can do that over PowerShell Remoting, via Group Policy, or however else you normally deploy software. On those computers, PowerShell 4 again kicks in to "run" the MOF. It checks your settings and makes sure the local computer is configured as desired (hence the name of the feature). That check re-runs every 15 minutes. You can also configure your computers to check a central URI for their declarative configurations, meaning they can periodically (every half-hour, by default) check in to see if there's been an update. In this "pull" model, you can also centrally locate DSC resources, and your computers can pull down the ones they need on demand, so you don't have to manually deploy those resources before relying on them in a configuration.
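To make that concrete, here's a minimal sketch of such a declarative "script" and the commands that turn it into a MOF and push it out. The configuration and node names are illustrative, not anything Microsoft ships:

```powershell
# A minimal DSC configuration (PowerShell 4). "WebServerBaseline" and
# "SERVER01" are illustrative names.
Configuration WebServerBaseline {
    Node 'SERVER01' {
        # Declare desired state: IIS must be installed. The built-in
        # WindowsFeature resource knows how to check for it and install it.
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Running the configuration compiles it to .\WebServerBaseline\SERVER01.mof
WebServerBaseline

# Push the MOF to the target node over PowerShell Remoting
Start-DscConfiguration -Path .\WebServerBaseline -Wait -Verbose
```

Note there's no "how" anywhere in the configuration block -- just the "what." The WindowsFeature resource carries all the procedural logic.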
Where do I personally see this going? For one, replacing the anemic "configuration auditing" feature in System Center Configuration Manager (SCCM). That's a pretty sure bet, although it's hard to tell when it'll happen. Frankly, I can see this supplementing Group Policy objects (GPOs), if not outright supplanting them in time. After all, a GPO isn't a lot more than a bunch of registry writes, something DSC is already well-equipped to handle. Your DSC scripts are also a lot easier to version-control, back up, and so forth.
What's exciting is how extensible DSC is. Those under-the-hood "resources" are actually just PowerShell modules that internally conform to a specific naming pattern. That means anyone can write one of these things. You could write resources that deal with your internal line-of-business applications, without needing to wait for an application vendor or developer to provide the resource to you. Anything you can do in a PowerShell script can be done in one of those DSC resources. With DSC in place, you'd configure a computer by simply pointing it to the right declarative file. "Hey, you're a Web server -- here's what you should look like. Get to it, and stay that way." Reconfiguring is as easy as changing your script: "Hey, all you domain controllers. I want you to look like this now. Go." The machines handle the reconfiguration on their own.
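To give a sense of how low the bar is, here's a skeleton of such a resource module -- purely hypothetical, with a made-up line-of-business app tracked by a folder path and installed by a made-up installer script. A resource is just a module exporting three functions with these exact names:

```powershell
# Hypothetical DSC resource for an internal LOB app. The paths and the
# installer command are invented for illustration.

function Get-TargetResource {
    param([Parameter(Mandatory)][string]$Name)
    # Report the current state of the resource as a hashtable
    $present = Test-Path "C:\Apps\$Name"
    @{ Name = $Name; Ensure = if ($present) { 'Present' } else { 'Absent' } }
}

function Test-TargetResource {
    param([Parameter(Mandatory)][string]$Name,
          [string]$Ensure = 'Present')
    # Return $true only if the machine already matches the desired state
    (Test-Path "C:\Apps\$Name") -eq ($Ensure -eq 'Present')
}

function Set-TargetResource {
    param([Parameter(Mandatory)][string]$Name,
          [string]$Ensure = 'Present')
    # Bring the machine into compliance: install or remove the app
    if ($Ensure -eq 'Present') {
        & '\\deployserver\share\install.cmd' $Name   # hypothetical installer
    } else {
        Remove-Item "C:\Apps\$Name" -Recurse -Force
    }
}
```

The local configuration engine calls Test-TargetResource on each pass and invokes Set-TargetResource only when the test fails -- which is exactly why anything you can express in a PowerShell script can become a managed configuration item.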
It's an incredible feature that'll take some time to become fully fleshed-out -- but this is clearly the "policy-based dynamic systems management" that Microsoft has been alluding to for years, and thus far failing to deliver. Now, the framework is in place.
Posted by Don Jones on 06/10/2013 at 1:14 PM
Things are changing in the Microsoft IT world. It's happening slowly, but it's happening. We've reached an inflection point, or are reaching it soon – and whether or not today's IT administrators continue to have a job (or at least, the same job) is very much in question.
Why Microsoft IT Admins Are Losing Their Jobs
How happy would you be if, every time you needed a light turned on in your house, you had to call the power company and have them send a tech out to rig up the wiring? Not happy. But you don't do that, because that isn't how utilities work. They're largely automated.
And IT needs to become that kind of utility. Not because we should (although of course we should), but because the business is going to demand it. Large competitive companies are already doing it, and they're loving it. IT is a utility for them. When they want to roll out some new business service -- a Web site, a VM, whatever -- they no longer have to account for the IT overhead needed to implement the service. They've already invested the overhead in automating it all. They know that, once they decide to go ahead, they'll just flip a switch, a miracle will occur, and the new service will be online. It'll be backed up, monitored, managed, patched, and everything -- automatically.
You see, the tools all exist. A lot of Microsoft IT admins just haven't been paying attention. PowerShell's here. System Center is doing this stuff already. OS-level features are supporting this kind of automation. The VMware folks have got a lot of the same automation in place. This is all possible, not some vision of the future.
And what the big companies do successfully, the smaller companies will eventually see and want. So if you're an IT pro who's used to Remote Desktop, wizards, and doing stuff manually each time it needs to be done, you're going to be out of a job. The good news is that you can have another, better-paying job -- the job of creating the automated processes your organization needs.
The trick here is that a given organization needs markedly fewer Automators and Toolmakers than it needs button-clickers. So not everyone in your IT team is going to be needed. We're already seeing a significant falloff in mid-tier IT jobs, and a slight falloff in entry-level jobs. Just like every other part of the IT world, ever, Microsoft IT is consolidating into fewer, higher-end, higher-paying jobs.
So if you think PowerShell, System Center, and related tools and technologies "aren't my job" as a Microsoft IT admin... well, I'd like fries with that, please.
How You Become the New Microsoft IT Admin
Even Microsoft's certifications reflect the fact that Microsoft is marching toward this new inflection point. The new Server 2012 MCSE requires basic System Center knowledge, and the new "MCSE Private Cloud" certification incorporates nearly the whole darn System Center family.
Could you bet against Microsoft being successful in this push of theirs? Sure. Heck, I live in Las Vegas, we'll bet on anything. But you're on the losing side of this one if you think your IT job will always be the same. The economic pressures for Microsoft's direction are too high. This isn't a direction Microsoft chose, it's a direction forced upon them by today's business realities. We need to do more and more and more, with less and less and less, and automation is the way to do it. Companies are realizing they'd rather pay a few people a lot of money to make everything automated, than pay a lot of people less money to do it all manually. Is that fair to all the Microsoft IT folks who joined up thinking they wouldn't have to be programmers and toolmakers? Nope. But life's not fair.
Rather than betting against Microsoft on this, and getting crushed under the coming wave, get in front of it. Actually, you're too late to get in front -- but you can at least pull ahead a bit. Start looking at every process that your IT team does more than once, and figure out how you'd automate it. You're going to need to know all about System Center's capabilities -- all of System Center. You'll need to learn PowerShell, and it's going to take a serious effort, not just reading people's blogs and piecing together commands. You're going to have to learn to research, to find automation tools and technologies that fill whatever gaps you find.
And your employer may not pay for all of the training you'll need -- but it's an investment in you. Get your employer to at least put automation on your annual goals -- "automate 10 person-hours worth of manual labor each quarter" or something. Something you can measure, something that forces them to invest, since they're getting the lion's share of the eventual return on that investment. Commit to doing those things, so that you're the toolmaker in your organization. The one the company can't live without.
Because the alternative is not good. Remember, the square ones are the fish, and the round ones are the burgers.
Posted by Don Jones on 03/27/2013 at 1:14 PM
Now Hiring Smarter IT Pros
Microsoft has moved firmly into the platform world, with many of the native product administration tools being almost afterthoughts. Use Active Directory Users & Computers to manage a large directory? I don't think so.
Microsoft has realized that it can never build tools that will meet everyone's needs, and so the native tools are just the bare-bones basics that a small organization might be able to get by on. Instead, Microsoft is focusing more and more on building platforms -- great functionality. But how do you administer those platforms?
This is the new inflection point in the Microsoft IT world. Increasingly, Microsoft is giving us the building blocks for tools. Application Programming Interfaces (APIs) that let us touch product functionality directly and build our own tools. Microsoft even has a word for the new IT discipline: DevOps. In part, it means operations folks (admins) taking more responsibility for programming their own tools, so that they can implement their organization's specific processes.
Yes, programming. In many cases, the new operations API will be Windows PowerShell -- but not in all cases. You'll also be using tools like System Center Orchestrator, and may use ISV tools that let you build out your business processes.
In a way, this is completely unfair to the loyal Microsoft server fan. They got on board by clicking Next, Next, Finish, when the rest of the world was off running command-line tools and writing Perl scripts. Now, Microsoft is yanking the rug out from under them. "Psych! Turns out you have to be a programmer after all!"
But there's a reason for it -- and you can either embrace that reasoning, or close your eyes and wait for it to run you over.
Forget Private Cloud. Call it Util-IT-y.
So why is Microsoft so focused on making its loyal IT professionals become scripters and programmers? Funnily enough, it's the private cloud.
Go over to GoDaddy and buy a Web site for yourself. Or, go to Amazon and buy some AWS time. In both cases, you will not find some human being Remote Desktop-ing into a server to spin up a new VM, provision your service, and send you a confirmation e-mail. It's all done automatically when an authorized user (you, a paying customer) requests it. Hosting organizations like GoDaddy and AWS don't treat IT as overhead; they treat it as an enabler. Their IT folks focus mainly on building tools that automate the entire business process. Someone buys a Web site, and an automated process kicks off that makes it happen. Nobody sits and monitors it or even pays much attention to it -- it's automated.
That kind of functionality is where "private cloud" got its name. The idea is that your own datacenter (if we're still allowed to call it "datacenter") exhibits those cloud-like behaviors. Marketing needs a Web site? Fine -- it'll push a button, and a requisition gets approved, and lo, there is a Web site. IT doesn't get involved. We built the tool that made it happen once all the right approvals were in place, but we didn't click the individual buttons to set up the VM and the Web site, or whatever. We automated it, using tools like System Center, PowerShell or whatever.
But I hate the term "private cloud." I really do. I much prefer the term utility.
When your organization needs a new fax phone line, you go through some internal business process to obtain the necessary authorization. The phone company isn't involved in that. Once you have approval to pay the bill, you tell the phone company to spin up the line. More often than not, someone on their end pushes a button and lo, a fax line is born. They didn't walk out into the Central Office and manually connect wires together -- that's so 1980. It's all automated, and it "just works." It's a utility.
And that's what IT needs to become. We stay out of the business process. We stop being the gatekeepers for IT services. We stop manually implementing those services. Someone wants something, they get approval and push a button. We just make the buttons do something.
And this is why the private cloud means you're going to lose your job…
Is it evolve or die for the Microsoft IT admin? Don Jones will give you his assessment in his final installment of the Microsoft IT Winds of Change blog series.
Posted by Don Jones on 03/25/2013 at 1:14 PM
Getting Nostalgic for Microsoft IT Administration
Do you remember Windows 3.1? Not a bad OS for a home user, and a pretty decent OS for a lot of smaller business people. Well, technically not an OS, I suppose -- it was really an operating environment layered over MS-DOS. But it was easy to use, and a lot of people got pretty good at using it. Ah, Program Manager. I miss ya.
What about Windows NT 3.1 and Windows NT 3.51? Those were Microsoft's first credible attempt at a full-scale, business-class operating system. And with them, Microsoft did something pretty clever: unlike the main network operating systems of the day -- think NetWare, VINES, and Unix -- Windows NT was easy to operate and administer. Heck, it looked just like Windows 3.1! You didn't have to memorize obscure commands. You could just click your way to a happy network. Every tool you needed came right in the box: a user manager, a server manager, DNS tools, even a rudimentary Web server. All graphical, and all easy to use. Ah, Program Manager. I miss ya.
That ease-of-use eventually got Microsoft in the door of corporations large and small. As a departmental file server, it was easy to set up and deploy without having to go to the IT department and their big, bad Unix boxes or hardworking Novell servers. And Microsoft built on its success: Exchange Server 4.0 offered a point-and-click, easy-to-administer alternative to cc:Mail and even the old Microsoft Mail. SQL Server came with every tool you needed to run an RDBMS, right in the box. That was Microsoft's pattern: make it easy, and include everything you might need right in the box.
This was an inflection point in the IT world. Suddenly, you didn't need to be an "IT monk." Normal, ordinary people could be IT admins, and hordes of normal, ordinary people took the jump. The release of NT 4.0 with its Win95-like "Chicago" GUI, along with the heavily promoted MCSE and MCSA certifications of the day, saw all kinds of people changing careers into IT. After all... it was easy!
In the IT Universe, Nostalgia Is BS
If Microsoft got its "foot in the door" by making its products easy to set up, easy to administer, and easy to use -- and by including every tool you needed right in the box -- then that's also where Microsoft set itself up for eventual failure.
First of all, not every tool you needed was right in the box. Plenty of organizations ran up against limitations and inefficiencies, and either ponied up for supplemental tools or just decided to scrape by with the native administration tools.
Second of all, "easy" also means "one size fits all." That is, a product can only lack complexity if it isn't very flexible. Organizations quickly started realizing that, and Microsoft responded by building in more flexibility. The problem is, flexibility always comes with complexity. If you can do something only one way, there's no need to make a decision, consider criteria, or anything – you just do the one thing the one way. As soon as you start having options, you have to decide what to do. You have to weigh pros and cons, decide what option is right for you, and then actually implement that option.
And of course Microsoft still had to actually ship products. Think about that: the OS and the various server products (Exchange, SQL Server and so on) are now more complex, and offer more options, so they obviously take longer to code. That leaves less time for coding tools. And so while the products became more flexible and capable, the tools often didn't keep up. Microsoft started focusing less and less on providing great tools, and instead focused on providing a great platform, upon which other people could build the specific tools a given organization might need. Ever try to pass a SOX audit using the built-in auditing and reporting tools? Exactly.
And this paved the way for the new inflection point, which would be almost the opposite of the last...
Look for Don Jones' assessment of what will make up a future Microsoft IT administrator in the next blog post.
Posted by Don Jones on 03/22/2013 at 1:14 PM
A lot of organizations have a "run book" -- a binder full of step-by-step instructions for accomplishing nearly every major IT task they perform. In fact, run book automation products, like System Center Orchestrator, are designed to help automate exactly those tasks.
As a decision maker in your IT organization, if you don't have a run book, start one. Right now. Make your team document every single thing it does. In detail. First, you'll help preserve institutional memory; second, you'll set yourself up to automate those tasks some day.
Tip: Make it a physical book. Electronic documents are fine so long as the electronics are working, but if something goes offline you'll want your run book to walk you through bringing it back online.
Make it a picture book. The simple fact is that, when you need your run book it's because the person who's memorized a given procedure is unavailable, or because the proverbial fit has hit the shan. In either case, you want a clear, easy-to-follow set of directions. Like a picture book. As you create your run book, document each step of each task by using screen shots, not just words, so that the run book is as easy to follow as possible when need arises.
And then start focusing on automating those tasks using an appropriate tool (like Orchestrator, if that works for you). Performing tasks should be as easy as picking the task and clicking "do it."
Posted by Don Jones on 02/28/2013 at 1:14 PM
With the release of Windows 8 to MSDN and TechNet subscribers worldwide, we're starting to see more and more people setting up their first machines using the final OS code -- and starting to see more questions about some specifics. Unfortunately, Microsoft hasn't been providing much in the way of answers at this point. For example, my colleague, Jason Helmick, contacted me after testing some of the Windows Activation features in Windows 8. I'm providing his narrative below, enhanced with some of my own discoveries and comments in [square brackets]. I'd love to hear your comments and findings, too -- please drop them into the comments area below. With that said, take it away Jason...
Q: I'm confused about the Windows 8 Enterprise/Pro activation.
A: The three download versions of Windows 8 can be somewhat confusing at first, until you realize the purpose for each one. Lack of documentation in this initial stage of release has had more than one person download all three just to see which one they can license.
In my case (TechNet key in hand), I ended up downloading all three to see which one would take the key. The answer is the standard download (not Pro or Enterprise), but after working with the Enterprise and Pro versions I ran into the new activation process and had some questions. Without any documentation to explain it, I ran an initial test -- and I'm left with more questions than answers.
The Pro and Enterprise downloads are designed to receive their activation through a traditional KMS server or the new Active Directory Based activation (ADBA). The Enterprise version of Windows 8 still supports Multiple Activation Keys (MAK) if that's your preference.
[I'll note here that I was able to help Jason find the right download and confirm his observations. The TechNet "Windows 8 Professional VL" ISO image requires a volume license key plus Active Directory-based or KMS-based activation; the "Enterprise" ISO likewise requires on-network activation or a Multiple Activation Key (MAK). The "plain" Windows 8 ISO will accept a Professional key and activate as Windows 8 Professional.]
So, without a KMS server or MAK available, I decided to test Windows 8 Enterprise to see if there had been changes to the activation process, and to test the time it took the OS to expire when not activated. I'm not a hacker, and I'm not trying to pirate software; I'm just trying to understand from an administrative deployment perspective what is going to happen if activation fails. Documentation for this seems elusive at best (or doesn't exist). Here are the questions that I had when starting my experiment:
- How long does it take before the desktop activation message appears?
- How many rearms can I perform?
- How long does it take between rearms until the next activation message?
Perfectly legitimate questions if you're deploying Windows 8. After all, we need to know what happens when things go wrong. What symptoms indicate an un-activated copy of Windows? What will users be telling the help desk they're seeing? What can we expect? Crucial concerns, and I was concerned about the differences between Windows 8 and Windows 7. Hopefully, I thought, they'd be identical.
However, the answers I started seeing in my experiment weren't what I expected. Perhaps some documentation or feedback could help me understand what's happening. Here are the results from my initial experiment (I'll try more testing soon):
- Changed the clock forward one year (Windows and BIOS) to force the activation message. Windows did not attempt to activate, nor did it display a desktop message. So how does it determine when it's time to activate?
- Forced activation with the set-forward clock. Activation failed looking for a KMS server and, again, no activation message appeared.
- Checked slmgr /dlv for information about the activation period (time until activation). No time period was listed, but I was shocked to see 1,000 rearms available. 1,000 rearms? I can rearm this product 1,000 times? That seems like a lot.
- Tried a rearm (slmgr /rearm): 999 rearms left. Could I script this to decrement the counter to 0?
- Tried to script the rearm. A reboot is required after each rearm, so I'd need a better script.
- Decided to leave the box alone to see when it would display the desktop "activation required" message. The current time was 8:30 a.m. The message appeared almost exactly 24 hours later. OK, so the first activation message occurred in 24 hours. With 999 rearms left, Microsoft couldn't possibly intend for me to have 999 days without licensing the product. Could it? That's almost three years!
- Performed a rearm and waited for the next activation message. In the interim, I parsed the logs for activation events. While you can see events for the activation process, I was unable to find an event logged when the desktop "Activation Required" message occurred. Why isn't this logged? Such an event would be useful for monitoring computers for this problem.
- The activation message appeared approximately 8 hours after the rearm. This makes sense: the activation window is getting shorter, forcing a rearm sooner. That explains the high rearm count, which would get used up quickly if the window keeps shrinking.
- Rearmed the system. The activation message appeared approximately 4 hours after the rearm. Again, the window seems to be closing.
- Rearmed the system. The activation message appeared approximately 2 hours after the rearm. The window is definitely getting shorter.
At this point I had to stop the initial test. I may have made errors in this test and I want to examine it further. However, it would be nice if Microsoft would explain it so I didn't have to perform new tests.
Here's the question that is bothering me the most, and what I'm going to script and test for next week: After all the rearms, will Enterprise stop working?
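One hypothetical way to script that next test -- untested, and the script path and registry re-queue are illustrative -- is a loop that rearms, registers itself to run again after the mandatory reboot, and stops when the counter reaches zero:

```powershell
# Hypothetical rearm-to-zero test harness. slmgr requires a reboot after
# every /rearm, so the script re-queues itself via the RunOnce key.
$slmgr = "$env:SystemRoot\System32\slmgr.vbs"

# Parse the remaining rearm count out of the slmgr /dlv output
$dlv = cscript //nologo $slmgr /dlv | Out-String
$remaining = [int][regex]::Match($dlv, 'rearm count:\s*(\d+)').Groups[1].Value

if ($remaining -gt 0) {
    cscript //nologo $slmgr /rearm
    # Re-run this script after the reboot (path is illustrative)
    Set-ItemProperty -Path 'HKLM:\Software\Microsoft\Windows\CurrentVersion\RunOnce' `
        -Name 'RearmTest' -Value 'powershell.exe -File C:\Scripts\RearmTest.ps1'
    Restart-Computer -Force
} else {
    "Rearm counter exhausted -- observe what Windows does now."
}
```

At roughly 1,000 reboots this would take a while even fully automated, which is itself a hint that the counter was never meant to be exhausted by hand.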
In all previous tests, Windows 8 continued to work normally without removing functionality (at least as far as I could determine). I could join it to a domain, remove it from a domain, and so on.
So, what happens when you reach the end?
[This definitely seems to be a new twist on the old activation strategies used by Microsoft. Granted, the Enterprise edition is meant for… well, enterprise use, so it's nice to see generous terms and a shrinking activation window. It would still be nice to know how small that window can actually get (if it continues to divide by half, it'd be down to nanoseconds pretty rapidly), and what happens when you reach the end.]
Posted by Don Jones on 10/24/2012 at 1:14 PM
This blog post was written prior to the news that Microsoft's new interface would not be called "Metro." All references to "Metro" were left intact for clarity.
Microsoft may have really messed up with its Windows 8 strategy. Its first big mistake was releasing community previews of the new OS while barring most employees from talking about it, or even admitting to its obvious existence. Release code without being able to explain it? Bad move. What messaging we did get was preliminary and off-base... and the company may pay for that. Here's what it should have told us.
It Isn't a "Metro Desktop"
The most controversial feature of Windows 8 is the so-called "Metro Desktop," a term that Microsoft never should have let pass without comment. Metro was never intended as a noun; it's an adjective, as in "the desktop with the Metro styling." And it isn't a desktop at all, obviously -- it's a dashboard, an evolution of the gadget-laden Sidebar of Windows Vista, and very close in functionality to the Dashboard view in Apple's Mac OS X. It's actually a better implementation than Apple's, because its tiles are more organized, and it also serves as an application launcher -- something Mac OS X distributes across a file folder, the Launchpad view, and some other mechanisms.
In other words, the Metro-styled Start screen is a great idea. Its bad rep comes almost entirely from Microsoft refusing to speak about it for so long, and then not talking about it very well when they did ungag. Sure, there are some heavy restrictions on the apps that can run on this Start screen, but it's mainly made as a dashboard -- think lightweight gadgets, not full applications.
What's ironic is that the new Start screen is designed mainly to be touch-friendly -- something that's also been controversial, as the thing is practically unusable with just a mouse. Add in some keyboard shortcuts, though, and it's very slick. Start typing application names and up they pop. Hit "Windows+I" for the sidebar menu thing that took me forever to find with my mouse. Eventually, when we start using Apple-esque multi-touch trackpads with our computers (not just laptops), I suspect the new Start screen will be even nicer.
Microsoft just didn't get out in front of this one fast enough. It reminds me of the first ads I saw for Vista, which focused exclusively on the stupid "flip 3D" application-switching feature. Microsoft buried the headline there, focusing on something trivial and not communicating very well on what made Vista different. Microsoft's made that same error with the new Start screen, and the world in general has crucified them for it.
Windows 8 comes across as a consumer release -- so much so that a lot of businesses are disregarding it out of hand. Again, that's because Microsoft is burying the headline. In proclaiming Windows 8 as touch-friendly, etc. etc. etc., it's forgetting that 99.9999 percent of its customer base doesn't use touch-based devices.
But listen to the messaging coming out of Redmond and you could easily imagine that Windows 8 just isn't something businesses need to care about.
Windows 8's deeper integration with the improved DirectAccess is nothing short of brilliance. The ability to centrally manage VPN-like connection options for your users, who can simply double-click and connect, is awesome. DirectAccess finally works, is straightforward, and is something everyone should look into -- and Windows 8 really utilizes it best.
SMB 3.0, the new file-sharing protocol in Windows Server 2012, gets the best love from Windows 8. Automatic -- nay, automagic -- failover of load-balanced SMB shares means a whole new way to think about clustered file servers.
Oh, and three words: Windows To Go. Part and parcel of the enterprise edition of the product, it enables you to build a secured, encrypted (via BitLocker) corporate image of Windows 8 on a USB thumb drive. Users can pop that into any computer and get the corporate desktop right there -- and when they pop out the drive, the desktop goes away, leaving no traces on the machine. This is so cool I can barely even wrap my head around it -- yet it's a feature getting relatively little spotlight.
It's Just as Good as Windows 7
The deployment technologies, support processes, and almost all the rest of Windows 8 are astonishingly similar to Windows 7. I think, in its race to look as cool as Apple, Microsoft is making Windows 8 seem a lot more revolutionary than it is -- and I mean that in a very good way. Enterprises don't like revolution; we like evolution.
Small, incremental changes we can cope with. Just for once, I'd like Windows 7.1, 7.2, and 7.3 -- and Windows 8 actually feels a lot like that. It certainly exhibits about the same level of change as you get going from Apple's OS 10.7 to 10.8 -- some major cosmetic overhaul, some new concepts, but the same basic stuff at the core.
I think you'll have no issues running Windows 8 alongside Windows 7, or even skipping 7 and going straight to 8 if that's where you are in your lifecycle. You need to get the heck off of XP, that's for sure.
Yeah, the new Metro-styled Start screen is going to throw some people for a bit of a loop, but so did the Start menu when Windows 95 was introduced. They'll adapt -- they'll just whine a bit because they haven't had to adapt to any major changes since 2002, when you deployed XP the first time. And yeah, the new Metro-styled experience isn't comprehensive -- you'll find yourself dumped out of the "Metro Desktop" and into "classic" applications more often than you want, especially when you start fiddling with Control Panel stuff. That's fine. It's not as elegant or complete an experience as we might like, but it's perfectly functional. We -- and our users -- are going to grumble about something anyway, so it might as well be that, right?
Posted by Don Jones on 10/23/2012 at 1:14 PM
Microsoft has a problem. A marketing problem. That problem's name is Windows Server 2012.
You see, as an operating system, Windows is pretty dang robust already. There's not a lot that we need it to do that it doesn't do. So Windows Server 2012 doesn't come with a flash-bang set of features. There are no massive changes to AD. Printing is still printing. Clustering works fine. Sure, it's probably "the most secure version of Windows ever," but I don't think anyone's dumb enough to try and sell that line anymore.
This means a lot of organizations -- a lot of decision makers -- are going to look at Windows Server 2012, say "meh" and ignore it.
Bad move. Windows Server 2012's improvements aren't skin-deep -- they're geek-deep. They're critical, yet evolutionary changes that make this a more robust, more stable and infinitely more usable operating system.
SMB 3.0
Yeah, maybe it's really Server Message Block 2.2, but it should be 3.0, and I'm glad MS is positioning it that way. Massively restructured, this is a SAN-quality protocol now, capable of moving close to 6 gigaBYTES per second. Yes, gigabytes, not the usual gigabit measurement of bandwidth. It's got built-in failover, too, meaning clustered file servers are now a no-brainer. It lets file servers scale out, too -- something that's never before been possible. There's a geek-speak explanation of all the new hotness in this Microsoft blog, and you gotta believe this is going to be a game-changer for the OS.
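To give you a feel for how little ceremony that failover involves, here's a minimal sketch using the SmbShare module that ships with Windows Server 2012. The share name, path, and group are hypothetical; this assumes you're on a node of an already-built failover cluster:

```powershell
# On a Windows Server 2012 failover cluster node (share/path/group are hypothetical).
# -ContinuouslyAvailable enables SMB 3.0 transparent failover for clients of this share.
New-SmbShare -Name "Projects" -Path "E:\Shares\Projects" `
    -ContinuouslyAvailable $true `
    -FullAccess "CONTOSO\FileAdmins"

# Confirm the setting took:
Get-SmbShare -Name "Projects" | Select-Object Name, ContinuouslyAvailable
```

Clients connected to that share ride out a node failover without so much as a dropped file handle -- that's the "automagic" part.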
Dynamic Access Control
While this will be limited, initially, to the access controls on shared folders (rather than on files, AD or something else), this is showing us what the foundation for ACLs looks like in the future. Imagine complex ACE definitions like "must be a member of Admins, but NOT of HR, or can be a member of Execs" -- and that statement is evaluated on the fly. This truly enables claims-based access control, because it doesn't have to be built on user groups any more. "User must be in the department 'Sales' in AD, and must not be in the Denver office." Keep your AD attributes up to date and suddenly access control got easier -- and much more centralized. This will still layer atop existing NTFS access controls, as share permissions always have, but it's a big deal. Start wrapping your head around this now, because it's a model you'll see creeping further in future releases.
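To make that "Sales but not Denver" example concrete, here's a hedged sketch using the Windows Server 2012 ActiveDirectory module. The rule and policy names are made up, and the claim types referenced in the conditional expression (Department, OfficeLocation) are assumptions -- they have to match claim types you've actually published in your forest:

```powershell
# Sketch: a central access rule whose ACE is a conditional expression,
# evaluated against the user's AD-sourced claims at access time.
# Names and claim types here are hypothetical.
New-ADCentralAccessRule -Name "Sales Documents Rule" `
    -ResourceCondition '(@RESOURCE.Department_MS == "Sales")' `
    -CurrentAcl 'O:SYG:SYD:AR(A;;FA;;;OW)(XA;;FA;;;AU;(@USER.Department == "Sales" && @USER.OfficeLocation != "Denver"))'

# Bundle the rule into a policy you can push to file servers via Group Policy:
New-ADCentralAccessPolicy -Name "Sales Access Policy"
Add-ADCentralAccessPolicyMember -Identity "Sales Access Policy" -Members "Sales Documents Rule"
```

Note what's absent: no security groups. The ACE is a claims expression, which is exactly the shift the paragraph above describes.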
PowerShell
This is the version of Windows we were told six years ago was coming. Almost completely manageable via PowerShell (if not completely completely; it hasn't shipped as I'm writing this, so it's tough to say), this is the version of Windows that starts to deliver on the PowerShell promise: Automate Everything. Combined with PowerShell v3 foundation features like more robust Remoting and Workflow creation, Windows Server 2012 is taking a page from the Unix book and rewriting it for the Windows world. That's a good thing, because it truly enables enterprise-class sensibility in our server OS.
GUI-Less Operation
Explain it to me as many times as you want, and I'll never understand why folks RDP into a server to perform basic day-to-day management rather than just installing the GUI consoles on their admin workstation. But Win 2012 raises the stakes, providing a "GUI-free" server console that doesn't have the limitations and caveats of the old Server Core installation mode. Take heed: Whether this excites you or not, it's Microsoft's direction. Start thinking about managing your servers from your clients, because that's going to be the only option in the not-too-distant future. Oh, and as for installing all of those admin consoles on your client? Maybe not: PowerShell Remoting means the admin tools can "live" on the server, but "present" on your client.
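That "live on the server, present on the client" trick is implicit remoting, and it's a one-liner away. A minimal sketch, assuming a server named SRV01 with PowerShell Remoting enabled:

```powershell
# From an admin workstation: borrow the server's cmdlets without installing them locally.
$session = New-PSSession -ComputerName SRV01

# Import the server-side SmbShare module into this session. The -Prefix keeps
# the proxied commands visually distinct from any local ones.
Import-PSSession -Session $session -Module SmbShare -Prefix Srv

# This runs on SRV01 but displays on your workstation:
Get-SrvSmbShare

# Clean up when you're done:
Remove-PSSession $session
```

Every proxied command is just a thin wrapper around Invoke-Command, so nothing had to be installed on your client at all.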
Get You Some Win 2012
"The right tool for the right job" is the mantra all of IT should live by, and Win 2012 is shaping up to be a better tool for many jobs. It's worth looking at. Even if you think your organization won't have any 2012 goodness well into 2014, at least familiarizing yourself with the tool's capabilities will put you in the driver's seat. You'll be prepared to help make recommendations about this new OS, speak knowledgably about its capabilities (and about what it can't yet do) and be the one to lead its adoption and deployment. Better to be driving the bus than to be run down by it, eh?
Posted by Don Jones on 06/22/2012 at 1:14 PM
I'm sure you've seen endless analysis and opinion about Microsoft's re-re-revamped certification program, so I'll avoid adding any more to the pile. However, I do want to ask some questions -- because ultimately the value of these certifications comes from decision makers in organizations. If the boss cares, then the employees care, HR cares, and so forth.
First, one minor bit of opinion: "MCSE for Private Cloud" does, I have to admit, make me puke in my mouth. Just a tiny bit. I'm so sick of the "C" word, and this certification -- simply some Windows Server 2008 exams added to a couple of System Center 2012 exams -- seems to be no "cloudier" than a nice day in Phoenix. But whatever. The marketing people probably couldn't help themselves.
Microsoft's new certification program stacks into three tiers: The Associate level, the Expert level, and the Master level. These each break into two categories: "Certifications" and "Cloud-Built Certifications" (deep breath, hold, out the nose).
So... do you care?
In the beginning, these certification programs -- and I'm talking Windows NT 3-era here -- were largely a play by Microsoft to say, "Look, there are tons of people who can support our products, so why doesn't your business just send us a check for some software, hmmm?" Microsoft's certifications, like most IT certifications, have never been an attempt to protect businesses, to protect the public, and so on -- not in the way other professional certifications, like those in the medical or legal industries, are intended to do (whether they do it or not is, I'm sure, debatable).
So does the large body of Microsoft-certified human beings make you sleep more easily at night?
Do you find that a Microsoft certification acts as anything more than a bare-minimum filter for HR to home in on when sorting through incoming resumes?
Knowing all about the "paper MCSE" syndrome, the scores of brain-dump Web sites, the certification cheats and all of that, would you still rather hire a certified individual over a non-certified one?
Would you discard, out of hand, the resume of someone claiming eight years of IT experience but no certification, in favor of someone with less experience who does have a Microsoft title?
If you were to offer some advice to an IT person who doesn't have a certification but who's worked in a lower-tier IT position for a year or so, would you advise them to take the exams needed to earn the new MCSE, MCSA or whatever? Or not? Why?
In short, how does Microsoft's certification program affect your business? I'm genuinely curious, and I'd love your comments. Drop 'em in the box below.
Posted by Don Jones on 06/05/2012 at 1:14 PM