Creating a Culture of Security

Too many organizations treat IT security exactly the same way they treat something like sexual harassment training: it's something irritating you have to pay attention to, but it isn't really at the core of your business. It's a box you have to check. You have to say you care a lot about it, but in reality, you try to ignore it unless it rears up and slaps you in the face.

But IT security is very much -- or should be -- a core part of your business.

Even the tiniest businesses have information that attackers would love to possess -- the personal information of a patient (invaluable for identity theft), credit card numbers (valuable in and of themselves), customer contact information (great for building marketing lists to sell to competitors), you name it. For the most part, attackers aren't after your company's "core information." They're after the incidental information that you collect as part of doing business. There's a massive black market for that information -- there's real money here.

Let's state that even more clearly: IT attacks are a business. There's profit to be had.

And the penalties, should you be the victim of an attack, can literally shut you down. Look at the fines and damages Target is paying, which are estimated at more than a billion dollars. If that happened to a small restaurant -- a far easier target, with just as juicy an information store -- it'd be out of business overnight.

Security has to become a core part of your business. You have to protect yourself. This isn't like a sexual harassment lawsuit, which you might be able to financially survive. A single hit in an IT attack can ruin even an enormous corporation. And I'm not just talking about brand damage and loss of business or customer confidence; I'm talking about real, financial damages, payable immediately.

One attack. That's all it takes.

Now you have to look at your corporate culture and ask if you've cultivated a culture of security. Do your employees routinely consider the security implications of their at-work actions? Do you have regular training sessions to help remind them to be vigilant?

Do you have a central security effort? I'm often amazed at the companies that don't. I recently delivered an on-site seminar for a fairly large client. Before I could show up, their risk mitigation folks demanded proof of my auto insurance, my general liability insurance certificate, proof of workers' compensation insurance and more. Yet that same company allows internal divisions to stand up new public-facing Web sites at almost any time. The company's own IT people have to scan their own public IP addresses to find out about new stuff. That's completely unbalanced -- you're worried about a single contractor not having insurance, but you're not worried about Internet-connected data spews?

Companies' cultures need to change. You need to become concerned about every new piece of software, every new connection and every new piece of data -- whether or not you're required by law to care. If some piece of software doesn't contribute to your core business, don't allow it. If some new Internet-connected service isn't part of your mission, don't connect it. But if you do allow it, and you do connect it, then you have to secure it. Your culture needs to abhor uncontrolled new data stores, connections and pieces of software. You need to consider every access to your resources, including that HVAC vendor whose password will be stolen. Yes, your corporate culture needs to get a little paranoid.

You need defense in depth. Two-factor authentication should become the minimum bar in every company, of any size, ever. Waiters at a restaurant should have to swipe their card and key a PIN, not one or the other, in order to access the point-of-sale system.

This is the world we live in now. Hacking companies of all sizes is big business, and it's just a matter of time until someone tests your defenses. You have to get ahead of this bus, because otherwise it will run you over, eventually.

It amazes me that companies aren't taking the examples of Target, Neiman Marcus, and Equifax more seriously. You are next. Count on it. And start changing your corporate culture to acknowledge the reality of the world today.

Posted by Don Jones on 04/23/2014 at 1:20 PM


A Windows XP Post-Mortem: Fallout, Lessons Learned and Next Steps

We know that tens of thousands of copies of Windows XP are still in use, and many will likely remain so. We can't arbitrarily ditch critical services just because they happen to be on XP. But we can sit for a moment and think about how we got here. How are we running "mission-critical" services on a 12-year-old operating system? Is there any reason to worry about it? How can we change our IT management practices so that we don't end up in this situation again? In this six-part article, we'll conduct an XP post-mortem, and see what lessons we should be taking away from the OS that won't let go.

Well, Here We Are
Microsoft has officially ended support for Windows XP, meaning there will be no new updates. No Security Essentials updates. No patches. No nothing.

Well, not "ended," really. Some entities -- notably, so far, the Dutch and U.K. governments -- are opting to pay Microsoft millions for "custom support," so that Microsoft will still produce patches and whatnot for them. That support is limited in what it will do, and is intended to give organizations more time to transition. You know, more than the half-decade-plus they've already had.

But for the rest of the world, XP is over. Although plenty of folks will still be using it. According to Netcraft, there are thousands of Web sites still running on "XP servers." Yeah, think about that -- XP servers. But it's OK -- XP isn't alone. The Australian Postal Corporation, for example, appears to be running their online postage sales site on a Windows NT 4.0 server. Yikes.

A lot of folks, myself included, aren't very patient with the amount of XP still running around the universe. In my personal blog, I posted a rant that rails against the managerial incompetency that resulted in governments paying millions of taxpayer dollars to support an obsolete operating system -- a situation I feel was totally avoidable. Don't worry; this blog won't be a rant. It'll be a much more professional, managerial look at how we got here, and what we could have done differently.

Most "we have to keep running XP" situations probably come not from people who were caught unprepared by the whole end-of-life situation (something Microsoft has repeatedly extended in past years), but rather from people who have systems who have to run XP. Embedded systems. Specialized software. That kind of thing. Understand that those situations don't make all of this acceptable, but we're stuck with what we're stuck with. The point of this series is to try and make the current situation a teachable moment, so that we can learn from what we've done, and to see if we can do better the next time around.

Let's be clear: we're not really trying to assign blame here. But running an obsolete OS for mission-critical applications is an incredibly poor practice, and we do need to acknowledge that. We also need to look back and figure out why things turned out like this, and try to learn something from it. There's a very real knee-jerk reaction to say, "well, it's the best we could do," but that isn't true. We can do better.

But first, we need to talk about the very real dangers of running XP today, and consider some practical steps to make sure we're still safe.

The Very Real Business Dangers Ahead
The problem we now have to deal with is ensuring our mission-critical services running on XP stay running. We're in a bit of a situation that IT teams haven't been in before. In the past, hackers and other attackers were primarily coming after us for the fun of it, and for the street cred in saying they'd successfully done so. There was some risk in running an outdated operating system, but it was perhaps minimal. As the installed footprint for an OS dwindled, it became a less-attractive target to hackers because they couldn't make a big splash anymore.

Hacking today is a for-profit enterprise. Hackers have a purpose: they're trying to, in most cases, steal something that will earn them a profit. They'll install Bitcoin miners and steal processing power. They'll steal customer information and sell it. Imagine being able to steal information from a hospital's XP-based blood analysis computer, or being able to fiddle with the XP-based dispatching system used by a trucking company. Those hackers could wreak havoc. But they could also make a lot of money.

Hackers today are a lot less likely to try and make a big splash. They're smarter than that, now -- they're not script kiddies. I wouldn't be surprised, for example, if someone's discovered some exploit in XP and has held on to it for years, waiting until Microsoft would no longer fix the problem. Now they can sail into tens of thousands of computers and start attacking essentially unimpeded.

That should terrify you.

If hackers are able to install some malware, it probably won't be very visible. It'll sit in the background doing its thing. If it lies low, and only attacks XP machines, there are good odds the antimalware vendors of the world won't notice it, meaning it'll be able to run for a good long time.

The lesson of Target is that it only takes one time to cost you billions. Even if you've never been hit before, when you're running an easy target like XP -- and doing so to run mission-critical services, not just someone's word processor -- you're an attractive target.

So what can you do to help protect yourself?

  • Keep your antimalware software up-to-date. Most vendors are continuing to support XP for a while longer.
  • Ensure XP is as cut off from outside communications -- and that includes your own LAN -- as possible. Restrict both incoming and outgoing communications using Windows Firewall or by using third-party local firewall software (see the sketch after this list).
  • Think defense in depth. Assume each defense you have will be broken, and have something behind it ready to pick up the slack.
  • Store as much data as possible off of the XP machine. Make it as much of an appliance as possible. Make an image of it (ensuring there's no malware running at the time), and restore to that image regularly. That's a great way to help limit malware's reach.
  • Upgrade Internet Explorer 6 to the latest version possible. IE 6 is a known cesspool of weaknesses and exploits.
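
On the firewall point in that list, XP's built-in firewall can be locked down from the command line. Here's a minimal sketch using XP SP2-era netsh syntax; the port opening is a hypothetical example you'd adjust to whatever single service the machine genuinely must expose:

    # Block all unsolicited inbound traffic, allowing no exceptions:
    netsh firewall set opmode mode=ENABLE exceptions=DISABLE

    # Or, if one inbound port must stay reachable, allow exceptions but
    # open only that single port (TCP 1433 here is purely illustrative):
    netsh firewall set opmode mode=ENABLE exceptions=ENABLE
    netsh firewall add portopening protocol=TCP port=1433 name=AppDB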

Yes, XP is going to require a lot more work from you to keep it safe -- just like owning a 12-year-old car requires you to spend more on maintenance. Don't fail to do that work. I realize that few organizations have security and risk mitigation as a real part of their culture, but it only takes once for an attacker to put you out of business.

So, with an idea of what dangers XP presents, and what you can do to try and mitigate those, let's start thinking about what we could do differently, in IT management, to not be in this situation again.

Next Time: Planning for Obsolescence
Windows XP was likely one of the most popular and best-selling business operating systems of all time. It's where Microsoft really took off, and tens of thousands of companies built solutions for XP. From specialized hardware to embedded systems, XP was everywhere.

When we started making all those XP acquisitions, we never really thought about the day when XP would go away. Security wasn't a big deal to most organizations, and up to that point we'd upgraded our computers every three years or so. Nobody at the time imagined a 12-year-old operating system. Heck, in 2001 when XP came out, most of us hadn't even had a local area network for 12 years yet! So we made a lot of IT acquisitions -- software, hardware, and more -- without really considering that the OS they all relied on would eventually become obsolete.

But it did.

And now we should know that everything in IT will become obsolete. And we should plan for that. Every single IT acquisition you make -- and I'll discuss a couple of specific situations coming up -- should include the question: "What will we do when this is obsolete? What's the upgrade path?"

Vendors might not be ready to answer that question -- so we're all going to have to work together to do so. I can guarantee that if the answer determines whether or not you buy a vendor's products, someone will find an answer for you. And those answers don't actually have to be difficult or complex, as we'll explore. But the point is to ask the question.

And I'm not, as some folks might suggest, being a "single-issue voter" here. I'm not saying that having an upgrade path defined ahead of time is the only possible thing you should consider in an acquisition. But I'm betting it's not been on the list of discussion topics before, and I'm saying it must be something you discuss going forward.

So, aside from getting a vendor's solemn promise to upgrade their software to support all new versions of Windows going forward -- a promise you're unlikely to get, anyway -- what can you do?

Next Time: Specialized Software
Let's start with the question of specialized software. In fact, let's first distinguish "general-purpose software" like Microsoft Office, which we all know will have an upgrade path, from the more-expensive "specialized software" that you use to actually run your business.

Start by looking at vendors' track record for upgrades. Have they commonly offered upgrades to accommodate new versions of Windows? What did those upgrades look like? Were they painful, or relatively smooth? Does the application's basic architecture lend itself to upgrades? For example, does it keep data in an external store, making it less reliant on the local OS? Does it use a lot of version-specific tricks that might not carry forward into later versions?

Most importantly, what can the vendor do to ensure that you have options? Most governments and larger corporations require software vendors to put their source code in escrow. Escrow services keep a copy of the source code, and only release it under specific circumstances that are defined in the escrow agreement. Many customers use checksums to verify that the escrowed code matches what they're running in production. With escrowed code, if a vendor goes out of business, you have the option of hiring someone else to maintain the application -- including modifying it to run on a newer OS, if needed. Your escrow agreement might allow you to access the source code if the original vendor no longer maintains that version of the software, so that you can do so yourself. It may sound weird, but escrowing like this has been a common practice for more than a decade -- and it's something organizations of any size should do when it comes to specialized software.
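
To make the checksum idea concrete, here's a minimal sketch using the Get-FileHash cmdlet that ships with PowerShell 4; the file names are hypothetical stand-ins for your escrow deposit and the code the vendor shipped you:

    # Matching hashes mean the escrow deposit is the code you received
    $escrowed = Get-FileHash -Path .\escrow-deposit.zip -Algorithm SHA256
    $shipped  = Get-FileHash -Path .\vendor-release.zip -Algorithm SHA256
    $escrowed.Hash -eq $shipped.Hash   # $true if they match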

If your software acquisition includes a maintenance agreement, then ensure it requires the vendor to support new versions of Windows within some reasonable window of time. After all, that's the whole point of maintenance.

The point is to work with vendors to ensure that your expensive new software purchase won't mandate that you be stuck on a specific version of Windows forever and ever. Give yourself options by planning ahead for obsolescence.

Next Time: Specialized and Embedded Devices
This is a tougher set of considerations. In some respects, specialized hardware solutions running an embedded OS are just appliances. Do you worry about the OS your toaster is running? Well... yes, you do, if it stores important data or plays a critical role in your business.

Again, this is where a discussion with the vendor is important. How long will you depreciate this new acquisition? 10 years? Then get the vendor to commit to providing updates during that period -- and to escrow their source code in case they go out of business.

Further, understand exactly where the device's potential security concerns exist. Does it communicate with a network? Does it store customer data locally? If we do need to run its operating system for years after obsolescence, how would we lock it down to protect it so that we can continue using it? Knowing those answers up front lets you have a plan in place -- even if that plan includes just leaving the original OS intact for years. In other words, in some cases running an obsolete OS can be fine -- if you're prepared for it.

But be prepared. Today, there are thousands of point-of-sale systems in retail and restaurants that still run Windows XP operating systems, including XP Embedded variants. You know these things are connected to a lightly secured network in most cases (hello, Target), and yet they're dealing with important customer data every minute of the day. "Well, we're a tiny business -- nobody would attack us." Not true. An attacker can make hundreds of thousands of dollars by capturing a few dozen credit cards, and they're more likely to attack a small, poorly defended network to get those card numbers. And the costs to you, as a tiny business, will be ruinous.

On the other hand, you may have computers that aren't network-connected. They just sit in a corner making copies of keys or something. Those can likely be left alone. "Disconnected" is an incredibly safe state, requiring physical access to compromise.

So next time, make sure you know the plan for upgrading those specialized devices. If you're talking to a vendor who doesn't have a plan, consider talking to their competition instead. See if you can get everything your company needs and an upgrade plan -- even if that "plan" doesn't include actually upgrading the OS on the device.

A Checklist for Sustainable IT
The point of this blog has been to use Windows XP's end-of-life event as an example of how sustainable IT matters. We can't continue to just acquire solutions with no thought to their future. Computers aren't cars; they're not easy to swap out, and they're not always easy to just upgrade. The lesson XP teaches us is that we have to plan for obsolescence. It will happen, and having a plan up front is just good management.
  • Ask vendors to participate in source-code escrow for all software that you buy. Dozens of firms offer this service (search for "source-code escrow" in your favorite search engine), and it provides you with a last-ditch way of dealing with software if the vendor can't, or won't.
  • Look at vendors' track records for upgrades. Were they usually forthcoming with new versions, or do they tend to stick you with what you bought and leave you there?
  • Before acquiring new solutions, understand how they interact with the outside world. How can you secure them should you need to run them past the point of obsolescence?
  • Consider the lifespan of the hardware or software you're buying, and try to take steps to ensure its continued support through at least that lifespan. And then plan to replace it once that lifespan is up. A 10-year solution that you're running into its 14th year is probably not meeting your business needs any longer, anyway.
  • Deciding that you'll run an old OS is fine; letting it happen to you is not.

A lot of us are going to be stuck running XP because of non-decisions made years ago -- in many cases, by different technology leaders. But we can learn from the past, and we can try to actively manage so that it doesn't happen again.

Posted by Don Jones on 04/17/2014 at 11:09 AM


Windows PowerShell: Professional Training Is a Must-Have

Let's say you own a car and you decide to take it to a new mechanic for its annual tuneup. Your new mechanic doesn't actually have any formal training in automotive repair or maintenance, but before he got the job he spent a lot of time playing Grand Theft Auto on his Xbox. He inherited some tools from the guy before him, so he opens the hood of your car and gets to work. You're later appalled to find that he's rewired your car to a small electric motor and replaced the steering wheel with a two-handed controller featuring X, Y, A, and B buttons.

Stupid, right? You'd probably never do that. You probably take your car to the dealer, which certifies its mechanics with the manufacturer, or at least go to a mechanic that's ASE-certified, meaning he's been to a specified number of classes and passed some tests.

Kinda crazy that you run your network differently than your car, isn't it?

Yet that's exactly what you do. Microsoft has finally started meeting customer demands -- mostly from large enterprise customers, to be sure -- to provide better administrative and operational automation. They're rolling out PowerShell support for nearly everything in the datacenter, giving administrators unprecedented access to, and control of, the technologies they manage.

And how many of your administrators know how to use that power? How many have you sent to training on it? I realize that Microsoft -- for reasons which I suppose I grasp, yet do not agree with -- hasn't released an official PowerShell certification, but it's certainly released classes. There are dozens of books on the market, including the ones I've written. There are self-paced training videos, too, like the ones I've done for CBT Nuggets.

But administrators clearly aren't getting that education. I know they're not, because every day I help answer forum questions from guys and gals who are trying their damnedest to do a good job, and who just haven't been given any kind of grounding in basic PowerShell skills. They're making newbie mistakes, and they're painfully aware that they're doing so -- they even apologize for being noobs when they ask the questions. They're learning from people's blog posts (if it's on the Internet, it must be accurate, right?), from Q&A forums and from magazine articles. They're desperately trying to educate themselves for you -- but this isn't like changing a tire on a car. PowerShell is a complex technology with many unobvious bits that you simply won't pick up on your own.

If your admins are like most of the ones I know, they're go-getters. They know PowerShell is the future, and it's becoming increasingly mandatory. Try doing much of anything in Azure, or Exchange Server, or even Office 365 without using PowerShell and you'll see what I mean. So they put their nose to the grindstone, flatten it out nicely and learn what they can.

And they're learning on your production network. In many cases they're adopting awful habits, and creating an infrastructure full of hard-to-maintain patterns, without even realizing it. They simply don't know any better. They're doing the best with what they've got -- which is virtually nothing.
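
To pick one hypothetical example of the kind of habit I mean: an untrained admin treats PowerShell's structured objects as text, the way an older shell would force you to, instead of filtering on object properties:

    # The hard-to-maintain habit: flatten objects to text, then string-match
    Get-Service | Out-String -Stream | Select-String 'Running'

    # What a trained admin writes: filter on the objects themselves
    Get-Service | Where-Object { $_.Status -eq 'Running' }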

Yeah, when it comes to a lot of Microsoft server products, you can often get by for a long time with pretty minimal training. After all, it's just point-and-click, right? But PowerShell is a form of software development. There are no wizards, and it's incredibly freeform. That means the shell offers a ton of power, a metric ton of flexibility and a metric butt-ton of room to go wrong.

And believe you me, you've got admins going wrong. I'm seeing it every day. There's a big community of folks trying to set everyone straight, but unless they find us (I'm at PowerShell.org, for what it's worth), they're just scrambling around on their own. That's not good for you.

The moral of the story, of course, is to get your people trained. Find your best and brightest, the ones who can become your Automation Makers, and send them to a quality class taught by an awesome instructor. Buy them a good book, for pity's sake, or get a subscription to a video training library. You don't need to train the entire team; the enterprise pattern is shaping up as having a relatively small number of bright, adaptable IT folks taking on the role of PowerShell Wrangler. They use their shell sk1llz to create easier-to-use units of automation for the rest of the team; they become your Toolmakers. They're the ones who attend specialized events like the PowerShell Summit to hone and deepen their skills. They're the ones you rely on to make everyone else more efficient, and to manage the PowerShell beast in your environment. Give them enough training, and you can measure the man-hours they save you. Yeah, it's return on investment time, in a big way.
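
What does one of those "units of automation" look like? Here's a toy sketch -- the function name, parameters and group-naming convention are all invented for illustration -- of the kind of tool a Toolmaker might hand to the rest of the team:

    # One command wrapping an onboarding ritual the team used to click
    # through by hand (assumes the ActiveDirectory module is available)
    function New-UserOnboarding {
        [CmdletBinding()]
        param(
            [Parameter(Mandatory)][string]$Name,
            [Parameter(Mandatory)][string]$Department
        )
        Import-Module ActiveDirectory
        New-ADUser -Name $Name -Department $Department -Enabled $true
        Add-ADGroupMember -Identity "$Department-Users" -Members $Name
    }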

But for the love of your entire IT infrastructure, don't make them do it blind. Please, get them some education, or you're going to find yourself in a sorry state in an amazingly short period of time. PowerShell isn't hard, but it is complex, and it offers a lot of opportunity to go down the wrong path.

Get your team on the right path. Get them trained. Please.

Posted by Don Jones on 10/29/2013 at 11:28 AM


Forget Certifications: Real Resumes Boast Man-Hours

Ten years ago, sets of magic three- and four-letter acronyms were the golden ticket to a promotion, a better IT job or even your first IT job. MCSE. CCNA. CNE. MCSA. OMG! Today, many organizations' HR departments still rely on these TLAs and FLAs as a sort of filter, meaning resumes lacking these special character sequences don't even end up in front of hiring managers or department heads.

It's a pity.

We all know that the certifications associated with most (if not all) of these acronyms were, at best, a pretty minimal indicator of technical expertise. "Paper MCSE" became common enough to earn a place in many urban dictionaries, spawn hundreds of online and magazine articles and generally put a cloud of derision over that particular title.

Today, Microsoft has made a lot of attempts to restore some glamour to its top-end IT pro title (which was, until recently, not the top end, of course; the retirement of the company's "Master" certifications brought the MCSE back to the limelight). Whether they've been successful really doesn't matter, or shouldn't.

Remember that the whole point of those titles was for Microsoft to demonstrate to companies that the world housed plenty of people qualified to support Microsoft's products. Ergo, it's safe to buy Microsoft products, because you'll be able to easily find people to run them. That, of course, means Microsoft really never had a huge stake in making the MCSE rigorous -- it just needed lots of us to jump through hoops. Today, that whole business driver seems a lot less relevant. Plenty of companies already bought Microsoft software, after all, and any company that doesn't use any Microsoft software sure as heck isn't going to be swayed by the existence of a lot of MCSEs, paper or otherwise.

I'll argue, then, that HR should drop or de-emphasize acronyms on their list of low-level resume filters. Hiring managers should give those acronyms less weight. And IT pros should perhaps worry less about including them in the first place. Sure, list 'em if you've got 'em, but I've got something better.

Man-hours.

"Hi, my name is Don, and in my last job I eliminated an average of 4,000 man-hours of manual IT labor annually." Follow that with your education history and hobbies or whatever, and you've got a compelling resume that the most important bullet point would fit on a business card: reduced manual man-hours.

Regardless of your gender, each carbon-based lifeform in an IT department represents up to 2,400 man-hours annually; less in countries and organizations with more-generous paid-vacation policies. If you're good enough at automation to reduce manual man-hours by any significant chunk of that, then you're a major asset to any IT team, regardless of the three- and four-letter designations you may or may not have.

"I can, through my powers of automation, free up or replace two human beings per year" is another way of saying, "I saved 4,000 man-hours annually." That's a huge deal for IT organizations strapped for resources and unable to expand their ranks. That's people to go work on new projects. It's also an almost ironclad guarantee that if layoffs come around, your name won't be on the list, you person-reducing person, you.

So how do you effect this change? How do you document it for your resume?

Dig into your help desk ticketing system, and do some analysis on time-to-close. Find tasks that get repeated fairly often, and figure out how many man-hours are involved in closing those tasks per year -- pretty basic information from a decent ticketing system. Those tasks, and their man-hours, are your target.
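
If your ticketing system exports to CSV, even that analysis is automatable. A minimal sketch, assuming hypothetical Category and HoursToClose columns in the export:

    # Total the hours burned per recurring task; the biggest numbers
    # at the top are your automation targets
    Import-Csv .\tickets.csv |
        Group-Object Category |
        ForEach-Object {
            [pscustomobject]@{
                Task       = $_.Name
                Tickets    = $_.Count
                TotalHours = ($_.Group | Measure-Object HoursToClose -Sum).Sum
            }
        } |
        Sort-Object TotalHours -Descending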

Your weapons are PowerShell. System Center Orchestrator. VBScript. Batch files. C#. Whatever. It truly doesn't matter what tools you use to automate -- although certain ones will obviously be more suitable for some tasks than for others, which is why I've always cultivated in myself a fondness for many technologies. Like an action movie hero carrying knives, ninja stars, a 9mm handgun and a grenade launcher, I like to be prepared for a variety of automation situations. In the end, it's not about the tool I use -- it's about the hours I save.

When you're an automator, everything is a button, or ought to be. I go looking for tasks where my fellow admins spend hours, repeating the same sequence of actions over and over in a GUI or whatever. I then create a button -- some automation unit, like a PowerShell script or an Orchestrator runbook -- that accomplishes the same task automatically, with as little intervention as possible. Hellz yes, I'll jury-rig when I have to. The goal is saving hours. Then I'll document that task -- even if only on my resume -- as hours I've saved.

When the time comes to argue about raises, or a new job, or a promotion, or whatever -- I've got my ammunition. Don't think I need more than a 2 percent raise after I made two of my team members superfluous? Maybe I should float my sk1llz around the other companies in town and see if any of them have more value for an automator.

Remember, back in the good ol' MCSE days, when you could change jobs and get a 20 percent boost in salary? You'd be surprised how many organizations will still offer that kind of bump -- just not for four letters. For man-hours.

Posted by Don Jones on 10/14/2013 at 11:00 AM


System Center's MDM and BYOD Strategy

For months now, I've been bemoaning -- to pretty much anyone who'll listen to me -- Microsoft's mobile device strategy, or seeming lack thereof.

When Windows Phone 7 was announced, I thought, "Aha! This is how Microsoft's going to compete! They'll leverage their deep relationship with business and produce a mobile phone that's cutting edge and manageable, unlike everything Apple and Google have thrown at us!" I note that recent Samsung devices are an exception; Sammy's been getting enterprise-savvy in recent months.

When Surface was announced, I thought "Aha! This is how Microsoft's going to compete! They'll leverage their deep relationship with business and produce a tablet that's cutting edge and manageable, unlike..."

Yeah, not so much. Microsoft seems so "et up" with Apple Envy that Windows Phone and Surface RT both turned out to be almost purely consumer plays (Surface Pro doesn't count; it's not a tablet, it's an Ultrabook with a removable keyboard, which is fine). Nothing in Windows RT or Windows Phone really pointed to proper enterprise manageability. No Group Policy love. No real anything love. Ugh. I keep telling people that I wish Microsoft would spend a bit less time worrying about the phone my Mom buys, and ship a phone my CIO could love. Fewer organizations would feel the need to cave to BYOD if a viable corporate-friendly alternative were available.

Or maybe BYOD is inevitable. Certainly, fewer organizations are paying for those expensive smartphones and their data plans, now that BYOD is rampant. "Heck, if users are willing to pay for these things and use them to check work e-mail... um, OK." But BYOD still has massive downsides, and Microsoft's tablet and phone folks just didn't seem to be attacking the problem.

Leave it to the System Center team, specifically the System Center Configuration Manager (SCCM, although I'm told I'm supposed to call it ConfigMgr these days) team. With SCCM (I'm old-school) 2012 R2, these folks have come up with a brilliant solution that recognizes not only the importance of MDM, but the stark reality of BYOD. They're rolling out a "Company Portal" app, which users can download from their device's respective app store, and use to enroll in their organization's infrastructure. SCCM will understand the difference between a BYOD situation and a company-owned device (you tell it which situation a device is in), and offer appropriate data and manageability. For example, company-owned devices can be more deeply managed and completely wiped; BYOD devices can have company-deployed apps and data removed, but that's all. Once a device is enrolled, you get inventory, app deployment, and even a great degree of configuration enforcement through SCCM's configuration auditing feature set. The Company Portal app, along with native device features, essentially acts as a local SCCM client.

The Company Portal app also provides an "in" for sideloading enterprise apps without going through the device's native app store. Typically, the Portal app accepts the organization's deployment certificate, which would need to be obtained from Apple or Google or whoever, which enables sideloaded apps to execute on the device. It's a lot like the TestFlight app for iOS, which allows developers to recruit their friends, and "push" app builds to them, bypassing the store during testing phases. That means organizations can offer mobile apps to users -- whether those apps were developed in-house or brought in from a vendor -- and drop the apps directly on the device, bypassing the device's store. Those apps can similarly be wiped -- along with their data -- on demand.

Note that all of the MDM features of SCCM are actually conducted through an InTune subscription; InTune does the management, and integrates with SCCM for a simpler and more centralized administrative experience. It's another clear sign that Microsoft's future consists of a Cloud+Local=Hybrid environment.

For me, this is just one more example of how Microsoft's back-end business units really "get it." Buying Azure? They're happy to have you run a LAMP stack in Azure... you're paying for Azure, and that was their goal. Standardized on iOS, or just inundated by it? SCCM is happy to manage it... 'cuz you bought SCCM, and that was the goal. It's as if Microsoft -- or at least that back-end portion of the company -- has said, "so long as we own the back-end, we don't really care what the front-end is doing, so we're going to be as embracing of the different front-end ecosystems as possible."

Of course, it's a journey. The current MDM capabilities from InTune and SCCM aren't 100 percent complete, but they're pretty darned impressive. Each new release (InTune is on a quarterly rev cycle, like many of Microsoft's cloud offerings) brings new capabilities to Windows RT, iOS, and Android, along with Windows Phone. "Do whatever you want on the front-end," Microsoft is saying. "Our business unit makes money when we're managing it all."

Bravo, Microsoft.

Posted by Don Jones on 09/12/2013 at 2:36 PM


TechNet Subscriptions, Part 2: I'm Not Crazy, and Neither Are You

Back in the day, I subscribed to the very first Microsoft TechNet offering, which consisted primarily of a CD-based offline Microsoft Knowledge Base. It was tremendously helpful, and only cost a couple hundred bucks. Until recently, you could pay not much more to have access to non-expiring evaluation software of Microsoft's major business products, with the Knowledge Base being better served from the Internet. Sadly, TechNet Subscription's days have come to an end, a move many see as evidence of Microsoft's increasing detachment from the "IT Pro" side of their audience. So what replaces TechNet?

First up is Microsoft's official answer: free-to-download product evaluations and "test drive" VHDs. Unfortunately, these expire, so while they could certainly be useful for evaluating software that you might purchase, they're not much good beyond that. Even for evaluation purposes, many have commented that more complex trials can simply take longer than the allotted time. Trying to do a comprehensive eval of the major System Center products can take months just to set up properly, before you even get into the testing and actual evaluation stage.

Beyond those free eval downloads, your options depend a bit on what TechNet was doing for you. Reading the comments in a recent article I wrote about TechNet, it's clear that my perspective comes from using TechNet in large enterprise-scale environments, but that a myriad of other uses exist.

Let's start with that perspective of mine. Unfortunately, I definitely worked with customers whose companies purchased a TechNet subscription, and then used it to set up permanent lab environments that were used by multiple IT team members. These environments weren't often used to evaluate Microsoft software, but instead were used to stand up a small-scale mirror of the production environment so the organization could test in-house software, deployment scenarios and so on. That use seems to be a pretty clear violation of at least the letter of the TechNet license, if not its spirit, and those organizations will have to pony up more money going forward to have that same lab environment.

MSDN is one option, with the cheapest "MSDN Operating Systems" package at $700 for the first year and $500/year thereafter. You don't get server products with that, though, only Windows itself; to get all the server products like Exchange and SharePoint, they'll be spending at least $6,200 for the first year and $2,600 thereafter. But the MSDN license is still for a single user -- it really isn't intended to be used in a shared lab environment. These companies could, I suppose, buy an MSDN subscription for each team member, but that's going to add up. Still, it won't add up as fast as paying for production licenses for all of those products, which is the other option. Some of the organizations I work with are looking at the idea of negotiating "lab licenses" as part of their overall licensing agreements with Microsoft, to get those server products at a lower, non-production cost.

Another perspective is that of the very small -- even one-person -- consultancy that needs to build out simulated customer environments to diagnose issues, test new techniques, and so on. That's a use explicitly allowed by MSDN, but at close to 10 times the price of TechNet, it's been a bitter pill for these small businesses to swallow.

CloudShare.com offers a possible alternative, although the financials aren't necessarily more compelling. For about $500, you can get an online "environment" that contains up to 8 GB of RAM and 10 vCPUs to be shared between a set of cloud-hosted virtual machines. Those VMs can be built from templates that include the OS and software licensing, and products like SharePoint are amongst the templates offered. VMs run for only an hour or two unattended, then they suspend automatically -- making them useful for a true lab, but not for production use. Upgrade fees get you more RAM, more CPU, "always on" capability, and more -- and those add-ons add up quickly. For a small consultancy looking to build one environment per customer, it might be possible to justify the price... but it's certainly a compromise situation. Larger environments can get incredibly expensive (think $6K/year and up), though -- and it certainly isn't a stretch to think that larger environments would be needed.

What about Azure? Unfortunately, Azure's pricing model is intended for production uses, and it's an expensive lab option. Its pricing also doesn't include server software aside from the OS itself, so you're really just shifting the hypervisor to the cloud, not creating an alternative to the software that was licensed as part of TechNet. I could certainly see Microsoft positioning some kind of "Azure Lab" product, providing VM templates that include software and perhaps doing something to restrict run-time or incoming connections to tie the lab to non-production uses, but there's currently no such offering. And, as many have pointed out, cloud-based labs aren't going to meet the need in every situation.

A third perspective is that of the hobbyist, student, and self-learner -- the folks who build a "basement lab" in their home, and who were in most cases paying for TechNet out of their own pocket. These folks might be able to get by with an option like CloudShare; it depends a bit on what-all they were testing. Certainly, they can't build out complex networks, which in many cases was part of the point. CloudShare and options like it don't offer the full range of MS products, either -- you won't be doing System Center experiments, for example. Like the smaller-sized consultancy, these folks are sadly screwed, and it's a shame, because they're also some of Microsoft's biggest fans. It'd be nice to see some sort of "spark" program from Microsoft that provided software licenses to these people, at a low cost, for their own self-learning purposes... but that's pretty much what TechNet was, and the company's made its interest in that audience pretty clear.

A fourth use-case is the small business that maintains a small lab environment, but that isn't engaged with Microsoft at the Enterprise Agreement (EA) level. In other words, they don't have the negotiation option where they can try to wrangle some lab-use licenses from the company as part of a larger licensing scheme. MSDN pricing for even a small team of IT folks still seems cost-prohibitive; you could end up spending as much on MSDN licensing as you do on healthcare for those same employees. Now, I do believe that labs provide a business benefit in terms of increased stability, consistency, and reduced headache, and I do believe businesses should expect to pay money for those benefits. It's called Return on Investment, right? A 10-person team could have used TechNet for about $3,500 a year; MSDN would be more than seven times as much money. It does seem as if some middle ground should exist. Again, these folks seem a bit left out in the cold, or at least left in some kind of "gap" in Microsoft's strategy.

So what could the company do?

We've actually got some glimmerings of precedent here. SQL Server, for example, allows you to run a free "hot spare" instance of the product in scenarios like log shipping. Provided the hot spare server doesn't do any production work, and is only used for failover, you don't have to buy it a separate SQL Server license. There's no enforcement of that licensing; it's something Microsoft would catch in an audit, and by and large it trusts its customers to comply with licensing terms. So let's pull that thread a bit. What if, for example, your Software Assurance (SA) agreement allowed you to mirror the covered software in a non-production "mirror" environment? In other words, what if every SA-covered piece of software included the right to use that software once in production, and once in a production-mirror lab environment? For many organizations of varying sizes, that'd solve the problem instantly. It'd add real value to the sometimes-sketchy SA proposition, too, which would be good for Microsoft. It wouldn't require any extra infrastructure on Microsoft's part. It isn't a complete solution; evaluating new Microsoft software would still be dependent upon 180-day trials, but it would solve a big chunk of the problem for organizations using SA.

Of course, that isn't a one-size-fits-all solution. Your basement hobbyist, your smaller consultancies, and the significant number of businesses who don't use SA would still be left out. Microsoft might manage to get more businesses onto SA if lab licensing was an included benefit, but it still wouldn't be everyone -- not everyone can afford the SA surcharge of 25 percent of the product price per year, and smaller businesses would be impacted the most by that surcharge.

Frankly, Microsoft's not offering a lot of answers. Most folks fully agree that maintaining a lab environment of any size should cost something; nobody's asking for free software. Microsoft's official answers so far -- free evals -- don't support a permanent lab, which organizations clearly need. Microsoft seems to be saying organizations either have to pony up for MSDN on a per-IT-user basis ($$$), or outright buy full software licenses ($$$$), neither of which is exciting their customers or their community. Cloud-based options only meet a very small subset of lab needs, and it seems as if Microsoft is either unaware of, or uncaring about, the wide array of things TechNet Subscription was being used for.

Honestly, I could actually buy the idea that the company just didn't know how customers were using TechNet. As I wrote in the beginning of this article, my perspective is coming from large enterprise, and those organizations can better afford to build out labs and negotiate licensing to some degree. It's certainly possible that Microsoft didn't consider the small consultancy, small businesses, or even hobbyists and students. Of course, by this point, the hue and cry should have made it apparent that those scenarios do indeed exist, and Microsoft hasn't given any indication that it cares.

It's a frankly confounding situation. There's a lot of anger from Microsoft's community over the entire issue, in part because it just makes so little sense. The uses to which most people put their TechNet Subscription -- entirely legit uses -- benefit Microsoft in a number of ways. The company's answer -- time-bombed trials -- doesn't satisfy many of those uses. The company could have met those uses by providing things like connection-limited trials, as opposed to time-bombed trials; both would prevent production use, but connection-limited software will still support a permanent lab or learning environment. But... Microsoft has chosen not to go that route, at least so far.

Much of the anger around the TechNet Subscription is not, I feel, about the discontinuation. It's the fact that Microsoft seems to have turned a blind eye to such a large and diverse portion of its customer base, taking something away and offering nearly nothing in return. For folks who rely on TechNet, it's entirely analogous to Microsoft simply discontinuing, say, Exchange Server, and suggesting that everyone hop on Outlook.com instead. The anger comes from bafflement that the company would so completely disregard its own customers.

I'd like to say that it'll be interesting to see how this plays out -- but I suspect it won't have a satisfying conclusion. While I'll entirely admit that my own feelings about TechNet Subscription weren't strong, mainly due to the nature of the environments in which I work, I'll also admit that I'd not considered a huge number of use cases where TechNet made all kinds of sense. Microsoft's sin here is, it seems, in not making that same admission about itself. The first step in solving a problem is admitting you have one, and Microsoft doesn't, to this point, seem interested in taking that step.

Posted by Don Jones on 09/03/2013 at 11:42 AM


The TechNet Subscription Thing: You're All Nuts. Or I Am.

Probably every Microsoft IT Pro in the world knows that the company has discontinued its paid TechNet subscription program, which offered full-version, non-expiring Microsoft software for evaluation purposes only.

The vitriol over this issue is amazing. I'm impressed that IT pros are getting so involved and vocal about what they see as a major snafu.

Let me briefly quote a few passages from the now-defunct Subscription License Agreement:

"Only one person may use or access a single subscription or any subscription benefits;" if the company subscribed, then the company "must assign the subscription to only one person, and only that person may use or access the subscription or any subscription benefits." In other words, the software you got through TechNet was for one human being only to use.

"You may install and use the software on your devices only to evaluate software." Now, they give us expiring eval software for free, and I don't know why an eval would take longer than the extremely long eval period (typically 180 days), and I don't know why you'd want to pay for something that is given away for free. If you do enjoy paying for stuff that's already free, I happily accept checks for your use of Google.com.

"You may not use the software in a live operating environment, in a staging environment...." In other words, you're explicitly not allowed to use TechNet software to build out a permanent test lab. You need a test lab? You gotta pay for it. It's cheaper than production because you don't have to buy CALs, since you have no clients in the lab, but you gotta pay for the software in a permanent lab environment. From the few conversations I've had, this is the big thing people are going to miss. Because that's what they were using it for.
"You may not use the software for application development." That's what MSDN is for.

"You may not share, transfer...or assign your subscription." It's yours and yours alone. So if you set up some TechNet software and you and your colleagues worked with it... well, that's not what you paid for.

Frankly, I think Microsoft probably took this move simply because of rampant license abuse.

But we need a lab environment! Of course you do. Obvious statement. TechNet wasn't it, though. If you're a big enough organization to have a lab environment, you're probably on an Enterprise Agreement or some other contract. Negotiate free lab licenses from Microsoft. Ask for like a 1 percent overage (or whatever your number is) to populate your lab, so you can test and use software on a continuing basis.

I know. Labs take a long time to set up. But you weren't given TechNet for labs. You were explicitly not supposed to use it in labs. It was for evaluating software that you don't own, to see if you want to buy it. That was it. If you can't do that in the 180 days provided, your boss didn't plan the project out very well. And the "they don't provide evals for past products" complaint is a bit silly. Why would you buy a past product? If you want a lab to run Windows 7... it's probably because you already own Windows 7. You're not evaluating it at that point. You bought it.

Yeah, I know folks took forever to catch up to Windows 7, and now Windows 8.1 is on the horizon. I actually sympathize with that point. A lot. But I imagine Microsoft doesn't. And for that matter, guys, we're talking about Windows 7. What's a license of that going to cost you so that you can spin up an eval environment? 180 bucks? C'mon. If you know you're going to deploy it and you're looking to build a test/pilot lab... that's not evaluating. Microsoft wants you to pay for those licenses, and it is Microsoft's software. So negotiate with it. "Hey, we'll deploy this, but we want 10 free copies [or whatever] to play with."

Now, MSDN is an alternative. A mega-pricey one. Microsoft could absolutely produce a "Lab License Pack" or something for IT pros. Maybe they should. Maybe they're entitled to full pricing for their software. I honestly don't know on this point. I'd like to see MS enabling labs, because it's a smart business practice I feel they should encourage. That said, labs bring significant benefit to business in terms of stability and reliability -- and those two things typically cost money.

Azure? Maybe. It's certainly a stopping point between "free" and "full price." But not everything is available in Azure. Yeah, you could build VMs, sure, but you still have to acquire a valid license for everything but the OS. I can't imagine building a client OS deployment lab in Azure.

I don't think the problem here is IT pros themselves. Y'all aren't crazy, and neither am I -- and neither is Microsoft. This whole issue is mainly a disconnect, I think. You need long-term lab/test/pilot environments. You believe the company is taking away the resource you used to build that environment. They're not. They never gave you that resource -- TechNet was for evaluations, not testing. Not piloting. Not labs. It was for test drives, and as we all know, you don't get to keep the car when you test-drive it. You gotta buy it.

So: You continue to have an unmet need. Of course you need a lab/test/pilot environment. You need to try a ton of scenarios, and it takes longer than 180 days. You've already bought the software -- Microsoft is telling you that you have to buy it again to build your sandbox.

You want to pay nothing for the lab (me, too). Microsoft wants full price. There must be some middle ground.

What is it? Negotiating lab licensing into EA (or whatever) contracts? Cheaper versions of the software for lab use (how do you prevent it from creeping into production)? Free, non-expiring, "limited" lab editions (e.g., max 10 connections)?

What would work for you that you think you could convince Microsoft to offer?

(And please, I know this is a charged topic -- but polite, professional, realistic responses are best here -- I do plan to collect those and forward them up the chain of command I have access to within Microsoft. Help me make your argument.)

Posted by Don Jones on 08/29/2013 at 11:46 AM


What To Do in a Post-TechNet World

There's been much ado about Microsoft's cancellation of TechNet subscriptions. Officially, the company says it's already giving you those eval installs for free, so why charge you for the service? Unofficially, we all know we're annoyed because the non-expiring TechNet subs were the basis for our persistent lab environments… even though that use was, ahem, technically against the subscription license. Er.

Pushing IT pros off to MSDN isn't the answer, at least not with the current MSDN packaging and pricing. It's more than most need, and it's expensive for what they do need.

A better solution would be to see Microsoft formally embrace labs, and I think we as customers should press them to do this. It'd actually be simple.

Make a "Windows Server Lab Edition" for each version of Windows you ship. Make it the same as Datacenter, but hard-limited to some small number of inbound network connections -- thus inhibiting its use as a production server. Then have the various other products teams (SQL, Sharepoint, Exchange, etc) make similar "Lab Editions" that simply won't install on anything but the Lab OS. Charge a few hundred bucks a year (a la TechNet) for permission to use these non-expiring Lab Editions, and be done with it.

Helping us inexpensively build persistent lab environments helps Microsoft, because it helps us deploy new software more reliably and more quickly -- because we can test it.

Of course, for a few hundred bucks a year you can get the cheapest MSDN subscription, which gives you "non-lab," non-expiring server software for use in a test or dev environment. Maybe that's the answer, after all.

Posted by Don Jones on 08/20/2013 at 2:16 PM


PowerShell 4: Desired State Configuration a Must-Have Feature

By the time you read this, the wraps will be off of PowerShell 4 and its signature new feature, Desired State Configuration (DSC). The shell will ship first with Windows Server 2012 R2 and Windows 8.1, and will be a part of the Windows Management Framework (WMF) 4.0. We can expect WMF 4.0 to also be available for Windows Server 2012 and Windows 8; I also expect it'll ship for Windows 7 and Windows Server 2008 R2. We should know by the beginning of July if that expectation holds true, and if other, older versions of Windows will be supported (my personal bet: no).

So what is DSC? In short: disruptive.

With DSC, you use PowerShell to write very simple (or complex, if you like) declarative "scripts." "Script" isn't really the proper word, as there's no actual programming. You're really just building an exaggerated INI-esque file, where you specify configuration items that must or must not be present on a computer. You also specify specific "DSC resources" -- a special kind of PowerShell module -- that correspond to your configuration items. For example, if you want to ensure that IIS is installed on a computer, then there has to be a corresponding resource that knows how to check for its existence, and how to install or uninstall it. Microsoft will ship some basic resources with WMF 4.0, and because this is now a core OS feature, we can expect other product teams to follow suit in upcoming cycles.
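
Here's what one of those declarative "scripts" looks like in practice -- a minimal sketch using the WindowsFeature resource that ships with WMF 4.0 (the node name is invented):

    # Declare what the machine should look like; no procedural logic
    Configuration WebServerBaseline {
        Node 'SERVER01' {
            WindowsFeature IIS {
                Name   = 'Web-Server'
                Ensure = 'Present'
            }
        }
    }

    # Running the configuration produces a MOF file in the output path
    WebServerBaseline -OutputPath C:\DSC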

PowerShell compiles your "script" to a MOF file, and you then ship that off to your computers. You can do that by deploying it over PowerShell Remoting, via Group Policy, or however else you normally deploy software. On those computers, PowerShell 4 again kicks in to "run" the MOF. It checks your settings and makes sure the local computer is configured as desired (hence the name of the feature). That re-runs every 15 minutes. You can also configure your computers to check a central URI for their declarative configurations, meaning they can periodically (every half-hour, by default) check in to see if there's been an update. In this "pull" model, you can also centrally locate DSC resources, and your computers can pull down the ones they need on-demand, so you don't have to manually deploy those resources before relying on them in a configuration.
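
In the simple "push" model, applying the compiled MOF to a target is a single cmdlet:

    # Push the configuration and watch it apply
    Start-DscConfiguration -Path C:\DSC -Wait -Verbose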

Where do I personally see this going? For one, replacing the anemic "configuration auditing" feature in System Center Configuration Manager (SCCM). That's a pretty sure bet, although it's hard to tell when it'll happen. Frankly, I can see this supplementing Group Policy objects (GPOs), if not outright supplanting them in time. After all, a GPO isn't much more than a bunch of registry writes, something DSC is already well-equipped to handle. Your DSC scripts are also a lot easier to version-control, back up and so forth.
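Since a GPO is mostly registry writes, the built-in Registry resource can already express that kind of setting declaratively. A hypothetical example (this sits inside a Node block, like the earlier sketch):

    Registry DisableAutoRun {
        Ensure    = 'Present'
        Key       = 'HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer'
        ValueName = 'NoDriveTypeAutoRun'
        ValueData = '255'
        ValueType = 'Dword'
    }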

What's exciting is how extensible DSC is. Those under-the-hood "resources" are actually just PowerShell modules that internally conform to a specific naming pattern. That means anyone can write one of these things. You could write resources that deal with your internal line-of-business applications, without needing to wait for an application vendor or developer to provide the resource to you. Anything you can do in a PowerShell script can be done in one of those DSC resources. With DSC in place, you'd configure a computer by simply pointing it to the right declarative file. "Hey, you're a Web server -- here's what you should look like. Get to it, and stay that way." Reconfiguring is as easy as changing your script: "Hey, all you domain controllers. I want you to look like this now. Go." The machines handle the reconfiguration on their own.
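To sketch that naming pattern: a resource module exports three functions -- Get-TargetResource, Test-TargetResource and Set-TargetResource -- and DSC runs Test first, calling Set only when the machine doesn't match. Everything else below (the parameter, the install-check logic) is hypothetical filler, and a real resource also needs a small schema file alongside the module:

    function Get-TargetResource {
        param ([Parameter(Mandatory)][string]$AppName)
        # Report the current state as a hashtable.
        @{ AppName = $AppName; Ensure = 'Present' }
    }

    function Test-TargetResource {
        param ([Parameter(Mandatory)][string]$AppName, [string]$Ensure = 'Present')
        # Return $true if the computer already matches the desired state.
        Test-Path "C:\Program Files\$AppName"
    }

    function Set-TargetResource {
        param ([Parameter(Mandatory)][string]$AppName, [string]$Ensure = 'Present')
        # Make it so: install or remove the line-of-business app here.
        Write-Verbose "Installing $AppName"
    }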

It's an incredible feature that'll take some time to become fully fleshed-out -- but this is clearly the "policy-based dynamic systems management" that Microsoft has been alluding to for years, and thus far failing to deliver. Now, the framework is in place.

Posted by Don Jones on 06/10/2013 at 1:14 PM | 0 comments


Microsoft IT Winds of Change Part 3: Becoming the New Microsoft IT Admin

Things are changing in the Microsoft IT world. It's happening slowly, but it's happening. We've reached an inflection point, or are reaching it soon -- and whether or not today's IT administrators continue to have a job (or at least, the same job) is very much in question.

Why Microsoft IT Admins Are Losing Their Jobs
How happy would you be if, every time you needed a light turned on in your house, you had to call the power company and have them send a tech out to rig up the wiring? Not happy. But you don't do that, because that isn't how utilities work. They're largely automated.

And IT needs to become that kind of utility. Not because we should (although of course we should), but because the business is going to demand it. Large competitive companies are already doing it, and they're loving it. IT is a utility for them. When they want to roll out some new business service -- a Web site, a VM, whatever -- they no longer have to account for the IT overhead needed to implement the service. They've already invested the overhead in automating it all. They know that, once they decide to go ahead, they'll just flip a switch, a miracle will occur, and the new service will be online. It'll be backed up, monitored, managed, patched, and everything -- automatically.

You see, the tools all exist. A lot of Microsoft IT admins just haven't been paying attention. PowerShell's here. System Center is doing this stuff already. OS-level features are supporting this kind of automation. The VMware folks have got a lot of the same automation in place. This is all possible, not some vision of the future.

And what the big companies do successfully, the smaller companies will eventually see and want. So if you're an IT pro who's used to Remote Desktop, wizards, and doing stuff manually each time it needs to be done, you're going to be out of a job. The good news is that you can have another, better-paying job -- the job of creating the automated processes your organization needs.

The trick here is that a given organization needs markedly fewer Automators and Toolmakers than it needs button-clickers. So not everyone in your IT team is going to be needed. We're already seeing a significant falloff in mid-tier IT jobs, and a slight falloff in entry-level jobs. Just like every other part of the IT world, ever, Microsoft IT is consolidating into fewer, higher-end, higher-paying jobs.

So if you think PowerShell, System Center, and related tools and technologies "aren't my job" as a Microsoft IT admin... well, I'd like fries with that, please.

How You Become the New Microsoft IT Admin
Even Microsoft's certifications reflect the fact that Microsoft is marching toward this new inflection point. The new Server 2012 MCSE requires basic System Center knowledge, and the new "MCSE Private Cloud" certification incorporates nearly the whole darn System Center family.

Could you bet against Microsoft being successful in this push of theirs? Sure. Heck, I live in Las Vegas; we'll bet on anything. But you're on the losing side of this one if you think your IT job will always be the same. The economic pressures behind Microsoft's direction are too high. This isn't a direction Microsoft chose; it's a direction forced upon it by today's business realities. We need to do more and more and more, with less and less and less, and automation is the way to do it. Companies are realizing they'd rather pay a few people a lot of money to make everything automated than pay a lot of people less money to do it all manually. Is that fair to all the Microsoft IT folks who joined up thinking they wouldn't have to be programmers and toolmakers? Nope. But life's not fair.

Rather than betting against Microsoft on this, and getting crushed under the coming wave, get in front of it. Actually, you're too late to get in front -- but you can at least pull ahead a bit. Start looking at every process that your IT team does more than once, and figure out how you'd automate it. You're going to need to know all about System Center's capabilities -- all of System Center. You'll need to learn PowerShell, and it's going to take a serious effort, not just reading people's blogs and piecing together commands. You're going to have to learn to research, to find automation tools and technologies that fill whatever gaps you find.

And your employer may not pay for all of the training you'll need -- but it's an investment in you. Get your employer to at least put automation on your annual goals -- "automate 10 person-hours worth of manual labor each quarter" or something. Something you can measure, something that forces them to invest, since they're getting the lion's share of the eventual return on that investment. Commit to doing those things, so that you're the toolmaker in your organization. The one the company can't live without.

Because the alternative is not good. Remember, the square ones are the fish, and the round ones are the burgers.

Good luck.


Posted by Don Jones on 03/27/2013 at 1:14 PM | 0 comments


Microsoft IT Winds of Change Part 2: Call for Smarter IT Pros and Private Cloud Expertise

Things are changing in the Microsoft IT world. It's happening slowly, but it's happening. We've reached an inflection point, or are reaching it soon -- and whether or not today's IT administrators continue to have a job (or at least, the same job) is very much in question.

Now Hiring Smarter IT Pros
Microsoft has moved firmly into the platform world, with many of the native product administration tools being almost afterthoughts. Use Active Directory Users & Computers to manage a large directory? I don't think so.

Microsoft has realized that it can never build tools that will meet everyone's needs, so the native tools are just the bare-bones basics a small organization might be able to get by on. Instead, Microsoft is focusing more and more on building platforms with great functionality. But how do you administer those platforms?

This is the new inflection point in the Microsoft IT world. Increasingly, Microsoft is giving us the building blocks for tools: Application Programming Interfaces (APIs) that let us touch product functionality directly and build our own tools. Microsoft even has a word for the new IT discipline: DevOps. In part, it means operations folks (admins) taking more responsibility for programming their own tools, so they can implement their organization's specific processes.

Yes, programming. In many cases, the new operations API will be Windows PowerShell -- but not in all cases. You'll also be using tools like System Center Orchestrator, and may use ISV tools that let you build out your business processes.

In a way, this is completely unfair to the loyal Microsoft server fan. They got on board by clicking Next, Next, Finish, when the rest of the world was off running command-line tools and writing Perl scripts. Now, Microsoft is yanking the rug out from under them. "Psych! Turns out you have to be a programmer after all!"

But there's a reason for it -- and you can either embrace that reasoning, or close your eyes and wait for it to run you over.

Forget Private Cloud. Call it Util-IT-y.
So why is Microsoft so focused on making its loyal IT professionals become scripters and programmers? Funnily enough, it's the private cloud.

Go over to GoDaddy and buy a Web site for yourself. Or go to Amazon and buy some AWS time. In neither case will you find a human being Remote Desktop-ing into a server to spin up a new VM, provision your service and send you a confirmation e-mail. It's all done automatically when an authorized user (you, a paying customer) requests it. Hosting organizations like GoDaddy and AWS don't treat IT as overhead; they treat it as an enabler. Their IT folks focus mainly on building tools that automate the entire business process. Someone buys a Web site, and an automated process kicks off that makes it happen. Nobody sits and monitors it or even pays much attention to it -- it's automated.

That kind of functionality is where "private cloud" got its name. The idea is that your own datacenter (if we're still allowed to call it a "datacenter") exhibits those cloud-like behaviors. Marketing needs a Web site? Fine -- they push a button, a requisition gets approved, and lo, there is a Web site. IT doesn't get involved. We built the tool that made it happen once all the right approvals were in place, but we didn't click the individual buttons to set up the VM and the Web site. We automated it, using tools like System Center, PowerShell or whatever.
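As a deliberately tiny sketch of one of those "buttons" -- the function name, paths and host header here are all hypothetical, and in real life this would sit behind the approval workflow in something like Orchestrator:

    # One function call stands in for the whole provisioning process.
    function New-CorpWebSite {
        param ([Parameter(Mandatory)][string]$SiteName)

        Import-Module WebAdministration

        # Create the content folder, then the IIS site pointing at it.
        $root = "C:\Sites\$SiteName"
        New-Item -Path $root -ItemType Directory -Force | Out-Null
        New-Website -Name $SiteName -PhysicalPath $root -Port 80 -HostHeader "$SiteName.corp.example.com"
    }

Marketing's button just calls New-CorpWebSite -SiteName 'SpringCampaign' once the requisition is approved.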

But I hate the term "private cloud." I really do. I much prefer the term utility.

When your organization needs a new fax phone line, you go through some internal business process to obtain the necessary authorization. The phone company isn't involved in that. Once you have approval to pay the bill, you tell the phone company to spin up the line. More often than not, someone on their end pushes a button and lo, a fax line is born. They didn't walk out into the Central Office and manually connect wires together -- that's so 1980. It's all automated, and it "just works." It's a utility.

And that's what IT needs to become. We stay out of the business process. We stop being the gatekeepers for IT services. We stop manually implementing those services. Someone wants something, they get approval and push a button. We just make the buttons do something.

And this is why the private cloud means you're going to lose your job…

Is it evolve or die for the Microsoft IT admin? Don Jones will give you his assessment in his final installment of the Microsoft IT Winds of Change blog series.

Posted by Don Jones on 03/25/2013 at 1:14 PM | 5 comments


Microsoft IT Winds of Change Part 1: Admin Nostalgia Blindness

Things are changing in the Microsoft IT world. It's happening slowly, but it's happening. We've reached an inflection point, or are reaching it soon -- and whether or not today's IT administrators continue to have a job (or at least, the same job) is very much in question.

Getting Nostalgic for Microsoft IT Administration
Do you remember Windows 3.1? Not a bad OS for a home user, and a pretty decent OS for a lot of smaller business people. Well, technically not an OS, I suppose -- it was really an operating environment layered over MS-DOS. But it was easy to use, and a lot of people got pretty good at using it. Ah, Program Manager. I miss ya.

What about Windows NT 3.1 and Windows NT 3.51? Those were Microsoft's first credible attempts at a full-scale, business-class operating system. And with them, Microsoft did something pretty clever: unlike the main network operating systems of the day -- think NetWare, VINES and Unix -- Windows NT was easy to operate and administer. Heck, it looked just like Windows 3.1! You didn't have to memorize obscure commands. You could just click your way to a happy network. Every tool you needed came right in the box: a user manager, a server manager, DNS tools, even a rudimentary Web server. All graphical, and all easy to use. Ah, Program Manager. I miss ya.

That ease-of-use eventually got Microsoft in the door of corporations large and small. As a departmental file server, it was easy to set up and deploy without having to go to the IT department and their big, bad Unix boxes or hardworking Novell servers. And Microsoft built on its success: Exchange Server 4.0 offered a point-and-click, easy-to-administer alternative to cc:Mail and even the old Microsoft Mail. SQL Server came with every tool you needed to run an RDBMS, right in the box. That was Microsoft's pattern: make it easy, and include everything you might need right in the box.

This was an inflection point in the IT world. Suddenly, you didn't need to be an "IT monk." Normal, ordinary people could be IT admins, and hordes of normal, ordinary people took the jump. The release of NT 4.0 with its Win95-like "Chicago" GUI, along with the heavily promoted MCSE and MCSA certifications of the day, saw all kinds of people changing careers into IT. After all... it was easy!

In the IT Universe, Nostalgia Is BS
If Microsoft got its "foot in the door" by making its products easy to set up, easy to administer, and easy to use -- and by including every tool you needed right in the box -- then that's also where Microsoft set itself up for eventual failure.

First of all, not every tool you needed was right in the box. Plenty of organizations ran up against limitations and inefficiencies, and either ponied up for supplemental tools or just decided to scrape by with the native administration tools.

Second of all, "easy" also means "one size fits all." That is, a product can only lack complexity if it isn't very flexible. Organizations quickly started realizing that, and Microsoft responded by building in more flexibility. The problem is, flexibility always comes with complexity. If you can do something only one way, there's no need to make a decision, consider criteria, or anything -- you just do the one thing the one way. As soon as you start having options, you have to decide what to do. You have to weigh pros and cons, decide which option is right for you, and then actually implement that option.

And of course Microsoft still had to actually ship products. Think about that: the OS and the various server products (Exchange, SQL Server and so on) are now more complex, and offer more options, so they obviously take longer to code. That leaves less time for coding tools. And so while the products became more flexible and capable, the tools didn't often keep up. Microsoft started focusing less and less on providing great tools, and instead focused on providing a great platform upon which other people could build the specific tools a given organization might need. Ever try to pass a SOX audit using the built-in auditing and reporting tools? Exactly.

And this paved the way for the new inflection point, which would be almost the opposite of the last...

Look for Don Jones' assessment of what will make up a future Microsoft IT administrator in the next blog post.

Posted by Don Jones on 03/22/2013 at 1:14 PM | 1 comment


Where's Your Runtime Picture Book?

A lot of organizations have a "run book" -- a binder full of step-by-step instructions for accomplishing nearly every major IT task they perform. In fact, run book automation, as implemented by products like System Center Orchestrator, is designed to help automate those tasks.

First Point
As a decision maker in your IT organization, if you don't have a run book, start one. Right now. Make your team document every single thing it does. In detail. First, you'll help preserve institutional memory; second, you'll set yourself up to automate those tasks some day.

Tip: Make it a physical book. Electronic documents are fine so long as the electronics are working, but if something goes offline you'll want your run book to walk you through bringing it back online.

Second Point
Make it a picture book. The simple fact is that, when you need your run book, it's because the person who's memorized a given procedure is unavailable, or because the proverbial fit has hit the shan. In either case, you want a clear, easy-to-follow set of directions. Like a picture book. As you create your run book, document each step of each task using screen shots, not just words, so that the run book is as easy to follow as possible when the need arises.

And then start focusing on automating those tasks using an appropriate tool (like Orchestrator, if that works for you). Performing tasks should be as easy as picking the task and clicking "do it."
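A trivial sketch of that "pick a task and click do it" idea, assuming a (hypothetical) folder of task scripts at C:\RunBook\Tasks:

    # Present a numbered menu of run book tasks, then run the chosen one.
    $tasks = Get-ChildItem -Path C:\RunBook\Tasks -Filter *.ps1
    for ($i = 0; $i -lt $tasks.Count; $i++) {
        '{0}: {1}' -f $i, $tasks[$i].BaseName
    }
    $choice = Read-Host 'Task number to run'
    & $tasks[[int]$choice].FullName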

Posted by Don Jones on 02/28/2013 at 1:14 PM | 0 comments


Clearing Up Some Windows 8 Activation Confusion

With the release of Windows 8 to MSDN and TechNet subscribers worldwide, we're starting to see more and more people setting up their first machines using the final OS code -- and starting to see more questions about some specifics. Unfortunately, Microsoft hasn't been providing much in the way of answers at this point. For example, my colleague, Jason Helmick, contacted me after testing some of the Windows Activation features in Windows 8. I'm providing his narrative below, enhanced with some of my own discoveries and comments in [square brackets]. I'd love to hear your comments and findings, too -- please drop them into the comments area below. With that said, take it away Jason...

Q: I'm confused about Windows 8 Enterprise/Pro activation.
A:
The three download versions of Windows 8 can be somewhat confusing at first, until you realize the purpose for each one. Lack of documentation in this initial stage of release has had more than one person download all three just to see which one they can license.

In my case (TechNet key in hand), I downloaded all three to see which one would take the key. The answer is the standard download (not Pro or Enterprise). But after working with the Enterprise and Pro versions, I ran into the new activation process and had some questions. Without any documentation to explain the new activation process, I ran an initial test. I'm left with more questions than answers.

The Pro and Enterprise downloads are designed to receive their activation through a traditional KMS server or the new Active Directory Based activation (ADBA). The Enterprise version of Windows 8 still supports Multiple Activation Keys (MAK) if that's your preference.

[I'll note here that I was able to help Jason find the right download and confirm his observations. The TechNet "Windows 8 Professional VL" ISO image requires a volume license key and Active Directory- or KMS-based activation; the "Enterprise" ISO likewise requires on-network activation or a MAK. The "plain" Windows 8 ISO will accept a Professional key and activate as Windows 8 Professional.]

So, without a KMS server or MAK available, I decided to test Windows 8 Enterprise to see if there had been changes to the activation process, and to test how long it took the OS to expire when not activated. I'm not a hacker, and I'm not trying to pirate software; I'm just trying to understand, from an administrative deployment perspective, what is going to happen if activation fails. Documentation for this seems elusive at best (or doesn't exist). Here are the questions I had when starting my experiment:

  1. How long does it take before the desktop activation message appears?
  2. How many rearms can I perform?
  3. How long does it take between rearms until the next activation message?

Perfectly legitimate questions if you're deploying Windows 8. After all, we need to know what happens when things go wrong. What symptoms indicate an un-activated copy of Windows? What will users be telling the help desk they're seeing? What can we expect? Crucial concerns, and I was especially curious about the differences between Windows 8 and Windows 7. Hopefully, I thought, they'd be identical.

However, the answers I started seeing in my experiment weren't what I expected. Perhaps some documentation or feedback could help me understand what's happening. Here are the results from my initial experiment. (I'll try more testing soon.)

Action: Changed the clock forward one year (in both Windows and the BIOS) to force the activation message.
Result: It did not attempt to activate, nor did it display a desktop message.
Lingering question: So how is it determining when it's time to activate?

Action: Forced activation with the set-forward clock.
Result: Activation failed looking for a KMS server and, again, no activation message appeared.
Lingering question: n/a

Action: Reset the clock.
Result: n/a
Lingering question: n/a

Action: Checked SLMGR /dlv for information about the activation period (time until activation).
Result: No time period listed, but I was shocked to see 1,000 rearms available.
Lingering question: 1,000 rearms? I can rearm this product 1,000 times? That seems like a lot.

Action: Tried a rearm with SLMGR /rearm.
Result: 999 rearms left.
Lingering question: I wonder if I can script this to decrement the counter to zero?

Action: Tried to script the rearm.
Result: The machine must reboot after each rearm, so I need to make a better script.
Lingering question: n/a

Action: Left the box alone to see when it would display the "activation required" desktop message. The current time was 8:30 a.m.
Result: The activation required message appeared almost exactly 24 hours later.
Lingering question: OK, so the first activation message occurred in 24 hours, and I have 999 rearms. Microsoft couldn't possibly want me to have 999 days without licensing the product. Could they? That's almost three years!

Action: Performed a rearm and waited for the next activation message. In the interim, I parsed the logs for activation events.
Result: While you can see events for the activation process, I was unable to find an event logged when the desktop "Activation Required" message occurred.
Lingering question: Why isn't this logged? Such an event would be useful for monitoring computers for this problem.

Action: Continued waiting after the rearm.
Result: The activation required message appeared on the desktop approximately eight hours after the rearm.
Lingering question: This makes sense. The activation window is getting shorter, forcing a rearm sooner. That explains the high rearm count, since the rearms will get used up quickly if the window keeps shrinking.

Action: Rearmed the system.
Result: The activation required message appeared on the desktop approximately four hours after the rearm.
Lingering question: Again, it seems the time window is closing.

Action: Rearmed the system.
Result: The activation required message appeared on the desktop approximately two hours after the rearm.
Lingering question: The time window is definitely getting shorter.

At this point I had to stop the initial test. I may have made errors in this test, and I want to examine it further. However, it would be nice if Microsoft would explain the behavior so I didn't have to perform new tests.

Here's the question that is bothering me the most, and what I'm going to script and test for next week: After all the rearms, will Enterprise stop working?

In all previous tests, Windows 8 continued to work normally without removing functionality (at least as far as I could determine). I could join it to a domain, remove it from a domain and so on.

So, what happens when you reach the end?

[Thanks, Jason.

This definitely seems to be a new twist on the old activation strategies used by Microsoft. Granted, the Enterprise edition is meant for… well, enterprise use, so it's nice to see generous terms and a shrinking activation window. It would still be nice to know how small that window can actually get (if it keeps halving, it'd be down to nanoseconds pretty rapidly), and what happens if you reach the end.]
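If you'd like to reproduce Jason's rearm loop yourself, here's a rough sketch using the SoftwareLicensingService WMI class. This is my assumption about how you'd script it, not anything Microsoft documents; run it only on a disposable lab machine, and register it as a startup scheduled task if you want it to continue across the required reboots:

    # Check the remaining rearm count, rearm once, and reboot.
    # Lab use only -- this genuinely rearms and restarts the machine.
    $sls = Get-WmiObject -Class SoftwareLicensingService
    "Rearms remaining: $($sls.RemainingWindowsReArmCount)"

    if ($sls.RemainingWindowsReArmCount -gt 0) {
        $sls.ReArmWindows() | Out-Null   # same effect as SLMGR /rearm
        Restart-Computer -Force
    }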


Posted by Don Jones on 10/24/2012 at 1:14 PM | 22 comments


Windows 8: What Microsoft Isn't Telling You

Editor's Note: This blog post was written prior to the news that Microsoft's new interface would not be called "Metro." All references to "Metro" were left intact for clarity.  

Microsoft may have really messed up with its Windows 8 strategy. Its first big mistake was releasing community previews of the new OS while barring most employees from talking about it, or even admitting to its obvious existence. Release code without being able to explain it? Bad move. What messaging we did get was preliminary and off-base... and the company may pay for that. Here's what it should have told us.

It Isn't a "Metro Desktop"
The most controversial feature of Windows 8 is the so-called "Metro Desktop," a term that Microsoft never should have let pass without comment. Metro was never intended as a noun; it's an adjective, as in "the desktop with the Metro styling." And it isn't a desktop at all, obviously -- it's a dashboard, an evolution of the gadget-laden Sidebar of Windows Vista, and very close in functionality to the Dashboard view in Apple's Mac OS X. It's actually a better implementation than Apple's, because its tiles are more organized, and it also serves as an application launcher -- something Mac OS X distributes across a file folder, the LaunchPad view, and some other mechanisms.

In other words, the Metro-styled Start screen is a great idea. Its bad rep comes almost entirely from Microsoft refusing to speak about it for so long, and then not talking about it very well once it did ungag. Sure, there are some heavy restrictions on the apps that can run on this Start screen, but it's mainly meant as a dashboard -- think lightweight gadgets, not full applications.

What's ironic is that the new Start screen is designed mainly to be touch-friendly -- something that's also been controversial, as the thing is practically unusable with just a mouse. Add in some keyboard shortcuts, though, and it's very slick. Start typing application names and up they pop. Hit "Windows+I" for the sidebar menu thing that took me forever to find with my mouse. Eventually, when we start using Apple-esque multi-touch trackpads with our computers (not just laptops), I suspect the new Start screen will be even nicer.

Microsoft just didn't get out in front of this one fast enough. It reminds me of the first ads I saw for Vista, which focused exclusively on the stupid "flip 3D" application-switching feature. Microsoft buried the headline there, focusing on something trivial and not communicating very well on what made Vista different. Microsoft's made that same error with the new Start screen, and the world in general has crucified them for it.

It's Business-Friendly
Windows 8 comes across as a consumer release -- so much so that a lot of businesses are dismissing it out of hand. Again, that's because Microsoft is burying the headline. In proclaiming Windows 8 as touch-friendly, etc. etc. etc., it's forgetting that 99.9999 percent of its customer base doesn't use touch-based devices. But listen to the messaging coming out of Redmond and you could easily imagine that Windows 8 just isn't something businesses need to care about.

Au contraire.

Windows 8's deeper integration with the improved DirectAccess is nothing short of brilliance. The ability to centrally manage VPN-like connection options for your users, who can simply double-click and connect, is awesome. DirectAccess finally works, is straightforward, and is something everyone should look into -- and Windows 8 really utilizes it best.

SMB 3.0, the new file-sharing protocol in Windows Server 2012, gets the best love from Windows 8. Automatic -- nay, automagic -- failover of load-balanced SMB shares means a whole new way to think about clustered file servers.
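On the server side, that transparent failover hinges on the share being created as "continuously available" -- on a Windows Server 2012 file-server cluster it's a one-liner (the share and path names here are illustrative):

    # Clients holding files open on this share ride through a node failover.
    New-SmbShare -Name 'Data' -Path 'C:\ClusterStorage\Volume1\Data' -ContinuouslyAvailable $true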

Oh, and three words: Windows To Go. Part and parcel of the enterprise edition of the product, it enables you to build a secured, encrypted (via BitLocker) corporate image of Windows 8 on a USB thumb drive. Users can pop that into any computer and get the corporate desktop right there -- and when they pop out the drive, the desktop goes away, leaving no traces on the machine. This is so cool I can barely even wrap my head around it -- yet it's a feature getting relatively little spotlight.

It's Just as Good as Windows 7
The deployment technologies, support processes, and almost all the rest of Windows 8 are astonishingly similar to Windows 7. I think, in their race to look as cool as Apple, Microsoft is making Windows 8 seem a lot more revolutionary than it is -- and I mean that in a very good way. Enterprises don't like revolution; we like evolution. Small, incremental changes we can cope with. Just for once, I'd like Windows 7.1, 7.2, and 7.3 -- and Windows 8 actually feels a lot like that. It certainly exhibits about the same level of change as you get going from Apple's OS 10.7 to 10.8 -- some major cosmetic overhaul, some new concepts, but the same basic stuff at the core.

I think you'll have no issues running Windows 8 alongside Windows 7, or even skipping 7 and going straight with 8 if that's where you are in your lifecycle. You need to get the heck off of XP, that's for sure.

Yeah, the new Metro-styled Start screen is going to throw some people for a bit of a loop, but so did the Start menu when Windows 95 was introduced. They'll adapt -- they'll just whine a bit, because they haven't had to adapt to any major changes since 2002, when you deployed XP the first time. And yeah, the new Metro-styled experience isn't comprehensive -- you'll find yourself dumped out of the "Metro Desktop" and into "classic" applications more often than you want, especially when you start fiddling with Control Panel stuff. That's fine. It's not as elegant or complete an experience as we might like, but it's perfectly functional. We -- and our users -- are going to grumble about something anyway, so it might as well be that, right?

Posted by Don Jones on 10/23/2012 at 1:14 PM | 190 comments

