I closed out last year with several articles that prompted you to complete a short online survey. Several of you were kind enough to speak with me on the phone for some follow-up questions, and I'm ready to share some results.
This time I'll focus on my questions about help desk management software. My interest was prompted by the fact that help desk software seems to be so prevalent today, compared to a decade or so ago, when there were only a few major commercial solutions and a lot of home-grown ones floating around (I wrote one myself when I was at Bell Atlantic Network Integration). I'm also seeing more and more solutions being released that incorporate help desk software -- which struck me as odd, because I kind of thought everyone already had something in place by now.
Most of you do have something in place -- BMC, Remedy and HEAT were some of the brands I expected to see, and I wasn't disappointed. Other brands, like ManageEngine's offering and ScriptLogic's product, also cropped up. Interestingly, about a third of you said your solution wasn't well-implemented in your organization, and a fifth that the product was too complex. I see and hear that a lot. It seems like help desk solutions can easily become the IT equivalent of SAP -- extensive implementation times without a lot of results. A fifth of you also said that your solution doesn't offer Web/mobile or self-service interfaces, which in this day and age should be unforgivable. A fifth of you also said that your solution takes "too long to use," which tends to keep people from using it. What a waste!
Every single person who answered the survey, however, indicated that a help desk solution that was tightly integrated with monitoring and configuration management tools would be both welcome and essential -- even though I know most of you don't have such a solution today. I'm actually a bit surprised that Microsoft hasn't bought someone and released "System Center Service Desk" or something, which would, of course, tightly integrate with Configuration Manager and Operations Manager. There are companies playing in that "integrated" space, though, including Nimsoft and ManageEngine, so there's at least some promise for growth there.
All in all, this survey suggests that you're all working harder than ever -- with a lot of manual effort -- to keep your users connected and productive. It's the shoemaker's kids, right? We provide great services and tools for the business, but we seem to get so few ourselves!
Posted by Don Jones on 03/08/2012 at 11:18 AM
Last year I asked you to complete a short survey on your team's essential IT skills. A huge number of you took a few minutes to answer those questions -- thank you! Thanks also to those of you who agreed to speak with me on the phone for some follow-up questions.
The news, unfortunately, is not good. I'd asked about your team's grasp of basics like network troubleshooting, AD basics, and so forth, and almost 50 percent of you said that less than half of your team (but more than a quarter) really understood those basics. Another 25 percent of you said that only about a quarter or less of your team grasped the foundations. That means we're seriously lacking some basic skills -- and I think I know why.
Take a look at nearly any IT-level computer course these days and you won't see much of the basics. They're missing from certification exams as well, even though 80 percent of you said that these skills were "very valuable," with another 15 percent checking in at "valuable." The fundamentals have been pushed out by the ever-increasing number of features that have to be taught, and by some economic realities.
Look, let's just admit that Windows Server 2008, as one example, is a lot more complicated than Windows NT 3.51. Earning an MCSE in Windows NT 3.51 required you to pass six exams, which were supported by about four or five weeks' worth of Microsoft Official Curriculum training. Earning an equivalent certification in Windows Server 2008 requires basically the same number of exams, supported by somewhat less classroom training. That means you're mostly being taught, and tested on, new features -- while the basics have been squeezed out. Nobody's going to accept a 12-exam certification or send their folks to 10 weeks of training, but in reality today's products are probably complex enough to warrant it. So the exams and classes we get have to stuff more into the same amount of time, and so the foundation stuff just doesn't make the cut.
It's a shame. A huge 90 percent of you said your IT teams would be more effective if essential skills were better understood by a majority of the team, but it's damnably hard to even find training on networking basics, for example. It's like sending your kid to school and having them learn trig without learning basic addition and subtraction, because they just use calculators for that low-end stuff.
Posted by Don Jones on 03/01/2012 at 4:10 PM
If you're working in the IT industry and not prepared to position yourself as a useful resource in the era of the cloud... well, I hope you have a copy of "What Color is Your Parachute?" sitting around.
Whether the cloud is the right thing or not for your company isn't really important. Some companies will make the right decision regarding the cloud, and many won't. The short-term attractiveness of the cloud's pricing model, if nothing else, will make many organizations take the plunge whether it's the right thing to do or not. Fight that decision when it's a bad call for the company; be prepared to benefit from the cloud whether it's the right thing to do or not. In other words, don't be caught flat-footed.
Start by making yourself a semi-expert on the cloud. What's the cost model? How does it compare to your internal costs? The fact is that most organizations haven't the foggiest notion why their IT department costs what it does -- they just see a giant number on the P&L every quarter. Show them a smaller number from "the cloud" and things start to look interesting, at least to a certain kind of manager. Money will always be the first driver in a cloud adoption, so if you think the cloud is a bad call for your organization, know the money answers. Know what IT costs and why, so that you can help promote a real apples-to-apples comparison.
Then, assume your organization will make the decision to push something out to the cloud anyway. Maybe they won't -- in which case you're fine. But maybe they will. And if you can't fight the decision with solid logic, then be prepared to benefit from the situation anyway. Get yourself skilled-up. No cloud-based solution offers zero management overhead; make sure you're the one who can be the hero.
Here's a simplistic example: Your company decides to go with Office 365 for e-mail and much of its SharePoint-based collaboration. Does that put you, the Exchange or SharePoint admin, out of a job? Possibly. But O365 still has management requirements. Someone has to create mailboxes and manage the service, which actually requires an unexpected amount of PowerShell command expertise. Make sure you're prepared to be indispensable when the cloud comes, and you'll be the one to keep your job.
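To give you a feel for what I mean -- and this is just a minimal sketch, with a made-up admin account, not a recipe -- day-to-day O365 management runs through a remote PowerShell session into Exchange Online:

```powershell
# Minimal sketch of remoting into Exchange Online; the admin account is invented,
# and the connection URI is the one the service used at the time of writing.
$cred = Get-Credential admin@contoso.onmicrosoft.com

$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri https://ps.outlook.com/powershell/ `
    -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session

# Once the session is imported, the familiar Exchange cmdlets light up:
Get-Mailbox | Select-Object DisplayName, PrimarySmtpAddress

Remove-PSSession $session
```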
Posted by Don Jones on 02/13/2012 at 9:22 AM
All of today's talk of "clouds" is often accompanied by "private cloud," a phrase that's nearly as overused and useless as "cloud" itself. Isn't a "private cloud" just what we used to call "our datacenter?"
From a technical perspective, yes. The private cloud is just the stuff that's always been in your datacenter. What's different is in how you manage that stuff, and in how you offer it to your organization.
The public cloud has some very specific characteristics that differentiate it from the type of outsourcing we've used in the past:
- Self-service. You spin up new services as you need them, and they come online almost instantly.
- Pay as you go. You're typically billed for what you use: Bandwidth, disk space, processing power, number of users, and so forth.
- Abstractedness. You don't typically have a technology-centric view, meaning you're not necessarily dealing with servers and disks. You get your resources from a giant pool, which from your perspective is infinite.
A private cloud is simply the old datacenter, re-tweaked to offer those characteristics. Being able to create charge-backs based on individual departments' or users' actual utilization is one characteristic that starts to make your datacenter look cloudy. Spinning up virtual machines more or less on demand is privately cloudy. Being able to shuffle VMs around to whatever physical host has the necessary resources to run it, all invisible to your "customers," is private cloudishness.
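Just to make "spinning up virtual machines on demand" concrete, here's roughly what the self-service plumbing looks like with the Hyper-V cmdlets in newer Windows Server builds -- the VM name, paths, sizes and host names here are all invented for illustration:

```powershell
# Rough sketch only; VM name, paths, sizes and host names are invented.
New-VM -Name "Dept-Web01" -MemoryStartupBytes 2GB `
    -NewVHDPath "D:\VHDs\Dept-Web01.vhdx" -NewVHDSizeBytes 60GB `
    -SwitchName "Production"
Start-VM -Name "Dept-Web01"

# Shuffling that VM to whichever host has free resources, invisibly to the "customer":
Move-VM -Name "Dept-Web01" -DestinationHost "HV-HOST-07"
```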
As with the concept of "cloud," the private cloud isn't some amazing new set of technologies: It's just a different way of managing your technologies.
Posted by Don Jones on 02/09/2012 at 9:49 AM
Part of what frustrates me about "the cloud" is that it isn't anything entirely new. There are really only two major things that are driving this new wave of cloudiness in IT:
Multi-tenancy. Software vendors are now offering products that have a built-in understanding that multiple customers will be sharing the same infrastructure. The software thus builds walls between those customers, so they each feel as if they have a service dedicated to them. Service providers have been doing this for years by hacking together custom management consoles -- Web hosts being the leader in doing so. The cloud has taken off in large part because the first-party vendors are now building that intelligence into their products.
Bandwidth. The wide availability of cheap bandwidth -- both wired and wireless, including cell-based bandwidth -- means it's easier and easier to get to your data and services regardless of where it lives. In the past, we used dial-up to get everywhere. It made sense to dial into your company's datacenter, so it made sense to keep all of your services there. Nowadays, you're using the public Internet as your "dial-up." That means you're always on the Internet, and then you use it to reach your data. So your data might as well live anywhere, not just in your company datacenter. Security issues aside, there's no connectivity reason to keep data in the datacenter.
It's the confluence of these two directions that's making the cloud (a) possible and (b) attractive. Ten years ago it was ridiculous to think of outsourcing your Exchange or SharePoint services to someone else; today, it's a question of cost, security and availability -- but it's certainly not a ridiculous question.
For an extreme example, take Microsoft's Windows Azure platform. I'm obviously oversimplifying in the extreme, but Azure is (on paper) a hacked-up version of Windows and SQL Server designed for multi-tenancy. Toss in the ubiquitous availability of Internet bandwidth, and Azure becomes positioned for success. Amazon, Google and other cloud computing options are, at their heart, much the same: Always-on, easily-accessible, multi-tenant services that you could build in your own data center, if you wanted to. But why bother?
There's a third factor that really clinches the cloud deal:
Bargaining. Computing resources have become so cheap that bulk purchases are practically pocket change. Amazon doesn't buy servers one at a time; it buys them by the truckload, and it's getting a per-unit price that's astonishingly low. It buys electricity in bulk, too, often positioning datacenters close to the electrical grid and negotiating transmission rates down to comparatively nothing. Thus, the price that a cloud host like Microsoft or Amazon can offer you is markedly lower than anything you could negotiate yourself.
This is really just good old Moore's Law taken to an extreme. These three points simply prove that the cloud isn't some revolutionary new tactic; it's just a natural evolution of different lines of thinking and technology, finally coming together in what is quite frankly an almost inevitable conclusion.
Posted by Don Jones on 02/08/2012 at 9:28 AM
I think it's a good time for IT Decision Makers to face some stark realities. This "cloud" thing is creeping up on us, and many analysts claim that 2012 will be the year that cloud computing and cloud services really take off. That means, like it or not, you're going to be dealing with something "in the cloud," if you're not already. What's that mean?
Within a couple of years, every single business with more than a couple of employees will, in some fashion, be using "the cloud." Whatever "the cloud" means. Smaller businesses will likely be using cloud-based e-mail services like Gmail or Office 365; many are already beginning to do so. Some businesses will get their cloud-based services -- like e-mail and collaboration -- from a Managed Service Provider (MSP), whose datacenter can now officially be called a "cloud." Even massive enterprises with huge infrastructure investments will, in some way or another, be using something from "the cloud," even if that's nothing more than the cloud-based Web site analytics called "Google Analytics."
I'm seeing an awful lot of IT professionals beginning to live in Cloud Denial. They've spent years honing their skills as Exchange admins, SharePoint admins, SQL Server admins and so forth, and they're full of reasons why this "cloud thing" shouldn't be used in their environments. In some cases, they're correct: For some businesses, certain functions should be in-sourced and not out-sourced. That reason, however, should never revolve around an IT person who fears their job will go away. The decision to in-source or out-source a given service or function should be a 100 percent business-related decision, based upon costs, benefit, control, security, and more.
I recently got an e-mail from a fellow who had just lost his job as an Exchange administrator. His company had been using Exchange Server 2003 (!!!), and when faced with the costs of upgrading to 2010 -- new servers, new software, new training, new architecture and more -- decided it was easier and more financially efficient to outsource its 5,000 mailboxes to someone else. It still had a degree of administration, such as mailbox adds/changes/deletes, that needed to be done, and so it retained the portion of the Exchange admin staff that was needed to perform those tasks. My correspondent, however, had been in denial about the coming of the cloud, and didn't have up-to-date skills (or an interest in obtaining them, from what I could read). So he was let go.
It's unfortunate, but it's going to happen. There may be a zillion legitimate reasons why your company can't outsource some particular function to the cloud, and you should be prepared to make that argument in business terminology. You should absolutely help your company do the right thing. However, you should also be prepared for the "right thing" to include outsourcing, and make sure you're positioned to still have a job if that happens. Frankly, I think it's only practical to also assume that your organization might outsource something even if it's the wrong decision. Companies do make bad decisions, after all, often when looking only at short-term goals. That being the case, make sure you're well-positioned to be retained even if your company does make a bad decision about outsourcing. Don't just fight the tide -- be prepared to swim with it.
Posted by Don Jones on 02/06/2012 at 9:29 AM
After a smooth migration from Google Apps to Office 365, we're pretty pleased.
The Exchange side couldn't work more smoothly. SharePoint -- after a couple of hiccups and re-starts -- is also a blessing. Our lesson? Don't mess with the built-in groups that SharePoint creates when you set up a new site. Sure, you can change their membership, but don't delete them or SharePoint just doesn't work right any more.
Our biggest beef is in how external users get invited into a SharePoint site. As near as we can figure, they have to have a Hotmail address, since that's the only method external users have to authenticate to the O365 system. Seems weird. I realize authentication has to come from somewhere, but forcing folks to use Hotmail just feels awkward. Even requiring a generic "Live" account would be better.
We were pleasantly surprised to discover that O365 is extremely Mac-friendly. I use a Mac myself for my "knowledge worker" tasks, like checking e-mail and writing (Office for the Mac is a wonderful product), and have had zero problems. The Outlook Web App experience is fantastic. My Office Mac applications talk to SharePoint seamlessly. Even the latest build of the Lync Mac client works very well for online presence and chats, although that was the last piece of the puzzle to come together. I've even switched from using the Mac's native PIM apps over to Outlook Mac, and I'm pretty pleased with the experience, and with the feature parity between it and the Windows version. O365 works flawlessly with my iPhone and with other users' Android phones -- my Outlook Tasks even show up in iOS 5's "Reminders" app.
I'm disappointed by the amount of PowerShell we had to use to get our O365 up and running, but those were one-time events for the most part. I do think Microsoft needs to provide a bit more GUI and a bit less PowerShell for those tasks, especially setting up a custom domain name. It's not that I dislike PowerShell (I'm a PowerShell MVP, after all), but O365 just isn't being marketed to businesses that would have any reason to even know what PowerShell is.
That does highlight a good point, though: A lot of Managed Service Providers (MSPs) are miffed at Microsoft for launching O365, because they view it as an end-run to the customer that would normally go to an MSP for outsourced e-mail. Those MSPs can relax. Most SMBs should at least consider buying O365 through an MSP, rather than directly from Microsoft. There are technical bits to set up and maintain, including custom domain names and a lot more. There are caveats in O365 that an MSP can help you understand, like the max-recipients-per-day cap. An SMB with a savvy "tech guy" can probably go O365 on their own; a business that doesn't want to have that on-staff geek is going to have to rent that geek, and he or she might as well come from an MSP that can also provide ongoing support.
Posted by Don Jones on 01/23/2012 at 9:34 AM
To migrate from Google Apps to Office 365, we decided to try a third-party migration tool. We'd been asked to write a competitive analysis of the available migration options, including the built-in one (which is mainly suitable for migrating from an Exchange Server), so this seemed like a good time to try the options. The process went off perfectly, using a self-service option that cost just $10/mailbox and let our users handle their own migrations simply by entering their old and new passwords into a Web page. Mailbox migrations can take a long, long time. Had we been smarter, we might have opted for an admin-controlled migration rather than user self-service, because the tool would have enabled us to filter out all the old trash content that Google was hanging on to. As it was, we discovered that Office 365 starts applying bandwidth throttles after a few thousand messages are added to your inbox, meaning each mailbox took almost a full day to migrate. We later learned that Office 365 support can suspend those throttles during a migration if you give them a call.
Once migrated, we started the process of adding our custom domain name (ConcentratedTech.com) to Office 365. This is where the process becomes slightly less awesome. Right now, you can add the domain name easily enough, but it won't become your users' default send-as domain, meaning we were still sending e-mail from the "onmicrosoft.com" domain name associated with our account. Changing that default involves downloading the Office Live PowerShell cmdlets, opening a PowerShell Remoting session to our Office 365 server pod and running a couple of PowerShell cmdlets. This is something you should be aware of if you're considering an O365 move; frankly, given that O365 is largely targeted to SMBs who don't have a large IT staff, I think Microsoft should make this a bit easier and Web-based.
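For the curious, the work boils down to a couple of commands along these lines. Treat this as a rough sketch of the shape of it -- not the exact sequence we ran -- and assume the Microsoft Online Services PowerShell cmdlets are already installed:

```powershell
# Rough sketch; exact cmdlets and parameters may differ in your tenant.
Connect-MsolService -Credential (Get-Credential)

# Make the custom domain the tenant's default:
Set-MsolDomain -Name "concentratedtech.com" -IsDefault

# Then, in a remote Exchange Online session, fix up a user's default send-as address:
Set-Mailbox -Identity "don" -WindowsEmailAddress "don@concentratedtech.com"
```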
PowerShell cropped up again when we needed to mass-import contacts into the Global Address List (GAL) as external contacts. We used Excel to create a consolidated contact list, and then a couple of PowerShell commands brought that information into the GAL. Again, as big a PowerShell fan as I am, I think this is something that needs to have a Web front-end on it. O365 simply isn't being sold with the "by the way, hope you know how to use PowerShell" message as part of its marketing.
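The import itself amounts to something like the following (the CSV column names here are illustrative, not the ones we necessarily used):

```powershell
# Assumes a remote Exchange Online session is already open, and a CSV exported
# from Excel with "Name" and "Email" columns.
Import-Csv .\ExternalContacts.csv | ForEach-Object {
    New-MailContact -Name $_.Name -ExternalEmailAddress $_.Email
}
```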
With the migration over, we settled in to using O365.
Up next: The Verdict
Posted by Don Jones on 01/20/2012 at 9:43 AM
My company, Concentrated Technology, recently made the leap from Google Apps to Office 365. We mainly work with Microsoft products and we figured "what the heck." In addition, we were really dissatisfied with Google Docs. The inability to create a true folder hierarchy, the difficulty of sharing sets of documents with external contractors...it was just a bit much. We all use phones that are Exchange Server-compatible, so we wanted to get some of the advantages of Exchange. While Google emulates Exchange pretty well, it doesn't have quite the same calendar sharing and other features that make Exchange great.
Deciding on an Office 365 plan was the toughest decision. The various "P" and "E" plans all offer different features. We initially leaned toward the P1 plan, but after reviewing its limitations, we worried that it might not be enough for us. Keep in mind that you can't ever migrate from a "P" plan to the larger "E" plans, or vice-versa, so if the biggest "P" plan might not suit you forever, then you need to up the ante and go "E."
The fine print was also a bit troubling. The "P" plans, for example, have a hardcoded e-mail limit of 500 recipients per day. Per day. If you've got a dozen users, that's actually not very many outgoing recipients -- just about 40 recipients per person, per day, which is something we felt we could easily exceed. The "E" plans have a much larger limit. This is actually the most troubling thing about Office 365; I understand why Microsoft does it (to help prevent O365 from becoming a spam source), but there has to be a better way to achieve the goal than an across-the-board cap.
1/19 UPDATE: I was contacted by a Microsoft spokesperson today who said this limit has been lifted; the number of recipients is no longer capped at 500.
A final caveat: Because the company's two owners are journalists, we'd both been involved in the O365 beta, as well as in a "P" plan trial. That meant the accounts we'd created for those purposes couldn't be used for our new, permanent account. So instead of getting "concentratedtech.onmicrosoft.com" as our account base name, we had to pick something else. Of course, once we migrated, nobody would see that base name because we'd use our own domain name, but it's something to be aware of: If you do a trial, either use a totally fake name, or use the name you plan to proceed with and stick with it.
Up next: The Migration
Posted by Don Jones on 01/18/2012 at 9:45 AM
I've written about it before, and I'm here to do it again -- but it's still worth a read. However, before I dive in, I want to remind y'all that I'm just the bearer of the news. While I personally think this is a good direction for Microsoft, much of that stems from my own IT background. Based on the comments in previous articles, some of you disagree -- and you should make that opinion heard at Microsoft, where it'll matter. In case there's any confusion, I don't work for the company.
You'll want to go here and read a short article from the Windows Server team. Go on, read that. I'll wait.
The points to take away:
- The full GUI experience on the Windows Server OS is now optional. Software meant to run on a server should not assume a GUI will be there, nor should it take for granted any of the many other dependencies that the full server OS has traditionally included. My analysis on this: It's Microsoft's shot across the bow. You'll see a stronger position on this sometime in the future -- maybe a few years off, but sometime. They want server apps to assume they're running on what we used to call "Server Core."
- The recommended way to run the Server OS is without the GUI. Didja see that? No, you don't have to think it's a good idea -- I'm not pushing acceptance. I'm pointing out what's happening. These are the facts on the ground.
- Microsoft has taken a (what I think is a very good) middle-ground step by introducing a "minimal GUI" mode in the server OS. That means you can have your GUI tools on the Server OS, as well as on your client computer, provided those GUI tools play by a few basic rules and don't assume too many dependencies (like the presence of IE). They'll have the full .NET Framework at their disposal, for example, which should help -- especially if they're tools based on the MMC architecture. So this gets you a "lighter" version of the Windows Server OS, but still lets you manage right on the console. (There's a quick sketch of switching between these modes after these points.)
My opinion, for what it's worth: Anyone who thinks "minimal GUI" mode is anything more than a holding measure is crazy. To me, this clearly says Microsoft is trying to get us off the console for good. They know we're not ready to give it up completely, so this is them trying to wean us off. Maybe I'm wrong on this -- it's happened before -- but it sure seems that way when I look at the big picture.
- Notwithstanding the "minimal GUI" mode, Microsoft is recommending that software developers not assume a GUI will be present. The full, rich GUI experience happens on the client. Not allowed to connect to your servers from your client computer? The suggestion appears to be "rethink your architecture."
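Here's what switching between those modes looks like in the Windows Server "8" beta bits, as far as I can tell. Feature and cmdlet names could easily change before release, so take this as a sketch rather than gospel:

```powershell
# Based on the Windows Server "8" beta; feature names may change before release.

# Drop to "minimal GUI": remove the shell, keep the management infrastructure.
Uninstall-WindowsFeature Server-Gui-Shell -Restart

# Go all the way down to what we used to call Server Core:
Uninstall-WindowsFeature Server-Gui-Shell, Server-Gui-Mgmt-Infra -Restart

# And bring the full GUI back if you decide you need it:
Install-WindowsFeature Server-Gui-Shell, Server-Gui-Mgmt-Infra -Restart
```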
Again, folks, I'm not advocating the news -- I'm just reporting it and offering some interpretation. I'm well aware that many, many, many admins out there aren't looking forward to a GUI-less server operating system from Microsoft. I've heard and read many cogent and thought-out arguments against it. I'm sure Microsoft has too. They're proceeding anyway. We have two choices: adapt or die. I'm sure the NetWare folks felt exactly this way when they saw their first Windows NT 3.51 server. My point is that Microsoft is going in this direction, despite the fact that they surely know there will be disagreement. They're obviously taking baby steps -- take this "Minimal GUI" thing as a clue. We can either be prepared to manage our servers in the brave new world, get different jobs or get promoted out of harm's way.
Don't think Windows Server without a GUI is a good idea? I'm not judging you, and I'm not saying you're wrong. Think Microsoft is stupid for heading in this direction? Time will tell -- we'll wait and see, since that's all we can do. For myself, I'm going to make sure my job is secure by making sure I know how to operate in the world that's coming. It's like taxes: I don't have to like them, I just have to figure out what numbers to write into the little boxes.
(I'll admit that I'm personally biased to like the direction Microsoft is headed on this. I come from an IT background where going into the datacenter was practically verboten; I spent more time making sure I could do everything remotely, without an RDC connection, than actually doing anything. But the environment I came from was a minority back then, and it's probably a minority now. Although I'm glad it prepared me for what appears to be coming.)
My opinion is that Microsoft is pointed toward a world of "headless servers:" Minimal functionality from the console, rich management from a client computer. This is a step in that direction, and it's intended to put us, and software vendors, on notice. Me, I'm going to take the hint. I hope y'all do as well. Windows Server "8" is a chance to start getting on board with what Windows will become -- it's not throwing us directly into the fire, but I think we have to take the opportunity to start adapting to this new direction.
Posted by Don Jones on 01/12/2012 at 3:33 PM
There's a bit of confusion on the topic of Active Directory forest recovery, and I'll admit that I was caught up in the confusion as well. The confusion stems, I think, primarily from third-party software vendors who either have, or do not have, forest recovery tools to sell.
Vendors who don't sell forest recovery tools will tell you that they don't do so because you can only do a forest recovery in conjunction with Microsoft Product Support Services. If you attempt a forest recovery on your own, Microsoft may not support any further issues you have with Active Directory in the future.
Vendors who do sell forest recovery tools will tell you that their tools make a forest recovery faster and easier, and that their tools follow Microsoft's recommendations for completing the process safely.
They're both right.
This is a point I myself was confused about. It turns out that Microsoft is perfectly fine with you using third-party forest recovery tools to speed up the recovery process as you perform it with Microsoft's guidance.
You do have to be on the phone with Microsoft, but you can use tools to speed up the tasks they ask you to perform.
So do you need a forest recovery product?
It depends. I'm told that all of the Fortune 500 own one, and that makes sense to me. Giant companies tend to buy a lot of insurance for a variety of things, and owning a forest recovery product is really a form of insurance. A Fortune 500 company that's missing an AD forest could be losing millions of dollars per hour, and a tool that speeds up the process probably makes perfect financial sense.
On the other hand, a very small company with a single domain and just a handful of domain controllers might not find financial sense in a forest recovery tool. Keep in mind that such a tool will neither prevent a forest failure, nor make recovery instantaneous; there's still a Microsoft-guided process to go through. So for very small companies, the time saved might not outweigh whatever financial loss they could expect to incur during the outage.
Other companies are going to just have to look at the numbers. How much time will an unassisted forest recovery take? How much will a recovery tool speed things up? How much money will you lose either way? If the additional time -- and thus financial impact -- of not having a tool is greater than the cost of such a tool, then you should probably consider buying the tool. Think of it as a specialized form of insurance policy, much like your flood insurance or worker's comp insurance.
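If it helps to see the shape of that comparison, here's a back-of-the-envelope version with entirely made-up numbers:

```powershell
# Entirely made-up numbers, just to show the shape of the comparison.
$outageCostPerHour = 50000    # what an AD forest outage costs the business per hour
$hoursSavedByTool  = 8        # how much faster a tool-assisted recovery might finish
$toolCost          = 25000    # what the forest recovery product costs

$lossAvoided = $outageCostPerHour * $hoursSavedByTool    # 400,000 in this example
if ($lossAvoided -gt $toolCost) { "The tool starts to look like cheap insurance." }
```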
All that said, you should absolutely separate forest recovery -- which is the very definition of "disaster recovery" -- from the day-to-day data restoration that your company also needs. Every company should have a product capable of doing single-item and attribute-level recovery for Active Directory, capabilities not natively offered in any version of Windows (the Win2008R2 "Recycle Bin" feature can't do attribute-level restoration). Such a restoration tool may or may not be part of a change auditing solution, depending on whether or not you need such a thing. Restoration tools exist that include the forest recovery capability, but as you're evaluating such tools, try to consider those capabilities independently. You're not likely to need a forest recovery tool very often; that day-to-day object restoration capability, however, will come up frequently.
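To make that distinction concrete: with the Recycle Bin enabled, the native Win2008R2 tooling will bring a deleted object back whole, along these lines, but it won't roll back a single changed attribute. The name filter here is just illustrative.

```powershell
# Native whole-object restore on Win2008R2 with the AD Recycle Bin enabled;
# the name filter is illustrative. This cannot restore an individual attribute.
Import-Module ActiveDirectory
Get-ADObject -Filter 'isDeleted -eq $true -and Name -like "jsmith*"' -IncludeDeletedObjects |
    Restore-ADObject
```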
Posted by Don Jones on 12/29/2011 at 12:09 AM
Back in the day, everyone took Microsoft's "Network Essentials" certification exam, and many took associated training either in a class, through a book, or in a video of some kind. It covered really, really basic stuff: IP networking, subnetting, the idea of routers and bridges and so forth. Heck, even IPX/SPX was on there.
These days, certification training and exams, and even most non-certification training, skips many of those basics. The theory has been twofold: One, we've got so much more to learn now that products like Windows are so complex that we can't squeeze it all into a class. Two, the higher-level tasks in Windows incorporate these lower-level basics, so there's no need to teach them.
The latter point isn't true. Windows' convenient point-and-click interface doesn't encourage any understanding of ICMP, ARP, or other important low-level protocols. And, by not teaching them, we run the risk of losing a thorough understanding of how the network actually operates -- making troubleshooting a lot harder. I recently taught a class in which not one student had ever run Ping or Tracert from the command-line. None of them had the foggiest idea what ARP was, and therefore had serious problems troubleshooting certain kinds of network problems. They thought they knew what a VLAN was, but couldn't really explain it to each other or to me.
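If you're putting together a basics refresher for your own team, the starting point really is this simple -- all built into Windows, runnable from any command prompt or PowerShell console. The addresses and names below are placeholders:

```powershell
# The kind of fundamentals I mean; addresses and host names are placeholders.
ping 192.168.1.1            # can you reach your default gateway at all?
tracert www.example.com     # where along the path does traffic stop?
arp -a                      # which IP-to-MAC resolutions does this machine hold?
nslookup www.example.com    # is DNS returning the answer you expect?
ipconfig /all               # what IP, mask, gateway and DNS is this box really using?
```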
Are you seeing a similar effect in your IT team? Not amongst your senior staff, but perhaps amongst newcomers or career-changers who are new to IT? Do they really have a solid grasp of these "essentials," or do they get a bit wary when people start talking about subnet masks, DNS caches and MAC address resolution? How important are these essentials to your network, and what portion of your staff do you think should understand them?
I'd really appreciate your feedback on this topic. If you've got a moment, please take a very short survey. I'll share the results in an upcoming post.
PS: If you're wondering why I've been asking so many questions and offering so many surveys this month, it's because we're getting close to the end of the year. That's always a good time to reflect on what we might want to focus on in the year ahead, and so next month I'll be sharing all of the insight you've offered. Seeing what your peers and colleagues think might give you some creative ideas for your own IT team's direction in 2012.
Posted by Don Jones on 12/27/2011 at 12:18 AM
I know "cloud" and "outsource" are dirty words with some IT folks for a variety of reasons. But let's set aside, just for a moment, all of the baggage those two words carry and conduct a quick thought experiment:
We all have things we'd rather not have to do when it comes to IT. For me, it's always been backups. This comes from an unfortunate environment early in my IT career: We lacked a LAN, and PC backup literally meant lugging a Colorado DAT drive around to people's computers at night to run backups one PC at a time. I still have nightmares about it. If you'd told me, back then, that I could somehow outsource that onerous task to some mystical "cloud," I would have been all over it. I had lots of other projects that were more interesting and more value-added that could have more than occupied my time.
Just because something is "mission critical" doesn't mean we necessarily want it in our own datacenter. Take e-mail: Few organizations are professional e-mail hosting companies, yet we all (mostly) host our own e-mail services. We may not enjoy it, but we do, because in most cases we still have to. In most cases, we've all got a long backlog of projects that we'd like to work on, except that we can't get additional headcount these days and the staff we do have are overwhelmed.
It's the "build it vs. buy it" question, to a degree. Some of us won't outsource anything willingly, because we want the control that's offered by doing it ourselves. Nothing wrong with that! Others would prefer to just bring in as much as possible as a service, so that we can focus on the things we have to do ourselves, which aren't available from someone else -- usually business-specific stuff. Nothing wrong with that, either. Most of us wind up somewhere in between: There's a few things we could stand to have someone else manage, but some things we'd never, ever give up control of without a fight.
So indulge me for a second: Assuming that outsourcing one IT service would not endanger any jobs within your organization, that price was no object, and that concerns like security and performance were well-addressed, what one IT service would you gladly get rid of first? What's stopping you from outsourcing that service today?
Drop your comments below – or, if you've got a moment, answer in a one-question survey. I'll consolidate the results and share them in a future post.
Posted by Don Jones on 12/22/2011 at 12:33 AM
In the past few months, I've had the opportunity to speak with three or four dozen customers and to visit a half-dozen of those at their main locations. In every instance, out of growing curiosity, I asked about their help desk ticket-tracking software.
Now, I know most of us don't like the work that a help desk ticket represents. Too often, they're firefighting exercises, and we all have things we're more passionate about than fixing problems. Besides, nobody likes fixing as much as building, because fixing implies a failure that we usually wish hadn't happened in the first place. But a ticketing system's job is to help coordinate activity and keep balls from being dropped, so on paper such systems have value.
I've yet to run across a single organization that likes their ticketing system.
It seems like it's either (a) badly designed software, (b) badly implemented software, or (c) both. I've run across all the major brands, and I've yet to hear a kind word about any of them. This isn't just folks who dislike tickets per se; most of the IT teams I've spoken with completely understand the value a ticketing system should offer -- they just don't feel they're getting it from their system. Complex software that hasn't been properly deployed has been the most common complaint, which suggests there's really room for a competitor to come in and offer something compelling and useful -- which doesn't seem to have happened, yet.
At the same time, I'm seeing a spike in the number of ticketing systems offered for sale. Some of these are integrated with hybrid monitoring solutions, although I'm not seeing a lot of organizational demand for combo monitoring+help desk software, no matter how much sense such a combo might make in theory. I'm actually a bit surprised that Microsoft hasn't jumped into this space, either through development or (more likely) an acquisition in the System Center family. The integration of ticketing software with System Center Operations Manager, System Center Virtual Machine Manager, System Center Configuration Manager and System Center Orchestrator (in particular) seems like an easy argument to make.
What help desk software is your organization using -- and what do you like, or dislike, about it? Would you prefer a help desk system that has tighter integration with your back-end management tools, or does your current software handle what you need it to?
Drop your thoughts in a comment -- or, if you have a moment, answer a short three-question survey about your help desk software. I'll share the results in a future post.
Posted by Don Jones on 12/20/2011 at 11:19 AM
I used to co-own a company that did outsourced certification exam development, and we helped Microsoft on several projects a few years back. Like many of you, I also held the requisite certs: MCSE, MCDBA, MCT and so on. Like some IT professionals, though, I've let certification fall a bit by the wayside. My position doesn't require it, and it wouldn't do much of anything to make me stand out from the crowd.
But certification is obviously still a factor. I know plenty of IT professionals are still taking first-party exams from Microsoft, VMware, Citrix, Cisco and so forth. Training materials focused on certification are still a big deal -- the SQL Server 2008 exam videos I did for CBT Nuggets, for example, still get pretty good viewership.
As an IT decision maker, however, what role does certification play in your life? Do you encourage -- or even demand -- that your IT team maintain the appropriate certifications? What certifications, if any, do you look for on resumes when you're bringing in headcount? Does a certified individual have a better chance of getting hired or promoted than a non-certified person?
I know for a fact there's still considerable concern about the "paper MCSE" effect (or whatever we're calling it now that the MCSE itself is defunct), wherein exam-crammers obtain certifications without really earning them, thus (according to some) diminishing the value of the certification itself. Is that something you've run into in your own organization?
I'd love to get your thoughts in comments below -- but if you have a moment to answer a two-question survey on the value of certification in your organization, go here. I'll share the results in a future post.
Posted by Don Jones on 12/16/2011 at 10:32 AM