Writing ... or Just Practicing?

Random Disconnected Diatribes of a p&p Documentation Engineer

  • Writing ... or Just Practicing?

    Measuring Job Satisfaction

    • 0 Comments

    Listening to the radio one day this week, I heard somebody describe golf as being "a series of tragedies obscured by the occasional miracle". It struck me that maybe what I do every day is very similar. If, as a writer, you measured success as a ratio between the number of words you write and the number that actually get published, you'd probably decide that professional dog walker or wringer-out for a one-armed window cleaner was a far more rewarding employment prospect.

    Not being a golfer myself (see "INAG"), I'd never heard that quote before. However, it is, it seems, quite well known - I found it, and several similar ones, on various golf Web sites, including a couple that made me think about how closely the challenges of golf seem to mirror those of my working life. For example, "Achieving a certain level of success in golf is only important if you can finally enjoy the level you've reached after you've reached it." How do you know when you've reached it? Or can you actually do better next time? Or maybe you should just assume that you're doing the best you can on every project? That seems like a recipe for indolence; surely you can always get better at what you do? But if you keep practicing more and more, will you just end up creating more unused output and reducing your written/published ratio?

    Or how about "Golf is the only sport where your most feared opponent is you"? I find that writing tends to be a one-person activity, where I can concentrate without the distractions of the outside world penetrating the whirling vortex of half-formed thoughts and wild abstractions that are supposed to be elements of a carefully planned and managed process for distilling knowledge and information from the ether and converting it into binary data. I always assumed that professional developers tended to have the same issues, so I have no idea how they can do paired programming. Imagine two writers sat side by side arguing about which words to put where, and if that should be a semi-colon or a comma, while trying to write an article.

    I've always maintained that the stuff I create should, by the time it actually pops up in the Inbox of my editor and reviewers, be complete, readable, as free of spelling errors and bad grammar as possible (unlike the subject of one of my previous posts), and - of course - technically accurate. OK, so you can't always guarantee all of these factors, but making people read and review (and, more to the point, edit) stuff that is half-baked, full of spelling and grammar faults, and generally not in any kind of shape for its intended use just seems to be unprofessional. It also, I guess, tends to decrease the chance of publication and reduce your written/published ratio.

    Ah, you say, but surely your approach isn't agile? Better to throw it together and then gradually refactor the content, modify the unsuccessful sentences, and hone the individual phrases to perfection; whilst continually testing the content through regular reviews, and comparison with reality (unless, I suppose, you are writing a fantasy or science fiction novel). Should "your most feared opponent" be the editor? I'm not sure. When it comes back from review with comments such as "This is rubbish - it doesn't work like that at all" or "Nice try, but it would be better if it described what we're actually building" you probably tend to sense a shift in most-feared-opponentness.

    I suppose I should admit that I once tried writing fiction (on purpose), but every page turned out to be some vaguely familiar combination of the styles of my favorite authors. Even the plot was probably similar to something already published. Thankfully I gave up after one chapter, and abandoned any plans to write the next block-selling best-buster. And I couldn't think of a decent title for it anyway. Written/published ratio zero, and a good reason to stick with my proper job of writing technical guidance for stuff that is real. Or as real as a disk file full of ones and zeros can be.

    And while we're talking about jobs, they have a great advert on one of our local radio stations at the moment. I've never figured out what they're trying to sell, but it does offer the following useful advice: "If you work on the checkout in a hand-grenade shop, it's probably best not to ask customers for their PIN". However, in the end, I suspect that none of the quotes can beat Terry Pratchett's definition of the strains of the authoring process: "Writing is easy. You only need to stare at a piece of blank paper until your forehead bleeds".

  • Writing ... or Just Practicing?

    Am I Done Yet...?

    • 0 Comments

    I've been trying something new and exciting this week. OK, so it's perhaps not as exciting as bungee jumping or white-water rafting, but it's certainly something I've not tried before. I'm experimenting to see if I can use Team Foundation Server (TFS) to monitor and control the documentation work for my current project. As usual, the dev guys are using agile development methods, and they seem to live and die by what TFS tells them, so it must be a good idea. Maybe. But I suppose there's no room in today's fast-moving, high-flying, dynamic, and results-oriented environment for my usual lackadaisical approach of just doing it when it seems to be the best time, and getting it finished before they toss the software out of the door and into the arms of the baying public.

    So, dive into the list of work items for the current iteration and see if I can make some wild guesses at how long the documentation work will take for each one. Ah, here's a nice easy one: fix some obscure bug that hardly anybody was aware of. That's a quarter of an hour to add a note about the fix to the docs. But it seems like I can only enter whole hours, so I suppose I'll have to do it slowly. And here's another no-impact one: refactor the code for a specific area of the product. And these three are all test tasks, so I don't need to document them either. Wow, this is easy. It looks like I'll only have three hours' work to do in the next fortnight. Plenty of time to catch up on the gardening and DIY jobs I've managed to postpone for the last year or three.

    Next one - completely change the way that the configuration system works. Hmmm, that's more difficult. How many places in the 900 pages will that have an impact? And how long will it take to update them all? Oh well, take a wild guess at four days. And the next one is six completely new methods added to a class. That's at least another three days to discover how they work, what they do, and the best way to use them. And write some test code, and then document them. After a few more hours of stabbing in the dark and whistling in the wind, I can add up the total. Twenty-three days. That should be interesting, because the iteration is only two weeks. Looks like I need to write faster...

    Now skip forward to Friday, and go back to TFS to mark up my completions. How do I know if a task is done or not? Will the code change again? Will changes elsewhere impact the new updates to the docs? When will test complete their pass on the code so I can be sure it's actually stable? And do I have to wait for test to review my docs? Or wait for the nice lady who does the English edit to make sure I spelt everything right and didn't include any offending letters (see Oending Letters). I guess I've finished my updates, so I can mark them as "Done". But does that mean I need to add a task for review, test, and edit for my updates? Surely they won't want to work through the doc until it contains all of the updates for that particular section of the product?
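
    In theory, flipping the state is the easy bit; it's deciding when to do it that hurts. For what it's worth, this is roughly the kind of thing I poke at from PowerShell rather than clicking through the UI - just a sketch, assuming the Team Explorer assemblies are installed, and the server URL, work item number, and state name are all invented (state names depend on your process template anyway):

        # Load the TFS object model (installed with Team Explorer)
        [Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | Out-Null
        [Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.WorkItemTracking.Client") | Out-Null

        $tfs   = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer("http://tfsserver:8080")
        $store = $tfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

        # Fetch a task, zero the remaining work, and mark it - er - done?
        $task = $store.GetWorkItem(1234)
        $task.Open()
        $task.Fields["Remaining Work"].Value = 0
        $task.State = "Closed"    # state names depend on the process template
        $task.Save()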

    So this isn't as easy as it may have seemed at the beginning. In fact, I've rambled on in the past about trying to do agile with guidance development (see Tragile Documentation). I can see that I'll be annoying people by asking them to test and edit the same doc several times as I make additional changes during the many upcoming iterations. Perhaps I should just leave them all as "In Progress"? But that will surely mess up the velocity numbers for the iteration. And they'll probably think I went off on vacation for the two weeks. Mind you, the sound of the waterfall in my garden pond, and the ice cream van that always seems to go past during the daily stand-up call, will only tend to reinforce this assumption.

    Still, it will be interesting to see how it all pans out. Or whether I spend more time fighting with my VPN connection and TFS than actually writing stuff...

  • Writing ... or Just Practicing?

    Having A Bad Where? Day

    • 2 Comments

    Isn't it funny how - after a while - you tend not to notice, or simply ignore, the annoying habits of your closest colleagues? As I work from home, some 5,000 miles away from my next closest colleagues, the closest colleague I have is Windows Vista (yes, I do lead a sad and lonely life doing my remote documentation engineering thing). I mean, I've accepted that sometimes when I open a folder in Windows Explorer it will decide to show me a completely different view of the contents from the usual "Details" view I expect. I suppose it's my own fault because I happen to have a few images in there as well as Word documents, and Vista thinks it's being really helpful by telling me how I rated each one rather than stuff I want to know - like the date it was last modified.

    But worst of all is the search feature, or perhaps I should call it an unfeature. In XP, I could select a folder and enter a partial file name, then watch as it wandered through the subtree (which, with my terrible memory of where I put stuff, was often "C:\"). It told me where it was looking, and I knew it was just looking at filenames. If I wanted to search only the contents of files, I could tell it to do that. In Vista, I type something in the search box and get a warning that the folder isn't indexed, then a slow progress bar. I've no idea where it's looking, or what it's looking for. And neither does it, by the look of the results sometimes.

    It seems to decide by itself whether to look inside files (so when I search for some part of a filename I get a ton of hits for files that happen to contain that text), yet it seems incapable of finding the matching files by name. I have to either wait till it's finished or open the Search Tools dialog before I can get at the advanced options to tell it what kind of search I want and whether I want all subfolders to be included. And when I do look for something in the contents of the files, I get either 1,000 hits or none at all. In fact, I've actually resorted to using TextPad to search for strings in text files recently. And after all that, I have to go clicking around the folder tree (while trying to cope with the contents oscillating wildly from side to side as I open each one) to get back to where I was, because it helpfully moved the folder view to the very end of my long and complicated list of folders.
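
    For the record, here's the kind of predictable, tell-me-exactly-what-you're-doing search I actually want. It's just a minimal PowerShell sketch (the partial name and starting folder are made up for illustration), but at least it does what it says:

        # Find files whose names contain the text, starting from a known
        # folder and walking every subfolder - no indexing, no guesswork
        Get-ChildItem -Path C:\Projects -Recurse -Filter "*config*" |
            Select-Object FullName, LastWriteTime

        # And only when I ask for it, search inside file contents instead
        Get-ChildItem -Path C:\Projects -Recurse -Include *.txt,*.xml |
            Select-String -Pattern "connection string" |
            Select-Object Path, LineNumber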

    I can see that the Vista approach may be easier and quicker for simple searches, but I can't help feeling that it often just gets in the way by trying to be too clever and "usable" (something I've grumbled about before - see Easter Bonnets and Adverse Automation). Maybe some of the problem is that I'm continually creating and deleting folders and moving stuff around as I gracefully slither between projects and my other daily tasks. I've tried setting default folder and search options, but I guess Vista can't cope with my indecisiveness. Perhaps I should just keep everything in one folder called "Stuff". But then I'd need a really good search engine...

    Probably a lot of this ranting comes about because of the totally wasted day spent trying to get some updated software to run on my machine. The software in question uses a template and some DLLs that get loaded into Word, some other DLLs that do magic conversion things with the documents, and some PowerShell scripts that drive the whole caboodle. So after the installation, PowerShell refused to have anything to do with my scripts, even though I configured the appropriate execution policy setting. Finally I managed to persuade it to load the scripts, but all it would do was echo back the command line. In the end, I copied the scripts from the new folder into the existing one where the previous version was located, and the scripts ran! How do you figure that? Is there some magic setting for folder permissions that I have yet to discover?
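
    For anyone else fighting the same battle, this is roughly the incantation I was using - a minimal sketch, and the script name is invented. One guess (and it is only a guess) is that the newly installed scripts were still marked as coming from the Internet zone, which the RemoteSigned policy refuses to run:

        # See what the current policy actually is (Restricted by default)
        Get-ExecutionPolicy

        # RemoteSigned runs local scripts, but blocks unsigned downloaded ones
        Set-ExecutionPolicy RemoteSigned

        # Downloaded files carry a Zone.Identifier marker that makes
        # PowerShell treat them as remote; later PowerShell versions can
        # clear it with Unblock-File (in this era it meant right-clicking
        # the file and choosing Unblock in Properties)
        # Unblock-File -Path .\ConvertDocs.ps1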

    And then I had to run installutil to get the script to find some cmdlets in another assembly, and delay-sign a couple of other assemblies that barfed with Vista's security model. After about six hours' work, it looked like it was all sorted - until I went back into Word to discover that the assemblies it requires now produced a load error. In the end, the only working setup I could achieve was by uninstalling and going back to the previous version. And people wonder why I tend to shy away from upgrading stuff...

    At least there is some good news - the latest updates to Hyper-V I installed that morning included a new version of the integration components, and (at least at the moment) I've still got a proper mouse pointer in my virtual XP machine (see Cursory Distractions). So I guess the whole day wasn't wasted after all.

    Footnote: Actually it was - my mouse pointer has just gone back to a little black dot...

  • Writing ... or Just Practicing?

    Woefully Inadequate Kollaboration Implementation

    • 0 Comments

    It's a good thing that Tim Berners-Lee is still alive or he'd probably be turning in his grave. I was hoping to find that my latest exploration of Web-based Interfaces for Kommunicating Ideas would lead me to some Wonderfully Intuitive Kit Intended for sharing knowledge and collecting feedback, but sadly I'm Wistfully Imagining Knowledge Instruments that should have been around today - and aren't. And, yes, I'm talking about wikis.

    As an aside, you've probably seen those word ladder puzzles in the Sunday papers where you have to turn one word into another by adding one letter at a time. Seeing as how I talked about Wii last time, and wiki this week, maybe I can continue the pattern. Any suggestions of a five-letter topic that contains the letters w, i, k, and i are welcome... 

    Anyway, coming back to the original topic, it could all have been so different. Instead of the awful and highly limited format capabilities, and the need to spend an inordinate amount of time creating conversion tools, we could have had a ready-built, comprehensive, easy-to-use, and amazingly less grotty technology than wikis if we hadn't let some guy get in the way some time back in the mid-nineties. Mind you, it's probably not wholly fair to blame it all on Marc Andreessen and Netscape; Microsoft followed the same path and I guess is equally guilty. I suppose the drive for world-wide adoption, the opening of the Web to the unwashed public, and commercial factors in general were the real reasons behind it all.

    You see, when our Tim and his team invented HTML and the associated server-side stuff, the intention was that it would be a collaboration and information sharing mechanism. User agents (what we now call browsers) would fetch content from a server, if the user had read permission, and display it as a document, using markup to indicate the structure and type of the content it contained. Elements such as "p" (paragraph), "strong", "ul" (unordered list), "ol" (ordered list), "dl" (definition list), and the "h(x)" (heading) elements would indicate the type of content they contained, not the way it should be displayed.

    But, more than that, the user agent would allow the user to edit the content and then, providing they had write permission on the originating server, update the original document with their revisions and comments using elements such as "ins" and "del". However, as we've seen, the elements in HTML have come to represent the displayed format rather than the context of the content, and browsers are resolutely read-only these days. Of course, more recent mechanisms such as CSS and XML transformations allow us to move back to the concept of markup indicating context rather than display attributes. But if you want to see what it should have been like, download and install the W3C reference browser Amaya and see how it allows you to edit the pages it displays.

    So, instead we had to invent a new way to do collaboration, and wiki caught on. OK, it's probably fine for quickly knocking up a few pages to allow users to edit, review, and provide feedback. But it's seriously broken compared to doing anything sensible like you can with an XML-based format (such as XHTML). It's a kind of "Web for dummies" approach, where the concept of nesting and formatting content consists of a few weird marker characters that easily get confused with the content - even to the extent that you need to "escape" things like variable names that start with an underscore.

    I guess this railing against technology comes about because I just spent two days building a tool to convert our formatted Word docs (which use our own DocTools kit to generate a variety of outputs) into a suitable format for CodePlex wiki pages. I even had to build another "kludge" tool to add to our growing collection - it's the only way I can find to do the final content tweaks. All I can say is, whoever dreamed up this format never tried to do stuff like this with complicated multi-topic Word source documents...
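
    To give you a flavour of it - and this is just a toy sketch, with the element-to-marker mappings from memory of the CodePlex wiki syntax and the file names invented - the heart of any such conversion boils down to mapping one set of markup onto another:

        # Map a few HTML elements onto CodePlex-style wiki markers; the
        # real tool also has to cope with nesting, tables, images, and
        # code blocks, which is where the pain starts
        $html = Get-Content -Path .\topic.html
        $wiki = $html -replace '<h1>(.*?)</h1>', '! $1' `
                      -replace '<strong>(.*?)</strong>', '*$1*' `
                      -replace '<em>(.*?)</em>', '_$1_'
        Set-Content -Path .\topic.wiki -Value $wiki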

    And, worse still, you have to add each page to the wiki project site individually and then attach the image files to it. OK, so the tool does give you a TOC and text files you can copy, but it sure would be nice to have a way to bulk upload stuff. My current test document set has 59 pages, so I can see I'll be spending a whole day clicking Save, Attach, and Edit.

    But maybe that has some advantages. I'll have less time to spend inflicting the general public with Wild and Incoherent Komplaints and Insults in my blog...

  • Writing ... or Just Practicing?

    I Need a Wii...

    • 0 Comments

    According to Nintendo, the name of their family games console expresses their aim of breaking down the wall that separates video game players from everybody else, and putting people more in touch with their games, and with each other. The two letter "i"s emphasize both the unique controllers and the image of people gathering to play, and the pronunciation "we" emphasizes that this console is for everyone. But I think they only called it this so people in England could make up silly jokes.

    Anyway, having got past the obvious hilarity when my wife told me the other week that she "really, really, really wanted a wii", we've taken the plunge and acquired our first games console. She managed to convince me that it would make us both fit as we while away the evenings playing tennis and ten-pin bowling, contour our upper bodies through regular boxing exercise, and master relaxation by standing on one leg on an electronic wobble-board. I can't say I was totally convinced, but—at my stage of middle age and corresponding spread—anything is worth a try.

    Of course, there's pretty much no way it will connect up with our aging TV that's driven by Media Center, or any more room (or sockets) in the lounge. And it seems we need to replace the TV in the office upstairs because it doesn't work with the aerial in the attic now they've built more houses behind us, and it only gets five channels anyway. Besides, how would I watch the educational programs I enjoy, such as racing anything with wheels or any of the myriad Poirot repeats, while my wife is electronically toning her body and mind?

    Ah, but have you tried to buy a TV stand that's more than three feet high lately? If you're going to be posing on a wobble-board and leaping around playing badminton, you probably want the TV positioned a bit higher than that. Yes, you can spend four hundred pounds (or six hundred dollars) buying a fancy wall-mounted arrangement to hold the TV if, like me, you have a flimsy plasterboard wall to mount it on, but that seems a bit steep; so my wife helpfully suggested I build a nice shelf unit to hold everything. And in a nice wood that matches the rest of the furniture.

    So, after wandering around the Bank Holiday sales at a selection of electronic retailers and DIY stores, we came home with a new flat screen TV package (complete with the incredible assortment of paraphernalia that seems to be standard with these things), a Wii everything, and a truck full of timber and ironmongery. Last time I bought a TV, you only had to plug it into an aerial and an electric socket and it all just worked. This time, it's taken me three days to get to the point where we can watch a DVD, and I've still got to figure out how on earth the twenty or so incomprehensible components of a Wii fit together (I haven't worked out yet how to get the back off the controllers to put the batteries in). And I reckon I got my month's exercise just building the wall frame and assembling everything.

    It seems that nobody has just "a TV" any more. Now you have to have a "Home Cinema System". In fact, the box that it came in was bigger than the TV. Do I really need "five plus one" speakers just to hear Louise Redknapp telling me how to stand on one leg on my wobble-board? OK, so the office already looks like a power station with all the wires my computers and the associated junk require, so a couple of furlongs of extra speaker cable will probably meld in quite well. Maybe I should copy the setup a friend has—two big sofas arranged on platforms like in a cinema with the bass woofer underneath them. Makes you really appreciate films with lots of explosions.

    And it shows how out of touch I am with this brave new world of entertainment technology when I discovered that I needed an optical audio cable (obviously wire is old-fashioned now) to connect the bits together. Worse still, I struggled for ages trying to plug it in until I finally discovered that you have to remove the squidgy clear plastic protector caps from the ends first. Well it didn't say anything about that on the packet, and the instruction books for the rest of the kit just contain vague pictures that might apply to any of a selection of products from the manufacturer's range.

    Still, at least we got there in the end. OK, so I ended up having to repaint the wall and relay the carpet afterwards, but—as my wife likes to point out—jobs I tackle never seem to be as easy as it says on the box. All we need to do now is get the TV hooked up to some receiving hardware on the roof so that it actually works. As the TV has a satellite decoder built in, we might as well go that way, so I talked to somebody who is brave enough to climb a tall ladder and knows where to point the dish.

    It seems, however, that (according to my brave ladder-climbing man) satellite signals are "very sensitive to trees". Ah, I thought, obviously they have to comply with some government environmental directive, or exhibit corporate "green" credentials. In fact, he tells me, it means that we won't get a signal if there are any tall trees nearby. And there was me thinking that the dish pointed up into the sky. I live in the countryside, where there are lots of tall trees, so I'm still waiting with bated breath to see if we're in what he calls a "reception-capable area".

    Mind you, when we turned the TV on the first time, it asked for our full postal code so it could ensure that we received "programs optimized for our viewing area". Maybe there is a satellite up there that just transmits programs suitable for our small patch of Derbyshire. Lots of documentaries about sheep and coal-mining, perhaps...

  • Writing ... or Just Practicing?

    Windows 2008 Hyper-V and Service Pack 2

    • 0 Comments

    A quick note to Hyper-V users. When I installed Windows Server 2008 Service Pack 2, it installed fine with no errors, but after a while I was getting NetBT errors in the Event Log saying there was a duplicate name on the network, and other issues with finding machines on the network.

    Turns out that the service pack had re-enabled all of the network connections in the base O/S running Hyper-V. As two of these need to be disabled (see http://blogs.msdn.com/alexhomer/archive/2009/02/08/Hyper_2D00_Ventilation_2C00_-Act-III.aspx for an explanation), this meant the base O/S had two connections to the internal network, one of which was obtaining a new IP address through DHCP and registering itself in DNS. 

    After disabling these connections again, everything returned to normal. Maybe I should have deleted the connections in the first place instead of just disabling them... any advice from an expert in this area would be very welcome.
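
    In case it helps anyone in the same boat, this is roughly how I check for, and disable, the offending connections from the parent partition - a minimal sketch, assuming PowerShell is available, and the connection name below is an assumption (use whatever your duplicated virtual connection is actually called):

        # List the network connections the parent partition can see, and
        # whether each one is currently enabled
        Get-WmiObject Win32_NetworkAdapter |
            Where-Object { $_.NetConnectionID } |
            Select-Object NetConnectionID, NetEnabled

        # Disable a connection by name (the name here is an assumption)
        $nic = Get-WmiObject Win32_NetworkAdapter -Filter "NetConnectionID='Local Area Connection 2'"
        $nic.Disable()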

    Update: In Windows Server 2008 R2 you can untick the "Allow management operating system to share this network adapter" option in Virtual Network Manager to remove these duplicated connections from the base O/S, so that updates and patches applied in the future do not re-enable them.

  • Writing ... or Just Practicing?

    Currently Overloaded...

    • 1 Comment

    You don't normally expect zonking amounts of current to be flying around inside a computer (unless you've packed it solid with extra disk drives), so tagging a couple of skinny wires to one end of the circuit board is probably an eminently sensible approach. Those five DC volts will eventually find their way along the copper tracks and wander into the odd chip when required, and it's fairly unlikely you'll get flash-over between the connector pins or a nasty smell as several amps of current rumble uncontrolled through the resistors and capacitors.

    In fact, the working bits inside a chip are now so small and close together that we worry about quantum effects (and whether all the dead cats will interfere with calculations). I mean, what if an odd electron decided to go walkabout one day and debit your account at the bank instead of crediting it? If the nice man at Intel happened to sneeze when they were making the chip for your machine, it could mean that all of your spreadsheets are calculating the wrong answer.

    See, this is the thing. We're so focused on digital stuff, miniaturized electronics, and tiny voltages squirming their way through the ever-increasing complexity of modern machines that we tend to forget that real electricity doesn't behave anything like this. It needs big chunky wires, and often involves several real amperes rather than those wimpy milliamp things - as I discovered when I went to see if I could help a friend sort out a problem with his model railway (railroad) layout last week.

    When I was younger, and a practitioner in the art of miniaturized transport modeling, the only way you could get the locos to move was to shove lots of DC current through the track, accompanied (in the later period as electronics began to blossom into the modeling world) by a high frequency pulsed current that zapped the bits of dust and cat hair that might impede the flow of amps into the tiny electric motors. If you put two locos on the same track, they went round together and the speed controller box got hotter and hotter until the reset button popped out. Unless, of course, you'd wired them differently so they went opposite ways. Then the head-on collision usually occurred before the reset button had time to react.

    I remember reading, just at the end of my modeling days, about the new electronics that were coming to the hobby. Digital Command Control (DCC) was the upcoming thing. You just shove 15 volts AC into all of the track all of the time, and the individual control modules in each loco, turnout motor, and accessory allow you to individually control each one. Up to 99 separate channels, and superbly fine speed control as well because the chip can fire pulsed bursts of current into the electric motor, rather than just feeding it a constant voltage. And, from what I've seen in working layouts recently, it really does make everything easier and better. Another first for technology. It even uses wireless now so that the "driver" can wander around getting the best viewpoint without being constrained by (or tripping over) trailing cables.

    However, the trouble with my friend's layout was that the fine control only worked for about 30 seconds before the control box decided all was not well and turned off the power. We initially suspected a short circuit, but my multi-meter could find no sign of one. We tested several different locomotives, with the same result. Finally, we started measuring track voltages and current consumption in multiple places around the layout.

    And, yes, even with DCC and the magic of electronics, model railway locos still draw quite a lot of power. The voltage drop as each one started was quite noticeable, and it soon became clear that the controller was cutting out not when there were too many amps coming out of it, but when it detected that the voltage in the track had dropped below some predefined limit. Yet it took 30 seconds or so for this to occur.

    Finally, after investigating the wiring layout, it became clear why. My friend had made the same assumption that computer system builders do - that you just need to tag on some skinny wires at one end, and the volts will meander through to where they're needed. In fact, this system used skinny wires to carry the track power all the way from the remote power unit to one end of a long and remote siding. So the whole layout (containing a very large four-track main line loop) dragged those volts through several very thin connectors, and through lots of fishplates between track sections that (although soldered together) would move around with changes in humidity and movement of the baseboards and the floor underneath.

    One of these joints was probably not as good as it could have been, and was increasingly resisting the not inconsiderable amount of current flowing through it. My suggestion was to go back to the principles of the electrician: run a couple of bus bars made of thick single-strand copper wire all the way from the power unit, continuing underneath the whole layout, and tag the track and everything else into it at regular intervals.
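
    To put some rough numbers on it (every figure here is an illustrative guess, not a measurement from the actual layout), a quick back-of-the-envelope calculation shows why the thickness matters:

        # Voltage drop is V = I x R, where R depends on the wire's length,
        # cross-section, and the resistivity of copper
        $rho     = 0.0000000172   # resistivity of copper, ohm-metres
        $length  = 10             # metres of feed run (x2 for the return path)
        $thin    = 0.0000002      # 0.2 mm2 "skinny" feed wire, in square metres
        $bus     = 0.0000025      # 2.5 mm2 bus bar
        $current = 3              # amps for a handful of locos

        "Thin feed drop: {0:N2} volts" -f ($current * ($rho * 2 * $length / $thin))
        "Bus bar drop:   {0:N2} volts" -f ($current * ($rho * 2 * $length / $bus))

    With those (assumed) numbers the skinny feed loses around five of your fifteen volts before the current even reaches a dodgy fishplate, while the bus bar loses less than half a volt. Which explains a lot.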

    I suspect that it's not a solution you could apply to a recalcitrant server, though another friend did have lots of problems with a disk array that kept failing when he was driving it off an old 200 Watt power supply, so maybe there's a correlation there. Amazingly, this friend uses speaker cable for his high-end hi-fi stuff that could comfortably carry the output of a small power station. Mind you, at ten pounds (in money) per metre, it's probably a bit expensive for wiring up a model railway.

  • Writing ... or Just Practicing?

    How Much Configuration Do You Need?

    • 0 Comments

    I endured a severe culture shock this week. And that was without meeting new people from countries afar, or travelling to distant lands. And it didn't involve a trip to some foreign eatery (such as our local Indian restaurant or Greek fish 'n' chip shop) either. No, all I did was respond to a change in the company security policy by replacing the existing well-known virus protection software with the new Forefront Client Security application. All I need to do now is work out how to configure it.

    You see, I'm used to virus scanners (and most other software) that provide oodles of configuration options. I mean, the one we just abandoned has about ten pages of check boxes, option buttons, lists, and text boxes where you can play happily for hours messing up the configuration, then click the Reset button to put it all back to where you started. Ah, the many happy hours spent trying to decide whether to scan zip file contents on disk, or just when the files inside them are opened (and the additional hours spent trying to remember where I found that option the last time).

    But, all of a sudden, I've got almost no configuration options. Forefront Client is one of those applications where they could have fitted all of the UI into a window about one inch square. It only has one page plus a link to open the Help file. And I thought Windows Defender was extravagant with screen real estate. OK, so there is a link you can open to look at all the nasty malware that it captured, and one that has a few options to specify the type of scan to perform and the frequency. They even included some options to specify files, locations, and processes you don't want to scan. But you can't help thinking that it all looks a bit sparse, like they have only built half of it so far. I mean, when you flip open one of the tab-bar things, all you get is one line of text telling you that the feature is turned on or off, or is up to date, and no buttons to do anything about it.

    And then, after you install it and run it for a while, you discover that it has added a new Event Log named Operations Manager with a size of about 15 MB, and is proceeding to fill it up with error messages saying that it can't find a management server to connect to (although that may be because I installed our "corporate" version). Obviously the name of the log gives the game away - it's meant to be administered remotely from some Windows Server System management console. Probably that's why the interface is so sparse and lacking in stuff to play with. No problem, I thought, I'll just install the Forefront Server Security Management Console (FSSMC) so there's something for my local machines that aren't joined to the corporate big iron to talk to.
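
    If you want to see just how chatty it is without wading through the Event Viewer UI, a couple of lines of PowerShell will do it (the log name is what appeared on my machine; yours may differ):

        # Check the size and entry count of the new log
        Get-EventLog -List | Where-Object { $_.Log -like "Operations Manager*" }

        # Peek at the most recent complaints about the missing management server
        Get-EventLog -LogName "Operations Manager" -Newest 10 |
            Select-Object TimeGenerated, EntryType, Source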

    Maybe you'll remember (if you have a habit of wasting your time reading my weekly disconnected ramblings) that I recently went through the Hyper-Ventilation experience and upgraded all my server infrastructure to Windows Server 2008 and Hyper-V hosted machine instances. But the FSSMC will only install (at least in its current incarnation) on Windows 2003 32-bit systems. OK, so I've got such a box running ISA 2006, but I'm not convinced that's a prime location for an admin tool that manages internal network security. Especially as you have to allow DCOM through all your internal firewalls. I did find a link to a page named "Forefront Server Security Products Next Generation", and - since I'm working on Enterprise Library at the moment - it seemed for a while that 2009 might be my Star Trek year. But no such luck; the next generation products don't yet include a Windows 2008-compatible 64-bit version.

    Mind you, there is the Client Security Enterprise Manager, but the reams of installation instructions frightened me off that - at least for the time being. It's probably overkill for managing six machines anyway. And it looks like it all hooks into System Center Operations Manager in a big way, so I reckon that's a "wait and see" job. I'd love to have all that working, but I can't face the effort at the moment. Maybe after I've managed to get a life, and there is some spare left over.

    Meanwhile, at least it seems to be doing stuff. The Event Log says it did a scan when it should, and that it is happily downloading and installing the new definitions every day from my WSUS server. Interestingly, on Vista, I still have Windows Defender running as well. I removed it from the XP boxes, but as it's part of the O/S in Vista I didn't know whether to remove it (or how to), and the helpful support guy I spoke to said I should just leave it running alongside. I suppose, when I do pick up some malware infection, they'll have a fight over who gets to quarantine it.

    And what's the best part of all? The Forefront Client UI is plastered all over with the word "Antimalware" which, when you glance at it, always seems to read "Animalware". Every time I decide to check my security status, I end up with visions of horse blankets, fur coats for dogs, and those photos you see on the Web where people dress up their family moggy in some ridiculous outfit.
