Writing ... or Just Practicing?

Random Disconnected Diatribes of a p&p Documentation Engineer

  • Writing ... or Just Practicing?

    The Shadow in the Machine

    • 0 Comments

There's some ethereal guy called "system" wandering around inside my servers stealing stuff. It's a bit like when you were a kid and your parents hid things from you. When my hamster died, my Dad told me it had gone to live on a farm. Of course, when I got a bit older and my Grandmother passed away I realized he was telling fibs because she suffered from hay fever and was afraid of cows, so there's no way she would go and live on a farm. Yet, even though I've now reached the age where people generally feel they can tell me the truth (often, worryingly, to my face), I discover that Windows Server 2008 is still hiding stuff from me.

I suppose it's all related to the poor decisions I made when ordering my servers. Ever since I set them up with Hyper-V, and virtualized all the machines I find I need for my diminutive network here at chez Derbyshire, I've been struggling for disk space. It seems that 300GB is just an aperitif when you get serious about virtualization. OK, so the Server 2008 docs do say you need a minimum of 40GB for a standard installation, but I made the VMs only 30GB. My Windows 2003 Server VM that runs ISA is 30GB and has 22GB free. Though the Windows 2008 VMs that don't have very much at all installed are both showing only 8GB free of 30GB, so maybe they were right...

Anyway, although my math skills may have waned since leaving school, I managed to calculate that I could run four 30GB VMs on a 150GB disk (yes, I know you're supposed to put them on separate disks, but my network loading is somewhat less than heavy - none of the machines goes above about 3% CPU utilization). Yet I could never get all of them onto the disk. OK, so Hyper-V does use some extra space for each VM when it's running (about 2GB for a 30GB VM), but I should still have space for four of them. In fact, as one of the VMs is a tightly locked down copy of Windows XP used for browsing and troubleshooting while I'm pretending to be a system administrator, and it's only 10GB, I should have space left to swing several cats round simultaneously. But I could only ever fit the three 30GB VMs onto the disk.

I did try reducing the size of the VM with the 20+GB of free space using the Hyper-V tools, but (as they say in several blog posts I found) it's not a trivial exercise. You can convert the VM to a dynamic disk and compact it (it went down to 5.6GB), but when you convert it back to a fixed size disk there is no option to specify the size because it automatically grows to the partition size specified in its boot sector. You need to edit the partition size to reduce the physical disk size, and I didn't fancy playing around with that on a Sunday afternoon. Please, Hyper-V guys, can we have a tool to do this (and better docs that explain why you are wasting good gardening time playing with the existing tools)?

    So I've put off dealing with this issue for the last few months since setting everything up, but now that we are suddenly experiencing tropical conditions here in Little Olde England I decided I needed to find a way to get this sorted so I could shut down the "spare" server and reduce the searing temperatures in my server cabinet (see last week's ramblings for details). So out comes the calculator: three times 32 (the three VMs on the disk) equals 96. Check the disk properties and it says 133GB used, 14GB free. So where did all the spare disk space go? Maybe it's got some lost clusters, so I schedule a disk check and reboot. After restarting, look in the bootlog.txt file and - lo and behold - around 40GB is described as "in use by the system". What on earth for? Is it hiding secret documents from me? Does it need some spare disk space for playing Mahjong when nobody is watching? Is it full of dead hamsters that never made it to the farm?
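For anyone checking my arithmetic, the back-of-the-envelope sums come out like this (a trivial sketch in Python - the numbers are just the ones quoted above):

```python
# Sizes in GB, as reported by Explorer and the disk properties dialog.
vm_files = [32, 32, 32]        # three fixed-size VHDs, ~32GB each on disk
visible_used = sum(vm_files)   # what adding up the folder sizes gives: 96
reported_used = 133            # what the disk properties dialog claims

# The gap is the mysterious space "in use by the system".
unaccounted = reported_used - visible_used
print(unaccounted)             # 37 - close to the ~40GB the disk check reported
```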

So I did the usual: check the properties of each folder and add the total sizes together. 96GB. Then turn on "view operating system files" and do the same. Still 96GB. See what I mean? Most things made of metal expand when they get hot, so my disk drives should be getting bigger, not smaller. I even considered looking underneath to see if there was a pool of congealed clusters that had leaked out of the bottom (OK, so not really). But then - "Aha!" - I remember seeing the occasional error message in Event Log about something to do with "Not sufficient disk space to create shadow copies". One of those messages that I've conveniently been ignoring.

So after furkling through the properties of the disk, I find that Windows has allocated 41GB to shadow copies. I suppose the fact that you can see this in the Shadow Copies tab of the Properties dialog means that it's not technically "hidden", but where is the file? You can't see it in Windows Explorer, even with "show operating system files" and "show hidden files" turned on. And how do you stop it happening? After reading some online docs and blog posts, it became clear that the shadow copies are there because the disk has a share set up, and they allow connected users to get at previously deleted or updated data that was on the disk. I have the disk with the VMs on it shared at admin level to be able to do backups, so I can't really just turn off sharing. And according to the Shadow Copies dialog, they are disabled on the disk anyway.

I had a go with the vssadmin command line tool that is part of Server 2008, but that said it couldn't find any shadow copies (that system guy obviously hides stuff from Windows as well). It seems that vssadmin can only delete shadow copies you create manually. And to make it worse, the more I tried enabling and disabling shadow copies, the larger the shadow copy got. After ten minutes it had grown to 55GB! In the end, more by luck than any administrative capability on my part, I found that by clicking the Schedule button and deleting the two existing scheduled shadow copy tasks, and setting the size to 300MB (the minimum you can specify), the shadow copies magically just disappeared. Suddenly I've got tons of spare disk space on all of my drives!
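For the record, Server 2008 can do the same thing from the command line - though given that vssadmin couldn't even see my shadow copies, I can't promise it behaves any better than the GUI did (the drive letter here is obviously whatever yours happens to be):

```
rem See how much space the shadow copies are actually eating
vssadmin list shadowstorage /for=D:

rem Cap the shadow storage (320MB is around the minimum vssadmin will accept)
vssadmin resize shadowstorage /for=D: /on=D: /maxsize=320MB

rem Or just delete the existing shadow copies outright
vssadmin delete shadows /for=D: /all
```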

Of course, now I can't sleep at night worrying that shadow copies aren't occurring for my shares, but seeing as how: a) I didn't know they were there before, b) I've never had reason to use them, and c) I can't see why I'd need to get a previous copy of a VM when they are all exported and backed up in multiple places regularly (and don't actually change that much anyway), maybe I'm just being as paranoid as usual.

    I suppose I'll find out one day when the sky does fall in, and I can't get to the Internet to update my blog. You'll probably be able to tell when this happens because the post will suddenly end in mid


    It Ain't Half Hot Mum

    • 0 Comments

    OK, OK, so one month I'm complaining that our little green paradise island seems to have drifted north into the Arctic, and now I'm grumbling about the heat. Obviously global warming is more than just a fad, as we've been subjected here in England to temperatures hovering around 90 degrees in real money for the last week or so. Other than the gruesome sight of pale-skinned Englishmen in shorts (me included), it's having some rather dramatic effects on my technology installations. I'm becoming seriously concerned that my hard disks will turn into floppy ones, and my batteries will just chicken out in the heat.

    Oh dear, bad puns and I only just got going. But it does seem like the newer the technology, the less capable it is of operating in temperatures that many places in the world would call normal. There's plenty of countries that get regular spells of weather well into the 90's, as I discovered when we went to a wedding in Cyprus a few years back. How on earth do they cope? I've got extra fans running 24/7 in the computer cabinet and in the office trying to keep everything going. I'm probably using 95% of my not inconsiderable weekly electricity consumption keeping kit that only uses 1% of it to actually do stuff from evaporating (the other 4% is the TV, obviously).

    Maybe the trouble is that, here in England where we have a "temperate" climate, we're not really up to speed with modern technology such as air conditioning. Yes, they seem to put it in every car these days, but I only know one person who has it in their house, and that's in the conservatory where - on a hot day - it battles vainly to get the temperature below 80 degrees. I briefly considered running an extension lead out to my car and sitting in it to work, but that doesn't help with the servers and power supplies.

I've already had to shut down the NAS because it's sending me an email every five minutes saying it's getting a bit too warm. And I've shut down the backup domain controller to try and cut down the heat generation (though it's supposed to be one of those environmentally friendly boxes that will run on just a sniff of electricity). And the battery in the UPS in the office did its usual summer trick of bulging out at the sides, and throwing in the towel as soon as I powered up a desktop and a couple of decent sized monitors. It's no wonder UPSs are so cheap to buy. They're like razors (or inkjet printers) - you end up spending ten times more than a new one would have cost on replacement batteries. And that's even though I cut a hole in the side and nailed a large fan onto it.

Probably I'm going to have to bite the bullet and buy a couple of those portable air conditioning units so my high-tech kit can stay cool while we all melt here in the sun. In fact, my wife reckons I've caught swine 'flu because she finds me sitting here at the keyboard sweating like a pig when she sails in from her nice cool workplace in the evening. At least the heat has killed most things in the garden (including the lawn) so that's one job I've escaped from.

    By the way, in case you didn't realize, the title this week comes from a rather old BBC TV program. Any similarity between the actor who played Gunner 'Lofty' and this author is vigorously denied.


    Measuring Job Satisfaction

    • 0 Comments

    Listening to the radio one day this week, I heard somebody describe golf as being "a series of tragedies obscured by the occasional miracle". It struck me that maybe what I do every day is very similar. If, as a writer, you measured success as a ratio between the number of words you write and the number that actually get published, you'd probably decide that professional dog walker or wringer-out for a one-armed window cleaner was a far more rewarding employment prospect.

Not being a golfer myself (see "INAG"), I'd never heard that quote before. However, it is, it seems, quite well known - I found it, and several similar ones, on various golf Web sites. Including a couple that made me think about how closely the challenges of golf seem to mirror those of my working life. For example, "Achieving a certain level of success in golf is only important if you can finally enjoy the level you've reached after you've reached it." How do you know when you've reached it? Or can you actually do better next time? Or maybe you should just assume that you're doing the best you can on every project? That seems like a recipe for indolence; surely you can always get better at what you do? But if you keep practicing more and more, will you just end up creating more unused output and reduce your written/published ratio?

Or how about "Golf is the only sport where your most feared opponent is you"? I find that writing tends to be a one-person activity, where I can concentrate without the distractions of the outside world penetrating the whirling vortex of half-formed thoughts and wild abstractions that are supposed to be elements of a carefully planned and managed process for distilling knowledge and information from the ether and converting it into binary data. I always assumed that professional developers tended to have the same issues, so I have no idea how they can do pair programming. Imagine two writers sat side by side arguing about which words to put where, and whether that should be a semi-colon or a comma, while trying to write an article.

    I've always maintained that the stuff I create should, by the time it actually pops up in the Inbox of my editor and reviewers, be complete, readable, as free of spelling errors and bad grammar as possible (unlike the subject of one of my previous posts), and - of course - technically accurate. OK, so you can't always guarantee all of these factors, but making people read and review (and, more to the point, edit) stuff that is half-baked, full of spelling and grammar faults, and generally not in any kind of shape for its intended use just seems to be unprofessional. It also, I guess, tends to decrease the chance of publication and reduce your written/published ratio.

    Ah, you say, but surely your approach isn't agile? Better to throw it together and then gradually refactor the content, modify the unsuccessful sentences, and hone the individual phrases to perfection; whilst continually testing the content through regular reviews, and comparison with reality (unless, I suppose, you are writing a fantasy or science fiction novel). Should "your most feared opponent" be the editor? I'm not sure. When it comes back from review with comments such as "This is rubbish - it doesn't work like that at all" or "Nice try, but it would be better if it described what we're actually building" you probably tend to sense a shift in most-feared-opponentness.

    I suppose I should admit that I once tried writing fiction (on purpose), but every page turned out to be some vaguely familiar combination of the styles of my favorite authors. Even the plot was probably similar to something already published. Thankfully I gave up after one chapter, and abandoned any plans to write the next block-selling best-buster. And I couldn't think of a decent title for it anyway. Written/published ratio zero, and a good reason to stick with my proper job of writing technical guidance for stuff that is real. Or as real as a disk file full of ones and zeros can be.

    And while we're talking about jobs, they have a great advert on one of our local radio stations at the moment. I've never figured out what they're trying to sell, but it does offer the following useful advice: "If you work on the checkout in a hand-grenade shop, it's probably best not to ask customers for their PIN". However, in the end, I suspect that none of the quotes can beat Terry Pratchett's definition of the strains of the authoring process: "Writing is easy. You only need to stare at a piece of blank paper until your forehead bleeds".


    Am I Done Yet...?

    • 0 Comments

    I've been trying something new and exciting this week. OK, so it's perhaps not as exciting as bungee jumping or white-water rafting, but it's certainly something I've not tried before. I'm experimenting to see if I can use Team Foundation Server (TFS) to monitor and control the documentation work for my current project. As usual, the dev guys are using agile development methods, and they seem to live and die by what TFS tells them, so it must be a good idea. Maybe. But I suppose there's no room in today's fast-moving, high-flying, dynamic, and results-oriented environment for my usual lackadaisical approach of just doing it when it seems to be the best time, and getting it finished before they toss the software out of the door and into the arms of the baying public.

    So, dive into the list of work items for the current iteration and see if I can make some wild guesses at how long the documentation work will take for each one. Ah, here's a nice easy one: fix some obscure bug that hardly anybody was aware of. That's a quarter of an hour to add a note about the fix to the docs. But it seems like I can only enter whole hours, so I suppose I'll have to do it slowly. And here's another no-impact one: refactor the code for a specific area of the product. And these three are all test tasks, so I don't need to document them either. Wow, this is easy. It looks like I'll only have three hours work to do in the next fortnight. Plenty of time to catch up on the gardening and DIY jobs I've managed to postpone for the last year or three.

Next one - completely change the way that the configuration system works. Hmmm, that's more difficult. How many places in the 900 pages will that have an impact? And how long will it take to update them all? Oh well, take a wild guess at four days. And the next one is six completely new methods added to a class. That's at least another three days to discover how they work, what they do, and the best way to use them. And write some test code, and then document them. After a few more hours of stabbing in the dark and whistling in the wind, I can add up the total. Twenty-three days. That should be interesting, because the iteration is only two weeks. Looks like I need to write faster...

    Now skip forward to Friday, and go back to TFS to mark up my completions. How do I know if a task is done or not? Will the code change again? Will changes elsewhere impact the new updates to the docs? When will test complete their pass on the code so I can be sure it's actually stable? And do I have to wait for test to review my docs? Or wait for the nice lady who does the English edit to make sure I spelt everything right and didn't include any offending letters (see Oending Letters). I guess I've finished my updates, so I can mark them as "Done". But does that mean I need to add a task for review, test, and edit for my updates? Surely they won't want to work through the doc until it contains all of the updates for that particular section of the product?

    So this isn't as easy as it may have seemed at the beginning. In fact, I've rambled on in the past about trying to do agile with guidance development (see Tragile Documentation). I can see that I'll be annoying people by asking them to test and edit the same doc several times as I make additional changes during the many upcoming iterations. Perhaps I should just leave them all as "In Progress"? But that will surely mess up the velocity numbers for the iteration. And they'll probably think I went off on vacation for the two weeks. Not that the sound of the waterfall in my garden pond and the ice cream van that always seems to go past during the daily stand-up call won't tend to reinforce this assumption.

    Still, it will be interesting to see how it all pans out. Or whether I spend more time fighting with my VPN connection and TFS than actually writing stuff...


    Having A Bad Where? Day

    • 2 Comments

Isn't it funny how - after a while - you tend not to notice, or you ignore, the annoying habits of your closest colleagues? As I work from home, some 5,000 miles away from my next closest colleagues, the closest colleague I have is Microsoft Vista (yes, I do lead a sad and lonely life doing my remote documentation engineering thing). I mean, I've accepted that sometimes when I open a folder in Windows Explorer it will decide to show me a completely different view of the contents from the usual "Details" view I expect. I suppose it's my own fault because I happen to have a few images in there as well as Word documents, and Vista thinks it's being really helpful by telling me how I rated each one rather than stuff I want to know - like the date it was last modified.

But worst of all is the search feature, or perhaps I should call it an unfeature. In XP, I could select a folder and enter a partial file name then watch as it wandered through the subtree (which, with my terrible memory of where I put stuff, was often "C:\"). It told me where it was looking, and I knew it was just looking at filenames. If I only wanted to search the contents of files, I could tell it to do that. In Vista, I type something in the search box and get a warning that the folder isn't indexed, then a slow progress bar. I've no idea where it's looking, or what it's looking for. And neither does it, by the look of the results sometimes.

It seems to decide by itself whether to look inside files (so when I search for some part of a filename I get a ton of hits for files that happen to contain that text), yet it seems incapable of finding the matching files by name. I have to either wait till it's finished or open the Search Tools dialog before I can get at the advanced options to tell it what kind of search I want and whether I want all subfolders to be included. And when I do look for something in the contents of the files, I get either 1,000 hits or none at all. In fact, I've actually resorted to using TextPad to search for strings in text files recently. And after all that, I have to go clicking around the folder tree (while trying to cope with the contents oscillating wildly from side to side as I open each one) to get back to where I was, because it helpfully moved the folder view to the very end of my long and complicated list of folders.

    I can see that the Vista approach may be easier and quicker for simple searches, but I can't help feeling that it often just gets in the way by trying to be too clever and "usable" (something I've grumbled about before - see Easter Bonnets and Adverse Automation). Maybe some of the problem is that I'm continually creating and deleting folders and moving stuff around as I gracefully slither between projects and my other daily tasks. I've tried setting default folder and search options, but I guess Vista can't cope with my indecisiveness. Perhaps I should just keep everything in one folder called "Stuff". But then I'd need a really good search engine...

Probably a lot of this ranting comes about because of the totally wasted day spent trying to get some updated software to run on my machine. The software in question uses a template and some DLLs that get loaded into Word, some other DLLs that do magic conversion things with the documents, and some PowerShell scripts that drive the whole caboodle. So after the installation, PowerShell refused to have anything to do with my scripts, even though I configured the appropriate execution policy setting. Finally I managed to persuade it to load the scripts, but all it would do was echo back the command line. In the end, I copied the scripts from the new folder into the existing one where the previous version was located, and the scripts ran! How do you figure that? Is there some magic setting for folder permissions that I have yet to discover?
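For anyone fighting the same battle, the first things worth checking are the execution policy and whether Vista has "blocked" the newly installed files (right-clicking a file and choosing Properties shows an Unblock button if it's flagged as having come from another computer). The script name below is just a made-up placeholder for whatever you're trying to run:

```powershell
# Check the current policy and, from an elevated prompt, relax it
Get-ExecutionPolicy
Set-ExecutionPolicy RemoteSigned

# PowerShell won't run a script from the current folder by its bare name -
# you need an explicit path
.\ConvertDocs.ps1
```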

And then I had to run installutil to get the script to find some cmdlets in another assembly, and delay sign a couple of other assemblies that barfed with Vista's security model. After about six hours' work, it looked like it was all sorted - until I went back into Word to discover that the assemblies it requires now produced a load error. In the end, the only working setup I could achieve was by uninstalling and going back to the previous version. And people wonder why I tend to shy away from upgrading stuff...

    At least there is some good news - the latest updates to Hyper-V I installed that morning included a new version of the integration components, and (at least at the moment) I've still got a proper mouse pointer in my virtual XP machine (see Cursory Distractions). So I guess the whole day wasn't wasted after all.

    Footnote: Actually it was - my mouse pointer has just gone back to a little black dot...


    Woefully Inadequate Kollaboration Implementation

    • 0 Comments

    It's a good thing that Tim Berners-Lee is still alive or he'd probably be turning in his grave. I was hoping to find that my latest exploration of Web-based Interfaces for Kommunicating Ideas would lead me to some Wonderfully Intuitive Kit Intended for sharing knowledge and collecting feedback, but sadly I'm Wistfully Imagining Knowledge Instruments that should have been around today - and aren't. And, yes, I'm talking about wikis.

    As an aside, you've probably seen those word ladder puzzles in the Sunday papers where you have to turn one word into another by adding one letter at a time. Seeing as how I talked about Wii last time, and wiki this week, maybe I can continue the pattern. Any suggestions of a five-letter topic that contains the letters w, i, k, and i are welcome... 

    Anyway, coming back to the original topic, it could all have been so different. Instead of the awful and highly limited format capabilities, and the need to spend an inordinate amount of time creating conversion tools, we could have had a ready-built, comprehensive, easy-to-use, and amazingly less grotty technology than wikis if we hadn't let some guy get in the way some time back in the mid nineties. Mind you, it's probably not wholly fair to blame it all on Marc Andreessen and Netscape; Microsoft followed the same path and I guess are equally guilty. I suppose the drive for world-wide adoption, the opening of the Web to the unwashed public, and commercial factors in general were the real reason behind it all.

    You see, when our Tim and his team invented HTML and the associated server-side stuff, the intention was that it would be a collaboration and information sharing mechanism. User agents (what we now call browsers) would fetch content from a server if the user had read permission and display it in a documentation format using markup to indicate the structure and type of content it contained. Elements such as "p" (paragraph), "strong", "ul" (unordered list), "ol" (ordered list), "dl" (definition list), and the "h(x)" (heading) elements would indicate the type of content contained, not the way it should be displayed.

    But, more than that, the user agent would allow the user to edit the content and then, providing they had write permission on the originating server, update the original document with their revisions and comments using elements such as "ins" and "del". However, as we've seen, the elements in HTML have come to represent the displayed format rather than the context of the content, and browsers are resolutely read-only these days. Of course, more recent mechanisms such as CSS and XML transformations allow us to move back to the concept of markup indicating context rather than display attributes. But if you want to see what it should have been like, download and install the W3C reference browser Amaya and see how it allows you to edit the pages it displays.
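To see the difference in approach, here's the sort of semantic, revision-carrying markup that original design had in mind (a made-up fragment, obviously - the elements say what the content is and what changed, not how it should look):

```html
<p>The release is scheduled for <del>June</del> <ins>July</ins>,
   as the team <strong>unanimously</strong> agreed.</p>
<ul>
  <li>The markup describes what the content is;</li>
  <li>the user agent decides how to render it.</li>
</ul>
```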

So, instead we had to invent a new way to do collaboration, and wiki caught on. OK, it's probably fine for quickly knocking up a few pages to allow users to edit, review, and provide feedback. But it's seriously broken compared to doing anything sensible like you can with an XML-based format (such as XHTML). It's a kind of "Web for dummies" approach, where the concept of nesting and formatting content consists of a few weird marker characters that easily get confused with the content - even to the extent that you need to "escape" things like variable names that start with an underscore.

    I guess this railing against technology comes about because I just spent two days building a tool to convert our formatted Word docs (which use our own DocTools kit to generate a variety of outputs) into a suitable format for Codeplex wiki pages. I even had to build another "kludge" tool to add to our growing collection - it's the only way I can find to do the final content tweaks. All I can say is, whoever dreamed up this format never tried to do stuff like this with complicated multi-topic Word source documents...

    And, worse still, you have to add each page to the wiki project site individually and then attach the image files to it. OK, so the tool does give you a TOC and text files you can copy, but it sure would be nice to have a way to bulk upload stuff. My current test document set has 59 pages, so I can see I'll be spending a whole day clicking Save, Attach, and Edit.

But maybe that has some advantages. I'll have less time to spend inflicting Wild and Incoherent Komplaints and Insults on the general public in my blog...


    I Need a Wii...

    • 0 Comments

According to Nintendo, the name of their family games console expresses their intention to break down the wall that separates video game players from everybody else, put people more in touch with their games, and with each other. The two letter "i"s emphasize both the unique controllers and the image of people gathering to play, and the pronunciation "we" emphasizes that this console is for everyone. But I think they only called it this so people in England could make up silly jokes.

    Anyway, having got past the obvious hilarity when my wife told me the other week that she "really, really, really wanted a wii", we've taken the plunge and acquired our first games console. She managed to convince me that it would make us both fit as we while away the evenings playing tennis and ten-pin bowling, contour our upper-body through regular boxing exercise, and master relaxation by standing on one leg on an electronic wobble-board. I can't say I was totally convinced, but—at my stage of middle age and corresponding spread—anything is worth a try.

    Of course, there's pretty much no way it will connect up with our aging TV that's driven by Media Center, or any more room (or sockets) in the lounge. And it seems we need to replace the TV in the office upstairs because it doesn't work with the aerial in the attic now they've built more houses behind us, and it only gets five channels anyway. Besides, how would I watch the educational programs I enjoy, such as racing anything with wheels or any of the myriad Poirot repeats, while my wife is electronically toning her body and mind?

    Ah, but have you tried to buy a stand for a TV that's more than three feet high lately? If you're going to be posing on a wobble-board and leaping around playing badminton, you probably want the TV positioned a bit higher than that. Yes, you can spend four hundred pounds (or six hundred dollars) buying a fancy wall mounted arrangement to hold the TV if, like me, you have a flimsy plasterboard wall to mount it on, but that seems a bit steep; so my wife helpfully suggested I build a nice shelf unit to hold everything. And in a nice wood that matches the rest of the furniture.

So, after wandering around the Bank Holiday sales at a selection of electronic retailers and DIY stores, we came home with a new flat screen TV package (complete with the incredible assortment of paraphernalia that seems to be standard with these things), a Wii everything, and a truck full of timber and ironmongery. Last time I bought a TV, you only had to plug it into an aerial and an electric socket and it all just worked. This time, it's taken me three days to get to the point where we can watch a DVD, and I've still got to figure out how on earth the twenty or so incomprehensible components of a Wii fit together (I haven't worked out yet how to get the back off the controllers to put the batteries in). And I reckon I got my month's exercise just building the wall frame and assembling everything.

    It seems that nobody has just "a TV" any more. Now you have to have a "Home Cinema System". In fact, the box that it came in was bigger than the TV. Do I really need "five plus one" speakers just to hear Louise Redknapp telling me how to stand on one leg on my wobble-board? OK, so the office already looks like a power station with all the wires my computers and the associated junk require, so a couple of furlongs of extra speaker cable will probably meld in quite well. Maybe I should copy the setup a friend has—two big sofas arranged on platforms like in a cinema with the bass woofer underneath them. Makes you really appreciate films with lots of explosions.

    And it shows how out of touch I am with this brave new world of entertainment technology when I discovered that I needed an optical audio cable (obviously wire is old-fashioned now) to connect the bits together. Worse still, I struggled for ages trying to plug it in until I finally discovered that you have to remove the squidgy clear plastic protector caps from the ends first. Well it didn't say anything about that on the packet, and the instruction books for the rest of the kit just contain vague pictures that might apply to any of a selection of products from the manufacturer's range.

Still, at least we got there in the end. OK, so I ended up having to repaint the wall and relay the carpet afterwards, but—as my wife likes to point out—jobs I tackle never seem to be as easy as it says on the box. All we need to do now is get the TV hooked up to some receiving hardware on the roof so that it actually works. As the TV has a satellite decoder built in, we might as well go that way, so I talked to somebody who is brave enough to climb a tall ladder and knows where to point the dish.

It seems, however, that (according to my brave ladder-climbing man) satellite signals are "very sensitive to trees". Ah, I thought, obviously they have to comply with some government environmental directive, or exhibit corporate "green" credentials. In fact, he tells me, it means that we won't get a signal if there are any tall trees nearby. And there was me thinking that the dish pointed up into the sky. I live in the countryside, where there are lots of tall trees, so I'm still waiting with bated breath to see if we're in what he calls a "reception-capable area".

    Mind you, when we turned the TV on the first time, it asked for our full postal code so it could ensure that we received "programs optimized for our viewing area". Maybe there is a satellite up there that just transmits programs suitable for our small patch of Derbyshire. Lots of documentaries about sheep and coal-mining, perhaps...


    Windows 2008 Hyper-V and Service Pack 2

    • 0 Comments

    A quick note to Hyper-V users. When I installed Windows Server 2008 Service Pack 2, it installed fine with no errors, but after a while I was getting NetBT errors in Event Log saying there was a duplicate name on the network, and other issues finding machines on the network.

    Turns out that the service pack had re-enabled all of the network connections in the base O/S running Hyper-V. As two of these need to be disabled (see http://blogs.msdn.com/alexhomer/archive/2009/02/08/Hyper_2D00_Ventilation_2C00_-Act-III.aspx for an explanation), this meant the base O/S had two connections to the internal network, one of which was obtaining a new IP address through DHCP and registering itself in DNS. 

    After disabling these connections again, everything returned to normal. Maybe I should have deleted the connections in the first place instead of just disabling them... any advice from an expert in this area would be very welcome.
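In the meantime, the workaround at least scripts easily, so it can be reapplied after the next update re-enables everything (the connection names below are placeholders - check which ones Hyper-V actually created on your machine first):

```
rem List all connections and their enabled/disabled state
netsh interface show interface

rem Re-disable the duplicate virtual connections in the base O/S
netsh interface set interface name="Local Area Connection 3" admin=disabled
netsh interface set interface name="Local Area Connection 4" admin=disabled
```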

Update: In Windows Server 2008 R2 you can untick the "Allow management operating system to share this network adapter" option in Virtual Network Manager to remove these duplicated connections from the base O/S so that updates and patches applied in the future do not re-enable them.
