Random Disconnected Diatribes of a p&p Documentation Engineer
I don't know how I manage it, but I seem to continually find myself trailing behind in this ever-changing world of digital technology. After the problems of a few weeks ago with a failed Media Center box (which, it seems, can't be fixed) I've finally got the new replacement machine up and running. We've been magically transported from the gray and disappointing confines of Media Center 2005 into the vibrant and exciting new world of Vista Media Center - just a month before Windows 7 is released. I suppose our only hope of actually catching up with O/S releases will be if next door's toddler happens to shove a slice of buttered toast into the DVD drive so we need to buy another new one.
But I must say, the new box is rather nice. It's basically a Shuttle system specially configured by an outfit called I.US to run as a fully capable Media Center. There's a twin digital TV tuner and some software to help Media Center go to sleep and wake up again, a Blu-Ray drive (though Media Center can't show Blu-Ray disks so you have to use the alternative player they provide for them), and a nice wireless keyboard. It came with a suitcase full of extra cables and stuff, and a natty USB wireless network dongle (though we are mainly hard-wired in our house). It looks pretty and, nicest of all, is extremely quiet. Yet it runs Vista and Media Center well, with few indications of performance issues.
And one bonus is that I finally figured out (with some help from other people's blog posts) how to get rid of the annoying auto-login problem with my wife's Vista laptop. Since I set it up, it's been refusing to do auto-login with a "wrong password" error, despite me fiddling endlessly with the limited settings in the Vista account management dialogs. But I had to resolve the problem so I could get an account to auto-login on the new Media Center box. The I.US people do a full setup of the O/S, including registering it and running a series of tests such as copying and burning a DVD, tuning the TV cards, and configuring stuff like updates, UAC, and Media Center options.
They create a single admin account with no password, so the system automatically logs on. However, if you want to interact with other machines on a Server 2008 domain (even though you can't actually join the domain), you are stuffed because domain policy prevents the use of blank passwords. Rather than change this behavior, or risk breaking something by changing the default account password, I created a second account on the Media Center box with the same credentials as a user account on the domain so that I can do backups and access the media storage folders on the network that have "Everyone" permission. As I discovered, if you create a mapped drive using a different set of credentials from the logged-on user, it conveniently forgets to use them when it tries to map the drive at logon. Most annoying.
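If you do need a mapped drive with alternate credentials to survive a logon, one workaround is to re-map it from a batch file that runs at startup, passing the credentials explicitly. This is only a sketch - the server, share, drive letter, and account names are made-up examples, and embedding a password in a batch file is a security trade-off in itself:

```bat
@echo off
rem Re-map the media share at logon with explicit credentials, because
rem Windows won't reuse alternate credentials for the mapping on its own.
rem Server, share, and account names below are hypothetical examples.

rem Remove any stale mapping first, so the command doesn't fail with
rem "local device name is already in use".
net use M: /delete /y 2>nul

rem Map the drive, supplying the domain account explicitly.
net use M: \\MEDIASERVER\Media /user:MYDOMAIN\mediauser MyP@ssw0rd /persistent:no
```

A shortcut to this in the Startup folder does the trick; on some editions of Windows, net use with the /savecred option can prompt for and cache the credentials instead of leaving them in the file as plain text.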
I guess it's another indication of how Vista hides stuff and tries to make things too easy in that you have to go through a convoluted process to get an account to auto-logon when there is more than one, or when it has a password. It involves a command line option to open a hidden wizard, and some unintuitive setting and clearing of checkboxes and selecting stuff in a grayed out list box. For details, see Automatic logon at startup in Vista. It seems that this process prepares the account for auto-logon by updating the same section of the Windows registry that you used in XP to make this work, but it stores the password somewhere safe rather than as text in the registry. Interestingly, after configuring it, you can't just do a "Switch User" and log on with that account. You have to reboot, but then it all seems to just work.
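For reference, the command that opens the hidden wizard, and the Winlogon registry values it ends up maintaining, look something like this (a sketch to run from a command prompt; the wizard itself does the real work):

```bat
rem Open the hidden user accounts dialog (Start | Run also works):
control userpasswords2

rem The wizard updates the same Winlogon values that XP used. You can
rem inspect them with reg.exe from an elevated command prompt:
reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon
reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultUserName

rem Unlike XP, there's no plain-text DefaultPassword value here; Vista
rem squirrels the password away as an LSA secret instead.
```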
One hang-up was the fact that the home-grown custom photo screensaver we've used in Media Center 2005 doesn't work in Vista. However, there is a nice photo screensaver in Vista with lots of clever display options. Except on the new box, the screensaver settings dialog said that my graphics card "does not support themes". So I could only use the nasty basic version. A few forum posts revealed that the video card needs to support "Vertex Shader 2.0", but that didn't bear any resemblance to the meaningless gobbledygook options in the Nvidia control panel.
However, it turns out that Vista also requires a performance index higher than three, yet it doesn't actually do a check - it relies on the results of a previous calibration by the "Is My Computer Any Good" wizard in Control Panel. When I opened that, it revealed that my whizz-bang super-fast (and rather expensive) new box scored just 1.0. However, after running it again I got 4.8 - maybe I even qualify for a place in the "my computer's faster than yours" forums now. And, back in the screensaver properties dialog, I've got a choice of any theme I like! Didn't anyone think that this would happen when they were designing it? I suppose it does make sense in some way, but why not provide a link to run the wizard, or at least a note to explain the consequences of this behavior? The help file doesn't suggest running the wizard until you click through three pages. It mainly thinks I should just go out and buy a better video card.
Strangely enough, this "didn't anybody think this was going to happen" scenario also kind of coincided with a newspaper report I was reading this week. It seems that, here in Britain, we are "the swine flu capital of Europe" with over 10,000 new cases occurring every day. The men who supposedly run the country took immediate steps to limit our suffering by setting up a "swine line" call center with 3,000 staff. Of course, none are doctors so they work from a script to determine if the caller has swine flu. Helpfully, they published the symptoms in the script in all the national newspapers and in leaflets sent to every household. If you think you've got it, you call them, answer the questions, and they give you a code number that your "flu buddy" can take to one of the special "swine flu centers" to get your dose of Tamiflu and a no-questions-asked sick note for a week off work.
Turns out, as you'd probably expect, that most of the people using it are just looking for a free week off work. In fact, the peaks in the graph of "new infections" neatly coincided with the weeks when the weather was nice. I hear that seaside resorts and amusement parks are all reporting record takings this season, so probably there are only seven people in England who actually have flu. However, according to "a Government spokesperson", the "skivers" will have "a real problem" when the flu does its usual trick of reappearing in the autumn, and they really do catch it. Though I suspect it will be their co-workers and the people they share the bus and train with that will actually have the real problem...
But at least we'll be able to watch TV now when we're stuck home with our Tamiflu tablets, hot lemon drinks, and (for medicinal reasons obviously) a few whisky chasers.
In between the usual spates of frantic two-fingered typing of exciting new guidance this week, I've been attempting to expand my brain to the size of a small asteroid (with appropriate apologies to Douglas Adams fans, the size of a planet seems a rather optimistic aim). All this comes about because an increasing amount of stuff in the project I'm working on at the moment, the upcoming version of Enterprise Library, depends on new whizz-bang features of the .NET languages such as lambda expressions, nullable types, anonymous delegates, and implicit typing. As an upgraded VBScripter, much of this might as well have been written in Klingon for all the sense I could make of it.
It means I've had to do a crash course in advanced C# so that I don't look too much of an idiot at product meetings and demos by asking stupid questions such as "did you miss the word 'delegate' out there?" or "shouldn't the 'select' bit come first?". So I've been reading (and re-reading) a book by fellow Englishman Jon Skeet called "C# In Depth" (ISBN 1933988363), and I can definitely recommend it. It skips the usual "What .NET is", "How to do foreach loops", and the history of the Internet; and dives straight into the exciting new stuff in C# 2.0 and C# 3.0. It's well written and understandable to mere humans, yet delves deep into the topics and really makes sense of them. The only criticism I have is the overuse in almost every chapter of the excruciatingly awful term "syntactic sugar" when referring to the new language features.
And I've even had to order an equivalent VB.NET book as well so I can figure out how to do code sample translations for the docs, and write my own demo code in both languages without just cheating and using Telerik's useful Code Converter site. It will be interesting to see if the combination of explanations from two different authors, in two different programming languages, makes the topics clearer or just more incomprehensible. Maybe after I've read them both a few times, my brain will be approaching small moon size. Perhaps I can even persuade people that my increasing bald patch is due to "syntactic squeeze" inside my head forcing out the few remaining gray hairs.
But what really prompted this week's doubtful diatribe was a throw-away comment in Jon's book. He says (though I'm not allowed to quote the actual words due to copyright restrictions) that developers previously had to rely on manually written documentation, whereas the syntax of the recent versions of languages such as C# 2.0 and 3.0, combined with the increasing capabilities of IDEs such as Visual Studio, means that this is no longer required. He suggests that documentation is often incomplete or inaccurate, and that developers rarely read it anyway.
This is kind of worrying, especially as a large part of my job is to create documentation for code projects, rather than just architectural guidance. And it seems that the bits developers do rely on most are the API reference sections that simply list and describe the classes and their members - and these are built by an automated process from the generated assemblies, XML documentation files, and PDB files. OK, so this does depend to some extent on developers putting meaningful summary sections in their code for it to make much sense, but I can't see Microsoft paying me full time just to click the "Go" button in Sandcastle once a month. Do I need to start looking for a new job?
Yet, I'm not convinced that the huge amount of effort we put into producing documentation for p&p projects such as Enterprise Library is a waste of time. Yes, given a requirement to write a function to calculate some esoteric mathematical result, or interact with a Web Service, or even a whole application to do some kind of data processing, you may only need to know about the syntax and usage of elements of the language. Maybe that's what Jon was getting at. But when it comes to understanding a new technology, code library, or development framework, surely developers need some additional assistance in the form of written (or other format) guidance.
Most of the effort we put into documentation for things like Enterprise Library involves an overview of the feature so you can grasp what it does, when you should use it, and see an outline of how it works so you can understand it. Then there's information that's hard to get from looking at the code or using IntelliSense, such as configuration settings and their effects, recommendations on choosing appropriate methods and overloads, explanation of the return values and the ways you can use them, and more. I can't see how the brief descriptions in an API reference can provide this kind of assistance.
Or could it be that I'm just not a capable enough programmer to be able to do it without somebody holding my hand? Maybe I just need to get to know more people with planet-sized brains...
I read somewhere a while ago that the word "expert" comes from a combination of the two Latin words "ex" meaning "a has-been", and "spurt" meaning "a drip under pressure". I'm not sure I actually believe it, but it does seem a remarkably fortuitous match to my capabilities when it comes to the grudge matches I regularly indulge in just trying to keep my own network running. I suppose I've rambled on about my semi-competence as a network administrator enough times in the past. It's one of those areas where you think you're starting to get the knack of it, and then you realize that it's only because you haven't yet discovered all the things you don't know.
I guess it's a bit like the famous quote from Donald Rumsfeld (the US Secretary of Defence a while back), who told a Pentagon briefing that "There are known knowns; these are things we know that we know. There are known unknowns. That is to say, there are things we know we don't know. But there are also unknown unknowns. These are things we don't know we don't know." Pretty much sums up the task of network administration I reckon. I just wish I could get a list of the things I don't know that I don't know - probably there's a Web site out there with it on, but one of the things I don't know is where to find it.
Anyway, enough prevarication - time I dragged this post back to somewhere nearer to reality (or at least as near as I usually get). The background to all this is another round of network updates aimed at trying to achieve a more reliable - and faster - connection between the confines of my daily routine in a remote office perched precariously on the edge of the Internet, and the big wide world of high speed connectivity out there. In a shock move a few weeks ago, our Government took the bold step of announcing that they expect everyone in the country to have access to "high speed broadband" at a speed of at least 2Mbps no later than 2015. I suspect they're thinking we'll catch up with countries like Japan, where they can get 100Mbps without really trying.
Well, I get about 1.6Mbps on a good day over my 2Mbps ADSL connection, so waiting till 2015 for an extra 0.4Mbps seems a less than attractive proposition. And I'll still only get 512Kbps up, which is the real killer when you use VPN or need to send large documents and files. And, unless I move house to next door to the local telephone exchange, I'm not going to get anything faster over ADSL. However, our national cable company suddenly seems to have got its act together, and can now provide data connections in our area. They even offer a symmetric service, with packages from 4Mbps to 100Mbps both ways! The only hang-up is that the cheapest is around 5,000 pounds ($9,000) per year.
But they do a "business service" that's 20Mbps down and 1Mbps up for a much more realistic rate, and so I've taken the plunge and ordered it. However, I hear regular disaster tales about cable services, so I decided to keep the ADSL line as well to provide redundancy in addition to higher speed. Linksys do a reasonably priced load-balancing and failover router (the RV042) that should do the job nicely. And it will no doubt prove to be a useful source for a forthcoming diatribe on networking for dummies.
In the meantime, it means reshuffling the existing stuff around to make room; and to reconfigure the servers to provide separate connections for the Web server and DNS, which require static IP addresses, and the main network gateway that doesn't. That way I can take advantage of load-balancing for the internal network, while leaving the stuff that needs static addresses - and isn't, in reality, very busy - on the existing ADSL line. It all seemed very simple until I started to Visio the network diagram (is Visio a verb these days?), and discovered I don't have enough network cards in the servers...
So here I can sing the praises of Hyper-V. Previously I used one NIC for the host server O/S, one for the virtual internal switch, and one for the virtual external switch (as described in Hyper-Ventilation, Act III). Now I need two separate external connections, so I reconfigured the Hyper-V network to have one virtual switch connected to the internal network NIC, and two external switches connected to the other two NICs. I took the precaution of removing the VMs from Hyper-V manager first, and - amazingly - when I re-imported them, they retained all of their IPv4 settings! OK, so they did forget all their IPv6 settings, but that's not a huge job to reconfigure. And I even moved one of the VMs from one machine to another using the Export and Import tools, and it worked a treat. I have to say that going Hyper-V certainly seems like it was a good choice.
However, I came across one issue that first raised its ugly head a while ago. Hyper-V tries to be clever and ensure that you always have a working connection in the base (host) O/S. This means that, if you have more than one NIC in the machine, you end up with multiple active connections in the base O/S - and you have to disable the ones you don't need. That's OK, but I noticed that Service Pack 2 helpfully re-enabled all of the connections without telling me, and I only found out when I got error messages saying there were duplicate machine names on the network. Now I only have one connection to the internal network, so that error will never arise. If the connections in the base O/S get enabled again by some update or other factor, I'll have active connections between the external network and the base O/S.
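Until then, a quick way to check - and re-disable - those base O/S connections after an update is netsh. The connection name below is just an example, so use whatever the show command reports on your own machine:

```bat
rem List the connections and their admin state in the base O/S:
netsh interface show interface

rem Disable an unwanted connection that an update has re-enabled
rem (the connection name here is a hypothetical example):
netsh interface set interface name="Local Area Connection 2" admin=disabled
```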
One saving grace is that the duplicated connections have no fixed IP address or DNS entry, and rely on DHCP, so they won't actually work because the external router doesn't support DHCP. But if they were connected directly to an ISP, that ISP's system could provide one quite happily I guess. Would you even know about it? I asked some Hyper-V experts whether I should uninstall the duplicated connections in the base O/S, but they advised against it. It seems this is resolved in Server 2008 R2, but you probably want to keep your eye on your base O/S connections until you upgrade.
And Server 2008 isn't the only thing that's waiting for the software to catch up. As part of the upgrades, I replaced the somewhat limited (and nearly full) Buffalo NAS drive with a gleaming new NetGear 4TB X-RAID package to try and keep my backups (including the server VMs) safe. Guess what? Just like the Buffalo drive, it can't talk to Server 2008 Active Directory. The updated system software is in beta and I'm loath to try that with my precious data, so I'm back to using the drive admin account in my backup batch files. As they say in the trade (though I'm not sure which trade): "Rearrange these words into a well-known phrase or saying: Your Guys Act Together Get".
Finally (yes, there is an end to this post in sight), I discovered an interesting feature of the built-in Windows standard firewall. After moving the external DNS server from the gateway server to the Web server that lives in the perimeter network, I enabled the "DNS Service" filter in the standard Windows firewall to allow access for lookups from the Internet. As usual, after any network updates, I ran a port scan to check I hadn't done anything stupid. So it was a bit of a shock to discover that port 135 (DCOM / RPC) was open. It seems it's to allow external management of the DNS server using the MMC snap-in, and only allows connections to the DNS server executable, but I'm pretty sure that's not really something I want to allow anyone and his dog to do from their spare bedroom. Yes, I know that my external router will block port 135, and they'd need to be able to log on anyway, but it seems like I should have been told this would happen.
The answer is to configure Windows Firewall With Advanced Security (on the Administrative Tools menu). It looks a bit scary, especially as - at first glance - it seems like there are tons of enabled "Allow" rules that you definitely don't want on a public machine. But if you click the Monitoring node in the tree view you should see that the Public profile is active, and so most of the enabled rules (which specify the Domain or Private profile) are not active. Click the Firewall node to see which rules actually are active. To see (and change) which profile a connection uses, open Control Panel | Network and Sharing Center and click Customize for the connection. In my case, I disabled the DNS Service filter in the standard firewall dialog and enabled the two Public rules in WFWAS for DNS Incoming - one each for TCP and UDP on port 53. A subsequent port scan found port 135 firmly closed. Back in the standard Windows firewall dialog, the DNS Service entry now displays a "shaded tick" status.
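If you'd rather script the change than click through the console, equivalent Public-profile rules can be created with netsh advfirewall. The rule names here are my own labels rather than the built-in ones, so treat this as a sketch:

```bat
rem Allow inbound DNS lookups on the Public profile only,
rem scoped to port 53 for both TCP and UDP:
netsh advfirewall firewall add rule name="DNS Incoming (TCP)" dir=in action=allow protocol=TCP localport=53 profile=public
netsh advfirewall firewall add rule name="DNS Incoming (UDP)" dir=in action=allow protocol=UDP localport=53 profile=public

rem Check which profile is active, and list the configured rules:
netsh advfirewall show currentprofile
netsh advfirewall firewall show rule name=all
```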
All this shows how it's a really good idea to always check every update to your network - you can use a site such as Shields Up at Gibson Research for public facing machines and routers. I actually got a "perfect TruStealth rating" from them and from a couple of other sites on one connection, so I feel like a real administrator now. Though I'm sure there'll be some new unknown unknowns to discover tomorrow...
Anyone unfortunate enough to have followed my frantic ramblings over the years (through this blog and my diary from a previous life) will know that, in our house, we are fully paid up members of the modern all-singing, all-dancing, digital media and entertainment society. Well, OK, so we have a Media Center that is our main TV, DVD player, music jukebox, streaming device for our favorite saved videos, and presentation mechanism for a huge library of digital photos. We even use a photo screensaver, so we can relive those wonderful memories of the past whilst daydreaming in our armchairs in the evenings (pipe and slippers being optional accessories).
So, you can imagine the mass panic and total disruption to our daily life when, the other day, my wife pressed the button on the front to fire it up - and nothing happened. Of course, I didn't panic. Or, at least, I didn't panic immediately, knowing that it was just a glitch that the big red switch on the wall would resolve. And when it didn't, I panicked as well. But after the requisite period of running around waving our hands over our heads and howling in anguish, I came up with the solution: "No problem, we'll just buy another one."
Ha! Where did they all go? Only a couple of years ago there were dozens of different ready-built and off-the-shelf systems available, but a search of the Web revealed that now there seem to be only two - neither of which will work with our aging non-HDMI screen, nor have twin digital TV tuners. And digging deeper, it seems that they are both discontinued products anyway; in fact, if Sony is an example, the retail lifetime of a new model is about six weeks. The young sales guy at our local Sony Centre had never even heard of the two unsuitable ones I found still listed for sale at a few suppliers.
Eventually I found a company locally that advertises some nice-looking custom built machines, with a huge range of tempting options for power supplies, video cards, and the rest. But they don't have a showroom or any machines you can look at, don't do demonstrations, and only take orders over the Web. From previous experience, I reckon that a Media Center machine is definitely something you need to see working (and hear how loud it is) before parting with the not inconsiderable volumes of credit card.
But then I found a refurbished Acer 510 (like we have now) available from a London company, and snapped it up - and it turned out to be just as faulty as ours. The refurbishment obviously consisted of losing some of the screws, bending the cabinet lid so it didn't fit properly, and disconnecting the power button. So it's gone back to them. In fact, the behavior was not unlike that of our own broken one, which is currently languishing in the workshop of a local PC repair guy. I've told him he needs to get it to work, even if it means bodging an external power supply (which seems to be the fault), but he doesn't seem to hold out much hope.
The alternative looked like some consumer-related setup such as Sky Plus (satellite), or a "normal" TV with a slot for a memory card. But none of these can provide the total immersive experience of Media Center. And I've yet to find a "standard" PC that, while it might have Vista with Media Center installed, is quiet enough for the living room - or hibernates and wakes reliably to record stuff. For the last few days we've been using a very old Humax DVR that I bought in a sale when they were discontinued some years back, but the interface and capabilities make you feel like you're using a home computer from about 1985.
And then, after much more Web research, I came across the I.US range of Athlon-powered Media Center machines. The prices are a bit scary, but they get great write-ups from the hi-fi magazines (they seem to be aimed at hi-fi retailers rather than computer dealers). I even found some available from Amazon, so - with a view to regaining marital harmony - I've bitten the bullet and flashed the cash. At least it's a nice looking case, so we can use it as an ornament if nothing else.
So, here's the question: As the average Joe (and Joanna) become more and more media-immersed with their digital cameras, downloaded music, and huge wall screens, surely there should be more systems available rather than fewer? I can't even find systems that use the competitors to the Windows Media Center software (Dell did have a version at one time). And why don't those cinema sound systems or digital video recorders that seem to be obligatory with modern TVs have facilities for seamlessly displaying photos and playing music?
Footnote: Interestingly, I've had replies from a couple of companies that do Media Center systems, which never showed up in my endless Web searches. One company (Russound) tell me they tried to get into the market but just couldn't sell them. Maybe it's just too specialist an area - I know only one other person who uses Media Center. Yet, once you're into it (and accept that it is, after all, a computer so it does need some TLC at times), you'll never go back to "ordinary" TV again.
FootfootNote: I also found this site that UK readers may be interested in if searching for a ready-built Media Center system: http://www.mediacenter-tv.co.uk/. I don't have any other information about them, but they do seem to list plenty of highly configurable systems.
FootfootfootNote: You might also like to take a look at http://www.media-centre-pc.co.uk/index.php?dispatch=categories.view&category_id=165 and http://www.vivadi.com/Media%20Centres.html.
It's a strange experience when you open the curtains in the morning to be faced by men in high visibility jackets and hard hats only a few yards away, and 30 feet above the ground. Mind you, the noise made by the assortment of cranes, diggers, and other plant they use - combined with regular hammering and occasional swearing - means you don't get a lie-in in the mornings. I've even got to know most of them, and give them a cheery wave as I try and convert from half-asleep to some state of semi-awakeness. Though they do seem somewhat reluctant to wave back at a zombie-like character with a dragged-through-a-hedge-backwards hairstyle, and still adorned in a bright blue check dressing gown.
What's interesting, though, is how they seem to build houses these days. In some ways, it's quite reminiscent of the agile process, which we follow here at p&p. OK, so they do have a detailed plan when they start, but you can see the way that this gradually morphs as they turn it into bricks and mortar. And even more so with some of the actual construction processes they follow. For example, the bricklayers leave slots in the wall for the scaffold poles as they build upwards, but obviously nobody told them how long a scaffold plank is - so they leave slots at seemingly random intervals. This means that the scaffold guy has to bodge together extra bits so the joints between the planks are properly supported. Maybe nobody thought of implementing an IScaffoldPlank interface.
And they leave nice neat holes in the wall for the pipes and stuff as they build, but they never seem to be in quite the right place. Though they do have metal frames they build into the wall for the windows, yet they tell me they don't order the windows until they've measured the holes after the brickwork is finished. So even if they do have specifications, they don't actually trust them. I suppose it's their equivalent of test driven development.
Agile development encourages completion of iteration tasks as chunks of a complete project, and the bricklayers obviously follow this technique. They even do paired development (funny how you never seem to see just one). And in an effort to complete their part of the process during the current iteration, they even build the extremely precarious "sticking out bits" on the corners that finish off the end of the fascia where walls, tiles, and gutters meet. And then, when the carpenters lift the roof trusses into place, they invariably knock these off. And even when they don't, the bricklayers seem to have to come back and alter them because they don't line up with the other bits.
After the roof trusses are in place, the roofers arrive and carry bundles of tiles up onto the roof and place them in nice equally-spaced heaps over the whole roof area. Yet when they come to lay the tiles onto the roof, they have to start at the bottom so that they overlap properly. By the time they are a third of the way up the slope, hammering in the fixing nails has the occasional interesting side effect of causing tiles to slide off the heaps above them and crash down onto the scaffolding - often narrowly missing them. Mind you, the experts seem to be able to catch these as they go past, like it's some kind of game. The rest end up in pieces in my garden.
You'd think that it would make sense to just carry them up onto the roof as they needed them. Yes, I know they have to build up equally on both sides of the slope (or the roof will fall off the walls with the unbalanced weight), but I reckon they do it so that they can see if the roof will actually take the weight of several hundred concrete tiles. The fact that they seem to put them all up there and then go away for a couple of days only reinforces this opinion. Probably it's their implementation of experimental spikes.
Still, it's going to seem very quiet here in a few weeks time when they've finished. I know you can buy CDs that just play the relaxing sounds of waves and birdsong. I wonder if I can get one that plays building noises...