Random Disconnected Diatribes of a p&p Documentation Engineer
OK, so we don't actually make cheese sandwiches here at p&p. Well, as far as I know we don't (but if we did, they'd probably be the best cheese sandwiches in the world...). When I'm over in Redmond I have to stroll across the bridge to Building 4 and buy one from the canteen, though it's worth the effort because you get four different kinds of cheese in it - as well as some salad stuff. Only in the USA could someone decide that you need four different cheeses in a sandwich. Here in England a cheese sandwich is basically a chunk of Cheddar slapped between two slices of bread. Take it or leave it. Maybe it's because there is always so much choice over there, and people can't make up their mind which cheese to have.
And why is it so hard to order stuff in the States? I usually find it takes ten minutes just to order a coffee in Starbucks 'cos I have to answer endless questions. Do you want 2% milk sir? No, fill it up to the top please. Any syrup in it? No thanks, I want coffee not a cocktail. What about topping? Some froth would be nice. Am I going to "go" with it? No, I'll just leave it behind on the counter. In fact, when we go out for a meal I like to play the "No Questions" game. Basically, this involves waiting till last to place your order, and specifying all the details of your required repast in one go so the waiter doesn't have any questions left to ask. I've only ever won once, and that was in a pizza takeaway. I think they dream up extra imaginary questions just to make sure they get the last word.
Anyway, as usual, I'm wandering off-topic. We really need to get back to the cheese sandwiches. So, as a developer, how would you go about making a cheese sandwich? My guess would be something like this:
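Something along these lines, anyway (a sketch, of course - every developer has their own recipe):

```text
1. Open bread bag
2. Remove two slices; close bag
3. Butter one side of each slice
4. Slice cheese (one kind is plenty)
5. Lay cheese on the buttered side of slice one
6. Place slice two on top, buttered side down
7. Cut diagonally
8. Eat
```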
Looks like a good plan. So how would we do the same in the documentation department? How about:
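Knowing how we work, probably something like this (again, just a sketch):

```text
1. Write the sandwich as a set of separate Word documents
2. Run a tool to convert the Word documents into bread and cheese
3. Assemble the sandwich by hand, then compile it
4. Build a different sandwich for every plate it might be served on
5. Customer wants pickle? Edit the Word documents and go to step 2
```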
Yep, it seems to be a completely stupid approach. But that's pretty much what we have to do to get documentation out of the door in the multitude of required formats. HTML and HxS files for MSDN, CHM files for HTML Help, merge modules for DocExplorer, and PDF for printing and direct publication. Oh, and occasionally Word documents, videos, and PowerPoint slide decks as well. Maybe you haven't noticed how complicated the doc sets in Visual Studio's DocExplorer tool and HTML Help actually are? There's fancy formatting, collapsible sections, selectable code languages (and it remembers your preference), multiple nesting of topics, inter-topic and intra-topic links, a table of contents, an index, and search capabilities. It even opens on the appropriate topic when you hit F1 on a keyword in the VS editor, click a link in a sample application, or click a Start menu or desktop shortcut. Yet it all starts off as a set of separate Word documents based on the special p&p DocTools template.
Yes, we have tools. We have a tool that converts multiple Word docs into a set of HTML files, one that generates a CHM file, and one that generates an HxS. But they don't do indexes, which have to be created by hand and then the CHM and HxS files recompiled to include the index. Then it needs a Visual Studio project to compile the HxS into a DocExplorer merge module, and another to create a setup routine to test the merge module. But if you suddenly decide you need to include topic keywords for help links, you have to edit the original Word documents, generate and then post-process the individual HTML files, and start over with assembly and compilation.
We have a tool (Sandcastle) that can create an API reference doc set as HTML files from compiled assemblies. But if you want those files indexed you need to modify all of them - often several hundred - though we do have another (home-made and somewhat dodgy) custom tool for that. Then you have to recompile it all again, rebuild the merge module, and rebuild the setup routine.
What about PDF? The starting point is the set of multiple Word docs that contain special custom content controls to define the links between topics, and there appears to be no suitable tool to assemble these. So you run the tool that converts them to a set of HTML files, then another dodgy home-built custom tool to process the HTML files and strip out all the gunk PDF can't cope with. Then you build a CHM file and compile in a table of contents and a much-tweaked style sheet. Finally, run it all through another tool that turns the CHM into a PDF document.
Need to make a minor change to the content because you added a late feature addition to the software? Found a formatting error (that word really should be in bold font)? Got a broken link because someone moved a page on their Web site? Found a rude word in the code comments that Sandcastle helpfully copied into the API reference section? For consistency, the change has to be made in the original Word document or the source code. So you start again from scratch. And it seems that there are only two people in the world who know how to do all this stuff!
Well, at least we've got a process that copes with changing demands and the unpredictability of software development. But it sure would be nice to have it all in a single IDE like Visual Studio. Even really good sandwiches with four different cheeses don't fully soothe the pain. Mind you, I hear from RoAnn that they have cheese sandwiches on flatbread in the fancy new Building 37 cafeteria that they toast in a Panini grill. Being foreign (English), I'm not sure what "flatbread" actually is - surely if they used any other shape the cheese would fall out? Reminds me of the old story about the motorist who turns up at a repair shop and is told that the problem is a flat battery. "Well", says the customer, "What shape should it be?"...
I thought I’d better start off this week with that well-known email disclaimer "INAG" (I’m Not A Golfer). Mind you, when I was a lot younger and fitter, I did occasionally caddy for a few affluent visitors to the R.A.F. Changi course in Singapore. Though when I say "caddy", what I actually mean is "carry the bag and look for lost balls", but you get the drift.
Anyway, the story goes that two of these affluent golfers were walking down the third fairway one day when one said to the other "I hear that John has bought another set of golf clubs." "Really", replied the second golfer, "TaylorMade? Wilson? Dunlop?" "No", said the first, "St. Andrews, Augusta National, and Royal Troon."
Sorry about that ... but isn't it strange how similar golf is to developing software? They both involve a very limited start location, provide various ways to progress, and have an extremely hard-to-reach destination.
I mean, as a golfer, you can start each hole from anywhere you like as long as it's within a patch of grass about 4 yards square. As a software developer, you start with whatever system the customer is running. If you are lucky, it will be some reasonably modern hardware, a reasonably up-to-date operating system, and a database that is at least contemporary with modern thinking. If you are unlucky, you'll start in 1980 with an old lump of big iron and a database that uses text files with fixed-width fields ("...and can you make it do Web services please...?").
As to progressing towards the target, thoughtful golf course designers generally provided a range of routes that include plenty of rough, some out of bounds if you're lucky, and usually a nice selection of bunkers. In fact, one guy I used to caddy for never once ended up at the green with the same ball he started the hole with, and religiously navigated via each of the bunkers in turn. We sometimes had to come back the next day to finish the round.
Likewise the software developer has a vast range of tools, development environments, software frameworks, and languages to choose from. Mind you, for the golfer, the rough and the bunkers are generally in the same place when they come back next month, next year, or even five years later. This is, of course, unheard of for the developer. They'll be lucky if the software, tools, and languages are the same next week. In a year's time they will be "legacy", and in five years' time they will be obsolete.
Then there’s the target. OK so it can be tough for golfers. Course managers have a habit of moving the pin around, so the player may have to exert extraordinary powers of observation by standing on the edge of the green and scanning up to 20 degrees left and right to find the large pole with a flag nailed to the top. But it only takes half a dozen putts to get the ball into that little white cup, and you know that you are there.
However, developers probably never even get to see the hole at all. Effectively arriving at the green with a prototype, they are likely to be greeted with "Why does this button do that?" "How do I tell if a customer has paid their last bill when I'm entering an order?" "Three of our operators are allergic to Arial font." or even "Why does it keep asking me to enter a customer order? We asked for a stock control system".
Maybe we should all investigate taking up golf as a profession instead...
I was party to a discussion a couple of weeks ago that wandered off topic (as so many I'm involved in seem to do) into the question of whether a programmer is actually "OO" or not. I guess I have to admit to being a long-time railway (railroad) fanatic - an unfortunate tendency that has even, in the past, extended to model railways. So in real life (?), "OO" is the gauge of a model railway. But then someone suggested that many programmers, especially those coming from scripting languages such as classic ASP, are more "OB" than "OO". It turns out that what they mean, I'm given to understand, is that a large proportion of programmers write code that is object-based rather than object-oriented.
I suppose that I've generally fallen into the "OB" category. Having proudly mastered using objects in my code (starting, I guess, with stuff like FileSystemObject in ASP), it was kind of disappointing to realize that I'm still a second-class citizen in the brave new world of modern programming languages. I mean, when I use Visual Basic in .NET I purposely avoid importing the VB compatibility assembly, and I use "proper" methods such as Substring and IndexOf rather than Mid and InStr. I even use .NET data types, such as Int32 instead of Integer, though I regularly get castigated for that. Especially when I write C# code and use Int32 instead of int, and Object with a capital "O". As I discovered, if you want to start a "discussion", tell a seasoned C# programmer that they are supposed to use the .NET classes instead of all those weird data type names.
I'm told that the compiler (C# or VB) simply treats the language-specific type name as an alias for the underlying .NET type, and that it makes sense for programmers to use the types defined in the coding language rather than specifying the .NET data types directly. Does it make a difference? I'm not clever enough to know for sure, but it can't affect performance because the compiler emits exactly the same IL code either way. My take is that more and more programmers are having to (and are expected to) cope with more than one language. If you do contract work, and prefer to work in C#, do you turn down a job working on a project originally written in Visual Basic? If we are in for a global recession, as many suggest, can you afford to turn work away?
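If you don't believe the aliasing story, a couple of lines of C# demonstrate it - "int" and "System.Int32" really are one and the same type (a minimal sketch; the class name is just for illustration):

```csharp
using System;

class AliasDemo
{
    static void Main()
    {
        int keywordStyle = 42;    // declared with the C# keyword
        Int32 classStyle = 42;    // declared with the .NET type name

        // "int" is simply the C# alias for System.Int32, so the
        // compiler emits identical IL for both declarations.
        Console.WriteLine(typeof(int) == typeof(Int32)); // True
        Console.WriteLine(keywordStyle == classStyle);   // True
    }
}
```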
And what about those C# programmers who tell me they "can't understand Visual Basic"? I can imagine that, if you've never used it, you would find it hard to write VB.NET code from scratch, though it surely can't take long to figure out that you use "End" instead of a closing curly. OK, so the OO features have quite different keywords (like "Friend" and "MustOverride"), but it's a lot easier than learning Portuguese (unless you're Portuguese, of course) or any other foreign language. Hey, almost all of it is .NET classes. Mind you, someone to crack you across the knuckles with a stick whenever your fingers stray towards the semi-colon key would help. And I can't see how any C# programmer can say they can't read Visual Basic code. Again, the class modifiers and inheritance keywords are a bit different, but they aren't that hard to figure out.
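To labor the point, here's a tiny class in C# with the Visual Basic spelling of each keyword alongside in the comments (the class itself is invented, but the keyword pairs are real):

```csharp
// C# on the left; the Visual Basic equivalent in each comment.
internal abstract class Sandwich            // Friend MustInherit Class Sandwich
{
    public abstract int CheeseCount();      // Public MustOverride Function CheeseCount() As Integer
}                                           // End Class

internal class RedmondSandwich : Sandwich   // Friend Class RedmondSandwich / Inherits Sandwich
{
    public override int CheeseCount()       // Public Overrides Function CheeseCount() As Integer
    {
        return 4;                           //     Return 4
    }                                       // End Function
}                                           // End Class
```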
OK, so it may sound like another rant against C# programmers, but I can assure you it definitely is not. After all, I'm half C# programmer, and I really do like the language. Weirdly, though, most of what I write (in terms of programming rather than documentation) is in VB.NET - I suppose it's force of habit and the comfort factor having come from classic ASP. Yet most of what I read (docs and code) is in C#. And, as they say in the movies, some of my best friends are C# programmers...
I suppose what I really want to know is: what's the test to see if you are "OO" rather than "OB"? Is there a fixed number of interfaces and base classes you have to include in a project? Do you have to have at least 10% inherited properties and methods, and use polymorphism at least twice per 500 lines of code? If you forget to refactor one class, or use an array instead of a generic list, does that automatically disqualify you? Perhaps there is a minimum set of design patterns you have to implement per application. And what about the fact that I still use FileSystemObject in an old Web site I never got round to converting from classic ASP? Or is it that, secretly, you can only really be "OO" if you write in C#, Java, C++, and other "proper" languages...?
A couple of initially unconnected events last week conspired to nudge my brain into some kind of half-awake state where it combined them into a surreal view of "automatic" stuff. One of the events was the return from Tina, our editor and proof-reader, of my article about the Team System Management Model Designer Power Tool (a product that, thankfully, I'm legally permitted to refer to as just "TSMMD" - and will do so from now on). The second event was deciding that I ought to get a laptop sorted ready for an upcoming trip to Redmond. The combined result is some manic ravings on the meanings of stupid words, and the fact that Windows Vista obviously hates me.
TSMMD is a new add-in for Visual Studio Team System that I have been documenting for the previous few CTP releases. It's a really neat tool that allows you to build management models that describe the health states and instrumentation of an application, and then generate the appropriate instrumentation code as part of your VS project (see http://www.codeplex.com/dfo/ for details). The article is one of those "About..." and "Getting Started" things that compares what the product does to some commonplace everyday situation - in this case the way repair shops can do computerized diagnosis of faults in a modern motor car. So the article came back with editorial comments such as "Err...what does this mean?" where I had written stuff like "...without having to look under the bonnet" (Tina asked if I was taking part in an Easter parade), "...hatchback or saloon car" (is this one that has a drinks cabinet built in?), and "...look for some tools in the boot" (surely that's where you keep your feet?). And, of course, "When you say 'motor car' do you mean 'automobile'?"
Some time ago I rambled on about the way that the culture and language in the US and UK are so very similar, and yet so subtly different (see Two Nations Divided by Light Switches). But the one area where almost everything seems to be different is in the realm of motoring. I mean, to me, a car starts with a bonnet and ends with a boot. Just like a person (and not necessarily one in an Easter parade). Makes perfect sense. As I said to Tina, here in England we just keep our wellingtons in our boots... except when going to a car boot sale. Why on earth would a car start with a hood and end with a trunk? Sounds more like an elephant going backwards. And what do you call the bit of an open-top sports car that keeps the passengers dry when it rains? Surely that's the hood (as in "It looks like rain, better put the hood up"). Still, I suppose cars have plenty of bits with stupid names anyway. I know that "dashboard" comes from the days of carts and wagons where they nailed a plank across the front to stop galloping (dashing?) horses' hooves splashing mud onto the driver's new breeches.
Notice how I avoided saying "trousers" there. I once heard a conference organizer ask all the speakers to wear black pants for their presentations as part of a consistent dress code. I wondered how attendees would know what color underwear I had on. But that's a whole different topic area.
Anyway, getting back to cars, I suppose we now refer to the bit with the speedometer on it as the fascia. However, we still have a "fan belt" even though the radiator fan is electric and the belt drives the alternator and the pumps for the power steering and the air conditioning instead. And it isn't just the car itself. What about how, here in the UK, we have "slip roads" on our motorways? It's not like they are surfaced with super-smooth tarmac (asphalt) so you slide around. I guess the idea is that you use them to slip into the traffic stream (in which case, the way most people drive here, they should be called "barge-in roads"). The US "on-ramp" and "off-ramp" make more sense, even when they don't go uphill or downhill.
And why "freeway" in the US? My experience of driving in Florida is that you have to carry $20 in loose change for the toll booths that they planted every two miles. Although, around where I live, when they build a new bypass round a town or village, everyone refers to it as "the fast road". Even when there's traffic lights every 20 yards and a half-mile tailback most of the day. Again, the words conspire to confuse. "Traffic lights"? A nice sensible term I reckon. Yet when we were working on the Unity Application Block they wrote a sample they called "Stoplight". As the UI was just three oblong colored boxes it took me a while to figure that this was the US equivalent. Is it still a "stop light" when it's showing green?
But enough motoring invectives. The other conspiring event this week was battling, after a few months away from it, with Vista. I have to say that there are lots of things I really love about Vista, but it seems to have been designed to annoy the more technical and experienced computer user. A lot of the aggro is, I know, attributable to my long-established familiarity with XP and its internal workings. Vista is no doubt ideal for the less experienced user, as it hides lots of complexity and presents functionality that works automatically.
Yes, I've finally given up and turned off UAC so I can poke about as required and use weird scripts and macros required for my daily tasks. But it would be really nice to have an "expert" mode that lets you see (and change) all the hidden settings without having to go through several "inexperienced user" screens. I mean, it keeps complaining that my connection to the outside world through my proxy server is "not authenticated" even though it works fine, and I can't find any way to change this. And it won't let my FTP client list files, even though it works fine on the XP machine sitting next to it.
What finally got me going this week, however, was backing up the machine. I carry a small USB disk drive around with a full image of the C: drive that I can restore if it all falls over. For years I've been using TrueImage (see diary entries passim) and it works well. However, when I bought this laptop I paid extra for Vista Ultimate so I'd get the proper built-in backup software to do disk imaging. I imaged the machine when it was new, but the configuration is much changed since then so I thought I'd do a complete new backup. As there isn't a lot of space on the USB drive, I deleted the existing backup first. But Vista still thinks it's there - obviously there's some secret setting somewhere in the O/S (and it's not in the Registry, 'cos I searched there) that makes it think I have an existing backup. So it will only do an incremental backup - there is no option to say I want a whole new one.
And it also insists on backing up the drive D: restore partition, even though I don't want that backed up. So I ran it anyway, but afterwards all it said was "the backup is complete". Did it do an incremental one or a full one? Did it skip stuff that it thinks is in the backup image I deleted? Will it actually restore to give me a working machine? In the end I deleted the backup and used TrueImage (I've got version 10 and it works fine with Vista). It asks you everything it needs to know to create the kind of image you want, and then just does it. And I've restored machines in the past using it, so I feel comfortable that I can get back to where I was when the sky falls in.
You see, this is where I worry about "automatic" stuff. For some things it seems like a really good idea, and often it "just works". Drifting back to cars, my latest acquisition has automatic climate control that just works. You can, if you wish, dive into the settings and specify hundreds of individual parts of the process, but why bother? Just set the temperature you want and let it get on with it. The car also has automatic window wipers (notice I avoided saying windscreen or windshield), which is great. They wipe the window when it rains.
But it also has automatic headlights that come on when it's dark. And this feature is turned off because I always worry that they'll come on just as I get to a junction and someone will think I'm flashing them and pull out right in front of me. Notice the important point. You can turn off the automation if you don't want it...
Here in our quiet little corner of the People's Republic of Europe, our Government decided a while ago to flog off the radio spectrum in order to pay for their countless spin doctors, pointless focus groups, endless ministerial jaunts, never-ending quangos, and failed experiments with Socialism. In return, they gave us the opportunity to enter the brave new world of Digital Broadcasting. And, rumor has it, they will eventually build enough transmitters so that those of us who don't live in London will actually be able to receive it. Last I heard, the target date is 2013. Meanwhile, I've had to fill the entire attic of our house with bits of bent aluminium to try and drag some scraps of DAB (Digital Audio Broadcasting) out of the airwaves and down to the kitchen so my wife can have rock music on loud enough to drown out the sound of me washing the dishes.
Anyone brave enough to have tackled the diary entries from my previous life will know that, up until now, we've been using a rather nice stand-alone Soundbridge Internet Radio to get a constant stream of rock music that generally smothers my unfortunate domestic noises. However, since the BBC released their iPlayer, the fragility of the copper-wired Internet in our part of the country has been exposed for all to see. Now all we get from Virgin Classic Rock in the evenings and weekends is "It looks like you can't get our digital stream..." followed by several seconds of rebuffering and then another five minutes of music. So the boss gets to hear me clanking her best plates together.
Those distant diaretic ramblings also documented the problems of trying to get Microsoft Media Center working with the exciting new digital technologies in our forgotten little corner of Merry Olde England (well, actually almost the geographical center, but still a long way from London). Suffice to say that it basically involved turning the roof of our house into a miniature version of Jodrell Bank, but at least we now get (depending on weather conditions) around 80 channels of digital stuff on the TV. Which, apart from 30 TV channels that seem to still be showing programmes from 1973, includes loads of radio channels. As I couldn't figure a way to drag the 42" wall-mounted screen into the kitchen every time I did the dishes, I suggested to my wife that she just turn one of these up really loud and pretend we live next door to a rock festival, but she wouldn't go for that.
So, for her birthday the other week, I treated her to a shiny new DAB radio. It's a really neat thing that consists of five different lumps of plastic - two speakers, a control unit, a combined bass woofer, and a separate tiny little matrix display thingy that you stick on the wall. This means that I can hide everything but the display thingy on top of the cupboards out of the way of the soapy fountain that is me doing the dishes. And combined with some low-loss cable and the aluminium-filled attic, we can actually get Planet Rock and a couple of dozen other stations. In fact, there's even one that just plays birdsong all day!
One really neat thing with DAB is that you can view the secondary information stream. Saves loads of arguments about which band it was that recorded the track you're listening to. It even tells you the name of the program you're tuned to, and what's coming next. I guess this is pretty much underwhelming to those who have already had DAB for ages, but - as late arrivals to the digital scene - we got really excited about it. Shows how interesting my life is most of the time. However, after playing with this for a while, I suddenly realised that the people who make the hardware obviously don't do any field testing of their products. I mean, the options for the "extra info" on the neat little display thingy are the MUX channel name (such as "DigitalNetwork1"), the time and date (in case you don't own a clock), the signal strength, the bit rate, and the rather nice scrolling text messages.
Now, as a developer, what would you do? Have it remember what you selected last time and go back to that option automatically? Have it default to the rather nice scrolling text messages? Allow the user to select which they want as the default in the setup menu? All, in my opinion, obvious options. But no, they decided that it should always default to the MUX channel name every time, and you can't change this behavior. You have to press "Info/Display" twice every time you turn it on or change channel. Imagine if Windows started with a DOS prompt every time and you had to type "WIN" and click "Yes" to get to your desktop. Err... a bit like Windows 3.0 in fact. Maybe the radio's O/S developers were still using that.
And here's another thing with digital radio. It can't tell the time. With the old-fashioned steam radios of the FM and AM variety, the time signal was pretty much accurate. OK, so if you lived a long way from the transmitter you were maybe a picosecond or two behind as the radio waves fought their way through the clouds and trees, but it was near enough. Now it's a second or two behind because it has to go through some magic process to get converted to digital and back again. How do I know? Because the kitchen clock is one of those radio-controlled things. Supposedly it uses proper radio waves so as to be accurate to a fraction of a second. Even when those waves have to come all the way from Rugby, which is nearly 60 miles away. And, of course, the same happens with digital TV. I recently read a letter in the paper from someone who has three DAB radios as well as digital TV, and they said they all vary by several seconds. So in our brave new digital world, you never actually know what time it is. Maybe that's what they mean by Internet time - everyone has their own version.
I suppose I could just use the fancy radio-controlled watch that my wife bought me for Christmas instead. Except it has 97 functions and only four buttons. And one of those just turns the backlight on. Every time I put it on it tells me the time in Hong Kong. I have to carry the instruction book around with me so I can reconfigure it - possibly another good example of lack of field testing. And they say that software is hard for "ordinary people" to understand. Just imagine how much fun we'll have once they get Word and Outlook to run on a wristwatch. Not only will you need to carry a box of instruction manuals around (which I guess is good for us here in the documentation team), you'll probably miss your train because you won't know what time it is, or if your time actually is the real one...