August, 2005

  • Eric Gunnerson's Compendium

    Another ASP question...


    I've been working on my website a fair bit. Working on it locally works well for some things, but the Google Maps stuff only works if it's running on my real domain, so I have to get it up to the website.

    I used "publish" to do that, which turns out to be a bit of a mistake. When you do this, VS says that it will delete files, and asks for confirmation. I assumed this was just about overwriting the files that were up there, but it's really a "scorched earth" approach which toasted every file on my website before deploying the application.

    A day later and a few $$$ lighter, I had my content back off of backup.

    So, what should I be doing? Publish is convenient in that it gets everything up there, but it takes roughly forever to do so, so it's not the solution I'd prefer. I've looked at "copy website", which might work, but I presume that I would have to a) figure out what I've changed and b) copy each file up to the server in the right place. Doing this correctly (ie getting all the files and any assemblies I use up there) is pretty tough, and I don't want a site that's sometimes published and sometimes web copied...

    Is there a better way of doing what I want to do? Or is the presumption that I will do all my development locally?


  • Eric Gunnerson's Compendium

    Untempered ambition


    Author's Note: I've provided some links to help make my case, but I suggest you read through without them first...

    It wasn't at all like Archimedes - it didn't happen all at once. It started with a gnawing at the back of the skull, an itch that you can't scratch, and then as the days pass, it worms its way into your conscious mind.

    Through a quirk of chance, we'd crossed paths before. That time he was hustling to get a new magazine going, something devoted to "the latest thing". My contacts told me that he'd made some dough running that scam in the past, but this one didn't pan out. He melted away, and I forgot about him for a few years.

    Then one day, I came across his name again. I had to give him credit this time. Blogs had been hot for years, and with the Texan's seventh win, cycling was in the news. Put those two together, and you've got a hook. Add a bit of human interest, and it's foolproof, especially when you have a friend who specializes in anatomical prosthetics.

    I've got to admit that his writing's good. Great, even. Some might say he has a gift for comedy. He had me fooled, and he could have gotten away with it for a long time, but he'd gotten what he wanted, and made it to the top. And, like so many of his ilk, he got careless.

    He even admitted it in his blog, when he wrote, "There is no single entry in this blog that is entirely honest."

    But I don't think he's going to come clean, and since I have eyewitness proof, I think I'm going to have to force his hand.

    Some are fat. Some are cyclists. But like that story of so many years ago, I'll have to be the small child here, and be the first to point out the truth.

    The Fat Cyclist has insufficient weight. 167.2 pounds does not a compelling story make.

  • Eric Gunnerson's Compendium

    Zoo Two


    A few weeks ago, Bret, one of my PM friends (well, I call people like Bret friends since they're people that tolerate my presence), told me he was looking at my bicycle climb site and saw a climb named "The Zoo". One day, he said, "I think I'd like to climb the zoo - let's set up a time and go up it together".

    I should perhaps step back and explain a bit. The Fat Cyclist has written at length (and at more length) about how to size up cycling competition. While such guides are useful if one wants to avoid human interaction, if the rider in question lives on the Eastside of Seattle, you can get all the information you need with a single question:

    "Have you ever done the Zoo?"

    There are three answers you get:

    • What's the zoo?
      This rider really isn't worth your time. Even if they can drop you on the flats, they haven't suffered sufficiently, and therefore their achievements can never rise to your level.
    • No
      This rider is no competition to you. No matter what happens on the ride, you have tried and triumphed, and therefore possess an inner strength that they are lacking. This is great consolation when they ride away from you on their big chainring.
    • Yes. It's pretty steep.
      Beware this rider. Not only have they tried and triumphed, they are playing mind games with you. The zoo is "pretty steep" in the same sense that an oxy-acetylene torch is "pretty hot", or Everest is "pretty tall".

    Cyclists have this weird thing about shared suffering. Hard climbs are always better when somebody else is suffering along with you, and if you can sucker in an unsuspecting rider who doesn't really know what they're in for, all the better.

    So I was happy to set up a time to ride the Zoo with Bret. Happy... No, that's not it, what's that word again? Ah, that's it. Disturbed. I was disturbed about it. But there's nothing to be done about it - if you've ridden the zoo, you can't wimp out when somebody else wants to try it.

    The weather this morning was perfect. About 60 degrees, and sunny. On the ride there, I toyed briefly with saying that it had been a little cold when I rode up it early this morning, but I decided that that would be too cruel. Okay, that's not really truthful. I just didn't think I could pull it off.

    The ride was about as good as my first trip up it last year. In other words, 25 minutes of suffering, but not as bad as before. Bret suffered well and made it to the top, all 1200 feet of it.

    Oh, and we had a surprise companion with us. More on that later...

  • Eric Gunnerson's Compendium

    Are you greedy or lazy?


    I wrote a "short" explanation of how greedy and non-greedy (aka lazy) work with regular expressions for a mailing list, but it turned into a couple of pages, so I thought I'd record it here.


    The default regex behavior of something like ".*" is to match as many characters as possible. So, if you write something like:

    >.*<

    And match against a string with it, the engine will give the *longest* string it can for ".*" that still results in a match. Or, in other words, the "<" will match against the last "<" in the string.

    This is pretty much backwards from the way most people expect things to work, but AFAIK it came from the fact that if you have only one kind of behavior, it had better be greedy, since there are things you can't do if lazy is the only behavior you have.

    After some time, lazy got added, so if you write:

    >.*?<

    It means "when you match, match the shortest possible string for .*".

    The type of match you choose can have a big effect on the perf you get. An example is in order.

    Assume we write:

    >.*<

    Which is a perfectly reasonable regular expression. When the engine goes to execute it, it goes from left to right. Assume this is the string we're matching against:

    >abcd<efgh<ijkl

    So, the engine starts out by matching the ">". That's pretty easy to do - it just advances character by character until it gets to the first ">".

    It then starts to match the .*, which (of course) means that it matches as much as possible. In this case, it matches:

    abcd<efgh<ijkl

    (as many characters as possible, right?) It will then try to match <, but it can't where it is, so it has to backtrack. Essentially, the engine tells the part that matched .* that the match failed, and the length that .* matched is reduced by one. So, it changes the match to:

    abcd<efgh<ijk

    And we try to match < again, and we fail again. We continue backtracking until we get far enough back that .* has only matched:

    abcd<efgh

    At which point the match for < succeeds, and we can look at whatever is after < in the pattern. There isn't anything, so the whole match succeeds. The regex works, but it can take a *long* time. If you have more than one use of .* in the expression and the strings are long, you can get regexes that would take hours or even days to run.

    You get around this by either using non-greedy ".*?" or a negated character class "[^<]*". Non-greedy is slightly more efficient, more flexible, and easier to read, so for me it's the pretty clear choice.

    So, for short strings, non-greedy is better. In this case, it will fail on "", "a", "ab", and "abc" before succeeding on "abcd".

    The converse is true for long strings - if what you're matching is near the end of the string and you use non-greedy, you'll need to keep backtracking to lengthen the match by one character each time, so greedy will be the better choice.
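
    To make the difference concrete, here's a small C# sketch of all three approaches - the pattern and sample string are chosen for illustration:

```csharp
using System;
using System.Text.RegularExpressions;

class GreedyVsLazy
{
    static void Main()
    {
        string input = ">abcd<efgh<ijkl";

        // Greedy: .* grabs as much as it can, then backtracks from
        // the end until the trailing < can match, so it stops at the
        // *last* < in the string.
        Console.WriteLine(Regex.Match(input, ">.*<").Value);
        // prints ">abcd<efgh<"

        // Lazy: .*? starts with the shortest possible match and
        // lengthens it one character at a time, so it stops at the
        // *first* < after the >.
        Console.WriteLine(Regex.Match(input, ">.*?<").Value);
        // prints ">abcd<"

        // Negated character class: [^<]* can never cross a <, so the
        // engine never has to backtrack past one.
        Console.WriteLine(Regex.Match(input, ">[^<]*<").Value);
        // prints ">abcd<"
    }
}
```

    Note how the greedy match runs all the way to the last "<" while the other two stop at the first one.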

  • Eric Gunnerson's Compendium

    Trivia: What are these the names of?


    What are these the names of?

    • Norma
    • Orion
    • Perseus
    • Cygnus

    There are two answers, one commonplace (ie one that I knew), and one quite a bit more obscure. I'm looking for the obscure one...

  • Eric Gunnerson's Compendium

    Cycling, diet and weight loss


     A post over at the Fat Cyclist (entitled "I Fear My Bathroom Scale") got me thinking.

    I first started cycling seriously a few years ago, when 9 months of being a PM and not working out had added about 20 pounds on my frame. The weight came off fairly easily, but I was hungry a fair bit, and it took me quite a while to come up with a nutrition plan that worked, both when I'm training hard and when I'm not.

    The basic problem is that if you are a recreational athlete, you need two different diet approaches. Both have the aim of keeping your blood sugar at a consistent level, but the way that you do that during (and after) exercise is very different from how you do it the rest of the time. You also need to realize that an approach that works for mostly sedentary people may be the wrong thing for you as an athlete, with Atkins being the prototypical example of this.

    There are two good books that I know that can help a lot. The first is Chris Carmichael's "Food for Fitness". Chris' hypothesis is that you should match what you eat to the period of training that you're in. That conceptually makes a lot of sense if you're on a fairly serious training regimen, but it's probably over the top for many recreational athletes. That doesn't mean that this book isn't valuable, however - it has a lot of great basic nutritional information and covers fairly well how your diet needs are different than those of the sedentary part of the population.

    The second book is "The South Beach Diet". In general, most diet books aren't very useful, but there's a lot of good science - and clinical research - behind the South Beach approach. To sum up, eat fewer processed foods and more natural foods, and you'll keep your blood sugar more constant, and therefore not be hungry all the time. I know several people who have lost good amounts of weight while not spending a lot of time hungry. There are some sacrifices here - I don't eat as much pasta as I used to, nor rice, and when I do, they're the whole-wheat varieties. Same with bread. But it's something that's sustainable.

    So, for me, I'm "South Beach" on most days, trying to eat things that will give me sustained energy. That often means eating a little more fat than you would on low-fat diets, which is a good thing in my book. There are a bunch of "south beach" brand foods in the supermarket, but the ones I've tried have been pretty poor, so I'd suggest staying with the natural food.

    I then modify on days when I work out. During workouts, my goal is to get enough glucose into my system on a consistent basis so I can burn fat efficiently. For me, this means a snack about an hour before (banana or clif bar, something like that), then Accelerade to drink now and then plus something else to munch on during stops (sometimes Clif bars, sometimes newtons). If I get it right, I'll have a nice constant stream of glucose so that I can get most of my energy from my fat stores. If I do this right, I don't get that "I've got to eat and eat and eat" feeling that Mr. Fat Cyclist (can I call you "Fat"?) speaks about in his post.

    On the first day of RSVP, I rode about 6 hours on Accelerade, a couple of clif bars, some beef jerky (sodium), and a few other assorted nibbles. That's not a lot of food, which means the bulk of my energy came from my fat stores. That's good - not only does it help with weight loss, it means that I can have plenty of energy without trying to eat a lot, which is bad - you can only expect to get a limited number of calories from eating without getting too much food in your stomach.

    I should also probably note that you may need to back off a bit if you're in a sport for weight loss. The goal is to get your fat-burning metabolism working well and to use that for the bulk of your ride - that means you need to spend most of your time in a comfortable aerobic range. If you can find a good group ride that isn't too gonzo for your fitness level, you can stay comfortably aerobic on the flats and then push yourself (if you want) on the hills. If you push too hard, you won't establish the aerobic engine that you need. Carmichael talks about this in "The ultimate ride", also a good book.

    Oh, and I use my scale, but mostly to weigh myself before and after workouts to see how I'm doing on hydration.


  • Eric Gunnerson's Compendium

    USS Hornet Museum


    Last week we were on vacation at my sister's place in Walnut Creek, CA (a bit to the east of Oakland). Vacation consisted of a few day trips, some sitting by the pool, and a fair amount of work on the website. It's nice when your sister has wireless broadband.

    One day, we headed to Alameda (west of Oakland) to spend some time at the USS Hornet Museum. The Hornet is an Essex-class aircraft carrier, commissioned in 1943; she served in WWII, and then in Korea with a new flight deck. Hornet was the recovery ship for both Apollo 11 and Apollo 12.

    The Hornet was decommissioned and sold for scrap in 1993, but was saved at the last minute, and was donated by the Navy to a foundation. So, that's enough for the brief history - you can find more here.

    The Hornet is currently berthed in an out-of-the-way slip in what used to be Alameda Point Naval Base. Easy enough to find once you know where it is - if you need help, here's the Terraserver image.

    We arrived when the ship opened at 10AM, expecting to spend a couple of hours. We left when hunger overcame us at about 2:30 - there is so much to see. You enter into the hangar deck, which is about as big as you'd expect for a ship that is 900 some-odd feet long. There are a number of displays of Navy aircraft in the hangar deck, and also the Mobile Quarantine Facility from Apollo 14.

    Unlike a lot of historical places, the museum is actively working at getting as much of the ship open and refurbished as possible. We spent an hour on a tour of the areas underneath the hangar deck, including the surgery, the mess halls, the ready room, the brig, the cat room, and one of the engine rooms. Essex-class boats were steam-driven, and the surprising part of the engine is how compact it is. There are four boilers and four turbine pairs, with each pair having a high-pressure and a low-pressure turbine. The low-pressure turbine would fit inside a Microsoft office, and the high-pressure turbine was smaller still. Pretty impressive, given that each set of two turbines produces 37,000 horsepower. We also saw lots of other rooms which I've forgotten.

    After that tour, we went up to the flight deck on the escalator, which is not yet operational. The escalator was installed in the 1950s when the ready rooms were moved down a deck - apparently it was too taxing for the pilots to climb that far up with all of their gear. We toured the island as well - not as exciting as the underdeck tour, but still interesting. We finished by walking the flight deck from back to front after taking in the killer view of San Francisco from the stern of the ship.

    The docents who did the tour were all great.

    If you're around this area and like that sort of thing at all, you should definitely do this. The only military tour I've enjoyed more was the tour we got of the USS Santa Fe back in 1999 or 2000, but that's another story...

  • Eric Gunnerson's Compendium

    UI Updating can be hazardous to your performance...


    Dare writes about a perf and memory usage issue in RSS Bandit.

    I've come across this at least 10 times in my career - developers tend to underestimate the amount of time it takes to update a UI. I've seen cases where 70% of the cpu time is going towards updating the UI.

    There are two classical fixes for this:

    1. Updating every <n> items. This gives you 90% of the gain possible, and is very simple to implement.
    2. If the updates are chunky - ie sometimes you get 100 per second and sometimes you get 2 per second - a time-based approach works better. Don't make an update until at least a specific interval has passed since the last one. I generally use 250 milliseconds for the interval.
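
    A minimal sketch of the time-based approach - the class and method names here are my own invention, not from any framework:

```csharp
using System;

// Lets an update through only if enough time has passed since the
// last one we allowed; callers skip the expensive UI refresh when
// ShouldUpdate returns false.
class UpdateThrottle
{
    readonly TimeSpan interval;
    DateTime lastUpdate = DateTime.MinValue;

    public UpdateThrottle(TimeSpan interval)
    {
        this.interval = interval;
    }

    public bool ShouldUpdate(DateTime now)
    {
        if (now - lastUpdate < interval)
            return false;

        lastUpdate = now;
        return true;
    }
}
```

    In the item-processing loop you'd call ShouldUpdate(DateTime.Now) and refresh the UI only when it returns true - plus one unconditional refresh after the loop finishes, so the final items aren't left off the screen.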


  • Eric Gunnerson's Compendium

    Coelacanthe (George Hrab)


    I found out about George Hrab from an interview on the Skepticality podcast.

    George is an interesting fellow. He's the drummer for the Philadelphia Funk Authority. And he writes music and words for his own albums, which have a decidedly skeptical bent.

    Coelacanth fuses the lyrical talent of a group like Uncle Bonsai (or, perhaps Neil Peart...) with a skeptical viewpoint and a musical menu from funk to gospel, and throws in a bit of MC Hawking.

    Here's a good example ("some girls are like an SNL skit at a quarter to one...")

    Definitely recommended.

  • Eric Gunnerson's Compendium

    Review: Waltham (Waltham)


    Last week, I spent some time listening to the Rock and Roll Geek Show Indie Cast. One of the featured bands was Waltham, a band named after their hometown in Massachusetts. Michael, if I recall correctly, played two songs - Cheryl (come and take a ride), and Hopeless, which led me to the band's website to order their CD.

    I've always thought that a band gets named after a town by accident - somebody forgets to write "Effervescent Puppies" where it says "Band name", but does write "Boston" for their home town, the DJ misreads the sheet, and a rock-and-roll mega-group is born. You've got to think that there are groups out there named "Cleveland" or "Tampa" that are waiting patiently for their name choice to bear fruit...

    Waltham has a somewhat retro sound - if you listened to Rick Springfield when you were younger, you will find a lot of similarities here. That's a good thing in my book - they're doing something different than a lot of what's coming out these days, though there's at least a bit of Jimmy Eat World here.

    The songs range from good to better than good. Eschewing the traditional rock-and-roll troika of sex, drugs, and rock-and-roll, Waltham writes songs about women. In fact, with the exception of one song about... well, no, without exception, all the songs on their debut CD are about women, following the all-too-familiar themes of "I want the girl I can't have", "Talking to groupies is all part of being in a band", and "Yeah, I'm in a band. Hey, I'll write a song about you"...

    If you don't mind the somewhat repetitive lyrics, there's a lot to like in the sound and the harmonies. There are a few samples on the website linked above.

  • Eric Gunnerson's Compendium

    RSVP 2005 non-trip report


    I joined somewhere around 900 other riders on RSVP this past weekend, for my second time in as many years, and I'm now recuperating in sunny Walnut Creek, CA.

    I had intended to write a general ride report, something like I did last year. Though restricting myself to topics that people find interesting has never been one of my guidelines in writing blog entries - a fact that should be painfully obvious thus far - I've decided not to tell you that I had one flat, and consumed 132 oz of Blueberry Accelerade. Nor will I tell you my maximum heart rate (163), the total number of miles (around 195), or other minutiae.

    Instead, I'm not going to write that at all. In fact, I started doing a "trip report lite" (30% less boring), but just deleted 250 words of it.

    Instead, I'd like to talk about nicknames.

    The first - and arguably lamest - example really wasn't a nickname, but a description. "Yellow Jersey Woman" describes a woman who I pulled through one unexpectedly windy section of the first day, but who disappeared before we could learn her name. Yellow Jersey (not her real name) also rode with us the second day and was a nice addition to the group, holding up her end by riding in another yellow jersey, though there was an unconfirmed report of an early morning sighting of a white jersey. Tamara was a good addition to the group.

    The second was coming up with a nickname for Jeff, which was an undertaking of the utmost importance. I tried out "Georgie" a few times on Saturday morning. Jeff was at the front, doing his best imitation of George Hincapie, spending extended time at the front of the group, while I did my best imitation of a tour team leader - hanging back and not doing any work. "Georgie" stuck okay until we made a stop at a store near Lynden to get some hot food. Jeff stopped for some mac & cheese, decided to add a piece of chicken, and then walked out of the store with an entire roast chicken. So, "Chicken boy" was awarded, though I'm unsure if it will remain sticky over time. Jeff confounded the whole thing by wearing his "Sponge Bob" jersey the second day, which provided some unfortunate competition with "chicken boy" (hmm. Perhaps "Mr. Chicken" or even "Señor Pollo" would be better...), and was certainly a crowd favorite.

    There was no obvious choice for Gustavo on the first day. "Guy who can outride me pretty much anywhere" was a bit ungainly, and "somewhat inattentive son-in-law", while a fair description, lacked the necessary panache. Gustavo solved things the second day by showing up in white calf-height socks, and "sock boy" was awarded at the appropriate time. Through a rather bizarre juxtaposition of the addition of another Microsoft rider, a discussion of the deficiencies of nutrition bars, a 15-minute ferry wait, a felicitously positioned pickup-load of potatoes, and a lack of free time on Jeff's and my part to form a band using a specific name, he was temporarily awarded the appellation "groin potatoes", which is one of the least sticky nicknames I've heard of.

  • Eric Gunnerson's Compendium



    I haven't been paying much attention to Whidbey stuff in the past year or so, but today I came across something that I hadn't known about (or, perhaps, hadn't really understood).

    If you've written code in C#, it's likely that you've done some interop, and faced "the cleanup problem", where you have to write finalizers, implement IDisposable, etc. That's really not that bad once you get your head around it (assuming you can ignore some of the corner cases), but the underlying system has some issues with it, covered by Chris in this post. I was going to say "summarized", but that's not an appropriate description for what Chris does...

    Anyway, the short story is that for Whidbey, there's been a new abstraction added in the form of a SafeHandle, which is something like a C++ smart pointer, but with GC-based rather than scope-based semantics. It provides a stronger contract around being run in all situations, and makes things nicer for the GC because the finalizer is only in the SafeHandle-derived class, not in the object that holds it.

    To expand a bit:

    In the current way of doing things, if I have a class that has an IntPtr to hold an unmanaged resource and some other member variables, those other member variables can't be collected until I'm finalized (strictly speaking, until the next GC after I'm finalized). This is unfortunate, since, having survived the first GC due to needing finalization, these objects will be promoted to the next generation.

    With a SafeHandle instead, the SafeHandle needs to be finalized, but the rest of the object - and any objects they reference - can be reclaimed at the first GC. That's nice.

    Note that this does not remove the need for IDisposable - if I want to be sure that something is reclaimed, I need to do it myself, rather than waiting for a GC at some time in the future.

    So, what does that mean?

    Well, it means that if you're writing code for Whidbey, you should avoid IntPtr for handles, and use SafeHandle instead. There are several prebuilt classes that may work for you in the Microsoft.Win32.SafeHandles namespace.
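
    As a sketch of what that looks like in practice - my own example, not from Chris's post - a minimal SafeHandle-derived wrapper for a raw Win32 handle might look like this:

```csharp
using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

// Deriving from SafeHandleZeroOrMinusOneIsInvalid gets the
// "is this handle valid?" logic for free.
class SafeNativeHandle : SafeHandleZeroOrMinusOneIsInvalid
{
    private SafeNativeHandle() : base(true) { }

    // The runtime guarantees this runs, even in situations where an
    // ordinary finalizer might not - and it's the only finalizable
    // piece, so the class that holds the handle needs no finalizer.
    protected override bool ReleaseHandle()
    {
        return CloseHandle(handle);
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);
}

// The holding class just implements IDisposable; its other fields
// (like fileName here) can be reclaimed at the first GC after the
// object becomes unreachable.
class LogFile : IDisposable
{
    SafeNativeHandle handle;
    string fileName;

    public void Dispose()
    {
        if (handle != null)
            handle.Dispose();
    }
}
```

    The P/Invoke signature assumes a Windows handle closed with CloseHandle; for handles with other release functions, you'd call those in ReleaseHandle instead.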

  • Eric Gunnerson's Compendium

    ASP.NET 2.0, NUnit, and TDD


    I got my website set up last night and got a real ASP.NET page pulling data from a database and writing it out.

    I did the bulk of my web programming when web-servers were still steam powered (does the name "EMWAC" mean anything to you?), so my skills are mostly Perl-based, and I'm in search of a few opinions...

    A few questions:

    1) Do you usually build the website locally (using whatever the new desktop-only server is called) and then only copy it over when you're done, or do you work on the live site? (I think I know the answer to this one already...)

    2) What's the best approach for doing TDD on website-based programs? I took a quick look at NUnitAsp, but I think I'm more likely to write a set of classes that sit under a very thin presentation layer, and do unit testing on the classes instead. What has worked well for you?

    3) Any issues with using NUnit and Whidbey beta 2? I could never get it to run with Beta 1.

    4) For the database, should I run based on a local database, use the real database, or just talk to a mock database?

    5) Any other comments about tools/development approaches?
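
    For what it's worth, the approach in question 2 - testing classes under a thin presentation layer - might look something like this. ArticleStore is a hypothetical class invented for illustration, and the tests assume NUnit:

```csharp
using System.Collections.Generic;
using NUnit.Framework;

// A made-up stand-in for the kind of class that sits under a thin
// ASP.NET presentation layer; the page itself would just bind the
// store's contents to a control.
public class ArticleStore
{
    List<string> articles = new List<string>();

    public void Add(string title) { articles.Add(title); }
    public int Count { get { return articles.Count; } }
}

// The tests run against the class directly - no web server,
// no NUnitAsp, no browser automation needed.
[TestFixture]
public class ArticleStoreTests
{
    [Test]
    public void AddIncreasesCount()
    {
        ArticleStore store = new ArticleStore();
        store.Add("Zoo Two");
        Assert.AreEqual(1, store.Count);
    }
}
```

    The thinner the page's code-behind, the less there is that can only be tested through the browser.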

  • Eric Gunnerson's Compendium

    C# and databases...


    I've been "a database guy" for quite a while. Way back in 1989, I left the security and blandness of Boeing for a new project at Microrim (of R:Base fame), which was building a new database solution (by "solution", I mean the whole shebang - server, UI, connectors to other databases, UI abstraction to run both on (hold on) OS/2 and Mac).

    As part of that job, I wrote the Query By Example part of the database front end (what we called a "surface"). If you've ever used Access or SQL enterprise manager, you've used QBE - you just get a table to fill in, and the system generates the SQL query under the covers for you. It's an interesting problem - it's fairly easy to represent everything from a QBE table as SQL, but going the other way is more problematic - there are some queries that just don't represent themselves well, even before you get to degenerate cases and inner joins.

    Along the way I learned a fair bit about database design, normalization, and how to think like a database engine. This carried on to my later jobs, where I often chose a database as the best way to store information, and therefore I wrote a fair bit of data access code. As I moved into the object-oriented world, I wrote a fair number of what I'll call "class adapters" - classes whose purpose is to adapt between the database view of the world (rows and columns, with separate statements to query, update, or delete information) to the object-oriented world (classes with fields and methods to perform operations).

    These classes aren't hard to write, but they are a bit tedious. How many times can you write something like:

    m_name = (string) currentRow[NAME_COLUMN].Value;

    or deal with the fact that string values in insert statements are quoted, but integers are not, without getting tired of it?
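
    For concreteness, here's the sort of hand-written adapter I mean - the Person class and its columns are invented for illustration:

```csharp
using System;
using System.Data;

// A "class adapter": one class that maps a row in a Person table
// onto an object with fields and methods.
public class Person
{
    string m_name;
    int m_age;

    public Person(DataRow currentRow)
    {
        m_name = (string) currentRow["Name"];
        m_age = (int) currentRow["Age"];
    }

    // Note the quoting tedium: string values are quoted in the
    // generated SQL, but integers are not.
    public string InsertStatement
    {
        get
        {
            return String.Format(
                "INSERT INTO Person (Name, Age) VALUES ('{0}', {1})",
                m_name, m_age);
        }
    }
}
```

    Multiply that by every table and every column, and you can see where the tedium comes from.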

    When ADO.NET came out, I played around with it (not being a developer at the time, I didn't really use it). It does some neat things - strongly typed datasets, for example - but after using it for a few projects, I decided that it was a pretty big stick for the kind of stuff that I was doing. I didn't need disconnected datasets, I rarely (if ever) have written an editable grid, and I didn't like having to regenerate datasets every time I modified my schema. I also had a hard time keeping track of all the pieces - queries, adapters, connections, transmissions, valves, condensers, centrifugal clutches, ultrasonic cleaners, etc.

    So, I switched back to the tried and true:

    execute query
    while (result has rows)
        process rows

    style of programming, but I wasn't fully happy with it. There was still - to use the terms Anders uses - a big "impedance mismatch" between the programming language world and the database world. There's a lot of improvement to be had, if a solution to the problem can be found.
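
    In C# terms, that tried-and-true style is roughly the following - the connection string and query are placeholders, not real ones:

```csharp
using System;
using System.Data.SqlClient;

class PlainDataAccess
{
    static void Main()
    {
        // Placeholder connection string - fill in your own.
        using (SqlConnection connection = new SqlConnection("..."))
        {
            connection.Open();

            SqlCommand command =
                new SqlCommand("SELECT Name FROM People", connection);

            // execute query
            using (SqlDataReader reader = command.ExecuteReader())
            {
                // while (result has rows) / process rows
                while (reader.Read())
                {
                    string name = (string) reader["Name"];
                    Console.WriteLine(name);
                }
            }
        }
    }
}
```

    Simple, direct, and completely unaware of your object model - which is exactly where the mismatch shows up.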

    A couple of years ago, the C# language design team started talking about whether there was something that could be done at a language level, and I got to participate in that discussion for around 9 months. The coolest part of the plans is that...

    Well, I can't really tell you. Partly because I'm not the person to talk about the details, but mostly because I've been away from the language design team for nearly a year now, so I'm not sure where the discussions led.

    But I know who can tell you. The members of the team will be speaking about what may be in store for C# 3.0 at the PDC.

    If you can't make it to the PDC, I'll try to link to documents as they become available, or you could just wait for Luca to post them.

  • Eric Gunnerson's Compendium

    Power Collections for .NET released


    Peter Golde, noted C# language designer and original compiler dev lead, has released Power Collections for .NET.

    If you don't like the collections that come with C# 2.0, consider using these as a supplement.

  • Eric Gunnerson's Compendium

    Bugfixing mode and checkin email


    When we check in changes to our product, it's traditional to send check-in email to the people that are impacted (and, if the emails I get are any indication, a dozen people pulled randomly from the Microsoft address book as well).

    These emails usually say something like:

    Fixed bug #11881 - added support for negative anti-pressure to the gravitational support code.

    or sometimes just:


    Where the numbers are bug numbers in our bug tracking system (named Product Studio but often referred to as Raid, after its predecessor). A seemingly irrelevant piece of info that should become slightly more relevant in the near future.

    Back in April, I'd just listened my way through the Hitchhiker's Guide radio shows, and decided to write my checkin email in the form of a radio drama.

    Which I would have shared with you, except for the fact that I'm not allowed to talk about what I've been doing (yet...).

    In the weeks that followed, I did a scene from a bad sci-fi movie, a Led Zeppelin takeoff ("Fixed a whole lotta bugs"), a bit of Dickens, and ...

    Well, you get the idea.

    Some of them pretty funny, some of them not really that funny, but none of them fit to share.

    This morning, however, I wrote one that I think I can share with you (or, at least that part without the bug details), so here it is:

    Checkin mail: Bug Fixes

    It was a hot day, hotter than I was used to.
    The first two hours had gone well. 800 feet of climbing up Lakemont, down the south side, and then around the south end of the lake. The sun was shining, the wind was fresh off the lake, and the "beep beep" of forklifts announced to all around that Seafair was near.
    It all started to deteriorate near the U. The heat, the pavé that passes for pavement on the Burke, and the earliness of my start all weighed on my weary quads. Juanita hill passed slowly, made tolerable only by an encounter with a fellow sufferer near the top. Market street further lowered my spirits.
    And there remained only one more hill. I took it slowly and steadily, finding a rhythm that worked well. I was halfway up when the wispy thread of a thought burst forth, shouting, "Hey moron, you need some salt. Find a place to get some".
    After half a bag of beef jerky, the snack food with the highest sodium/weight ratio, I continued on my way, arriving home just under the four hour mark. As I dragged myself into the kitchen, my wife asked, "Good ride?"
    "No", I said. "But I did resolve 16 bugs in Product Studio".
    There are those who think I watch too much TV...