May, 2004

Posts
  • Eric Gunnerson's Compendium

    Nullable types in C#

    • 107 Comments

    Nullable Types in C#

    One of the "late breaking" features in C# 2.0 is what is known as "Nullable Types". The details can be found in the C# 2.0 language spec.

    Nullable types address the scenario where you want to be able to have a primitive type with a null (or unknown) value. This is common in database scenarios, but is also useful in other situations.

    In the past, there were several ways of doing this:

    • A boxed value type. This is not strongly-typed at compile-time, and involves doing a heap allocation for every value.
    • A class wrapper for the value type. This is strongly-typed, but still involves a heap allocation, and you have to write the wrapper.
    • A struct wrapper that supports the concept of nullability. This is a good solution, but you have to write it yourself.

    To make this easier, in VS 2005, we're introducing a new type named "Nullable", that looks something like this (it's actually more complex than this, but I want to keep the example simple):

    struct Nullable<T>
    {
        public bool HasValue;
        public T Value;
    }

    You can use this struct directly, but we've also added some shortcut syntax to make the resulting code much cleaner. The first is the introduction of a new syntax for declaring a nullable type. Rather than typing:

    Nullable<int> x = new Nullable<int>(125);

    I can write:

    int? x = 125;

    which is much simpler. Similarly, rather than needing to write a null test as:

    if (x.HasValue) {...}

    you can use a familiar comparison to null:

    if (x != null) {...}

    Finally, we have support to make writing expressions easier. If I wanted to add two nullable ints together and preserve null values, if I didn't have language support, I would need to write:

    Nullable<int> x = new Nullable<int>(125);
    Nullable<int> y = new Nullable<int>(33);
    Nullable<int> z = (x.HasValue && y.HasValue) ?
        new Nullable<int>(x.Value + y.Value) : new Nullable<int>();

    At least I think that's what I'd have to write - it's complex enough that I'm not sure this code works. This is ugly enough that it makes using Nullable without compiler support a whole lot of work. With the compiler support, you write:

    int? x = 125;
    int? y = 33;
    int? z = x + y;
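    The lifted operators the compiler generates also propagate null for you, so no explicit HasValue checks are needed. A minimal sketch of that behavior (the class name is mine, just for the demo):

```csharp
using System;

class NullPropagationDemo
{
    static void Main()
    {
        int? x = 125;
        int? y = null;

        // The lifted + operator yields null if either operand is null.
        int? z = x + y;

        Console.WriteLine(z.HasValue);   // False - the null propagated
        Console.WriteLine(x + 33);       // 158 - both operands non-null
    }
}
```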
    
  • Eric Gunnerson's Compendium

    No longer Large, and rarely in charge...

    • 44 Comments

    For those of you who have never met me, I'm approximately 6'2” tall, and currently weigh about 168 pounds. For all of my (reasonably) adult life, I've been a “Large”. If I needed a T-shirt, or a coat, or a sweatshirt, I was always a Large.

    My consistent largeness provided comfort as I navigated the often turbulent waters of life.

    But recently, things have changed. I'm sure I'm not telling you anything new - unless you've been living in a cave for the past 10 years (presumably a cave with a nice 100 MBit network, a cable modem, and a kickin' gaming system) - but America has been getting fatter. Not the country, which, at my last measurement, was just a bit under 1000 leagues wide, but the people.

    For a while, this was no issue, but clothing manufacturers have started to adjust their sizes upwards, which means that Large isn't the Large that I have grown to love. But, of course, all of them haven't done it, so now I sometimes end up with a large, and sometimes with a medium. Not that bad if you can try them on ahead of time, but for TechEd I got two Large shirts - you have to order large because if medium is too small, you're SOL - and I swim in them a bit. I guess I could wear 10 or 12 T-shirts underneath to fill it out a bit, but that hardly seems practical.

    Argh.

  • Eric Gunnerson's Compendium

    More on virtual by default

    • 37 Comments

    Gary and James took issue with my position, so I'd like to expand a bit on what I said.

    Gary wrote:

    Eric suggests that final should be used in most cases with virtual reserved for the special occasions. I totally disagree, you can't lay down the law as to how another developer is going to extend your class, (other than in the special circumstances I mention here). If the choice is being flexible versus being static, then in the interests of good software engineering you *must* go with the flexible option. Or is that just the Smalltalk programmer in me coming out :)

    My answer to that question would be “yes”. I'm not being flippant on that, and I'm about as far from being a Smalltalk programmer as you can be, but I think it comes down to the tradeoff between robustness and flexibility. If you allow more extensibility, you are being more flexible, and also being less robust (or, to be precise, being potentially less robust), especially if you aren't testing the extensibility.

    James wrote:

    The long and short of it - Eric thinks that library designers know everything and that one of the primary jobs is to protect those dumb application developers. There's just no telling what they might do if we let them.

    James has it backwards. I don't think that library designers know everything. I think that they know very little about how users might use their classes, and therefore shouldn't be making promises their code can't keep. If they want to support extensibility, then that should be something they design well and test for. If they aren't going to go to that effort, than I don't think providing extensibility that will “probably work” is doing their users a service, and it may be actively doing them a disservice. Not only will such code be more likely to fail, but the constraint of such virtuals limits how the class can evolve in the future. It's bad when you'd like to make a small change that would help 99.9% of your users, but you can't because somebody might have existing code that depends on your behavior. But maybe that's just the compiler guy in me...

    I do think that this depends on the kind of library you're building. In general, the more closely coupled the API writer and user are, the less of an issue this is.

  • Eric Gunnerson's Compendium

    C# Community Review

    • 30 Comments

    One of my responsibilities is overseeing the C# team's involvement in community, where community is anything we do that has direct customer touch. Everything from working with MVPs to design reviews on new features to the C# Dev Center. Duncan, Dan, and I collaborate on the whole community effort.

    We now have a process where we do quarterly reviews of our community process, and we had our first quarterly review, which consisted of all the C# PM team and the first and second level managers of the three of us. Getting the review ready consumed a considerable amount of time in the past few days, but except for the fact that we only got through about half the slides in two hours, the review went quite well.

    Which brings me to the point of this post.

    If a friend came up to you and said, “I'm thinking of using C#, but I'm concerned that there isn't a good community around it”, what would you say? What are the good things about the C# community? What are the bad things? If you wanted us to change one thing, what would it be?

    I'll summarize and post so that you don't have to read the comments.

  • Eric Gunnerson's Compendium

    Virtual by default or not?

    • 29 Comments

    I've been skimming through “Hardcore Java”, and I came across a section on the use of 'final' in Java. In it, one of Simmons' comments (okay, I'm not sure it's his comment because he's listed as editor and not author, but that sounds better than “whoever wrote this section”) is (and I'm paraphrasing here):

    Don't use final on methods unless you're sure you want them to be final

    He then goes on to say that this is because you never know who might want to extend your class.

    There are really two viewpoints on this issue. They are:

    “Make it virtual in case somebody wants to extend it”

    and

    “Don't make it virtual unless you're sure somebody wants to extend it”

    I'd like to expand on the first viewpoint, which I've sometimes labelled as “speculative virtuality”. I don't like it.

    My reasons have everything to do with predictability and robustness. If you can't conceive of a user extending your class in a specific way but still choose to provide such extensibility, then it's pretty clear that you don't have an extension scenario in mind, which means that neither your code nor your tests are likely to ensure that it does work - especially across versions. Sure, it's possible that it may work in the current version, and may even continue to work in future versions, but I wouldn't call it a supported scenario. I'd prefer to use classes where I know what I'm doing is supported.

    The second issue is around understandability. If I walk up to a class (in the metaphorical sense, of course, you can't really “walk up” to a class) and it has 29 virtual methods, it's hard for me to tell whether the author *intended* me to extend the class through a certain method or set of methods, or whether they just left them virtual “just in case”. Whereas if a class only has one (or a small number) of virtual methods, that's a good indication that the designer wants me to use them (i.e. they're part of a supported scenario).
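    To make the contrast concrete, here's a hypothetical sketch (the class and member names are mine, not from any real library) of a design with one deliberate, tested extension point, and everything else non-virtual by default:

```csharp
// Hypothetical example: a single documented virtual method signals
// the supported extension scenario.
public class ReportFormatter
{
    // Non-virtual: the overall algorithm is fixed and tested.
    public string[] FormatAll(string[] lines)
    {
        string[] result = new string[lines.Length];
        for (int i = 0; i < lines.Length; i++)
            result[i] = FormatLine(lines[i]);
        return result;
    }

    // Virtual: the one supported customization - derived classes
    // may change how a single line is formatted.
    protected virtual string FormatLine(string line)
    {
        return line;
    }
}

// A user can tell at a glance that FormatLine is the intended
// extension point.
public class UpperCaseFormatter : ReportFormatter
{
    protected override string FormatLine(string line)
    {
        return line.ToUpper();
    }
}
```

    This is essentially the Template Method shape: the single virtual is a promise the author can actually test and keep across versions.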

  • Eric Gunnerson's Compendium

    Best C# Bloggers

    • 28 Comments

    Here is a summary of the community-recommended bloggers.

  • Eric Gunnerson's Compendium

    On probation

    • 25 Comments

    I got an email today from the owner of all the MSDN columns, telling me that if I wasn't able to produce a column every other month, my column would be put on probation, and then cancelled.

    I'm frankly surprised it took this long - the whole essence of a column or any other periodical is that it is just that - periodical. The combination of me writing a blog and spending a lot of time doing PM stuff has meant that my column has been neglected. June, September, and February does not a periodical column make.

    I'm looking for some comments on how to spend my time. If I decided to try to keep the column alive, it would definitely take away from the time that I spend blogging. If I let the column go, I still have the opportunity to write articles for MSDN, but I wouldn't have to do it on a schedule.

    Comments?

  • Eric Gunnerson's Compendium

    Enums and validation

    • 25 Comments

    We've been talking about enums a bit, in one of those cases where I'm looking into something and I then get a request about that same thing from some other source. Or multiple sources, as it happened this time.

    Yesterday, I was working on my TechEd presentation (DEV320 - Visual C# Best Practices: What's Wrong With this Code?), and one of my examples deals with enums. The following is a spoiler, so if you come to my talk, you'll need to sit on your hands and look smug during that part of the talk.

    First Question:

    Consider the following code:

    enum Operation
    {
        Fold,
        Spindle,
    }

    class BigExpensiveMachine
    {
        public void Initiate(Operation operation)
        {
            // code here
        }
    }

    What are the possible values of operation at the “code here” comment?

    A: The number of possible values is not defined by what you write in the enum, it's defined by the underlying type of the enum, which defaults to int. Which means somebody could write:

    bem.Initiate((Operation) 88383);

    So, my first point is that enums are not sets - they can take on any value from the underlying type. If you know this, you might already be writing code like:

    class BigExpensiveMachine
    {
        public void Initiate(Operation operation)
        {
        if (!Enum.IsDefined(typeof(Operation), operation))
                return;
            // code here
        }
    }

    Q: Does this code solve the problem?

     

    A: It solves some of the problem, but there is still a latent issue that can be fairly bad. It is probably fine when first written, but it has no protection for when this happens:

    enum Operation
    {
        Fold,
        Spindle,
        Mutilate,
    }

    When I change the enum and recompile, my Initiate() method now accepts “Mutilate” as a valid value, even though it wasn't defined when I wrote the routine. Whether this is a problem or not depends on what's in the routine, but it could lead to weird behavior or a security issue. IsDefined() is also fairly slow - I took a look at the code, and there's a lot of it.

    So, where does that leave us?

    When most people ask to be able to validate an enum, what they really want is the ability to limit the enum values to those that they had in mind when they wrote the routine. Or, to put it another way, to support additional values should require a deliberate act on the part of the programmer rather than happening by default.

    To get this behavior requires hand-coding the value check as part of the routine. It could either be a separate check:

    if (operation != Operation.Fold && operation != Operation.Spindle)
        return;    // or an exception as seems prudent

    or as part of the logic of the routine:

    switch (operation)
    {
        case Operation.Fold:
            ...
            break;

        case Operation.Spindle:
            ...
            break;

        default:
            return;
    }

    Either of those constructs ensures that I'm dealing with the world I understand inside the routine. Which is a good thing.
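    The whole argument can be seen in one runnable sketch (the enum mirrors the one from the post; the harness class and method names are mine):

```csharp
using System;

enum Operation
{
    Fold,
    Spindle,
}

class EnumValidationDemo
{
    // The explicit-check style from the post: only the values the
    // author had in mind when writing the routine are accepted.
    // Adding Mutilate to the enum later would NOT silently widen this.
    public static bool IsSupported(Operation operation)
    {
        return operation == Operation.Fold || operation == Operation.Spindle;
    }

    static void Main()
    {
        // The cast always succeeds - an enum variable can hold any
        // value of its underlying type (int by default).
        Operation bogus = (Operation)88383;

        Console.WriteLine(Enum.IsDefined(typeof(Operation), bogus)); // False
        Console.WriteLine(IsSupported(bogus));                       // False
        Console.WriteLine(IsSupported(Operation.Fold));              // True
    }
}
```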

  • Eric Gunnerson's Compendium

    Updated C# V2.0 Specifications now available

    • 21 Comments

    After a lot of work on Anders' part, we now have an updated version of the C# 2.0 Language Specification on the website.

    Not only does it have updated information on the “big 4” - Generics, iterators, anonymous methods, and partial classes, but it also has new information on a few other areas.

    The biggest one is nullable types, which are a new way of creating nullable versions of value types, with specialized language syntax to support using these types. There are also a few “miscellaneous” ones, the most awaited surely being different accessibility on property getters and setters.

  • Eric Gunnerson's Compendium

    Hackers and Painters by Paul Graham

    • 17 Comments

    On Wednesday night, I spent some time at the O'Reilly reception. In the SWAG bags that they gave us was a copy of the book, Hackers and Painters by Paul Graham.

    The O'Reilly website has this to say about the book:

    "Hackers & Painters: Big Ideas from the Computer Age, by Paul Graham, explains this world and the motivations of the people who occupy it. In clear, thoughtful prose that draws on illuminating historical examples, Graham takes readers on an unflinching exploration into what he calls "an intellectual Wild West."

    The ideas discussed in this book will have a powerful and lasting impact on how we think, how we work, how we develop technology, and how we live. Topics include the importance of beauty in software design, how to make wealth, heresy and free speech, the programming language renaissance, the open-source movement, digital design, Internet startups, and more. "

    I wanted to like this book, but after reading about half of it, I have mixed feelings. I think some of Graham's observations are interesting, though some aren't new (the fact that some programmers are wildly more productive than others will not surprise many developers). But in others I think he makes assertions that aren't well supported.

    In one of the early chapters (I think it's chapter 1, but I don't have the book here right now because it's in my luggage at the hotel), he makes the assertion (and I'm simplifying a ton here) that since children in medieval times started apprenticeships when they were in their early teens and we don't have record of them having the same sort of problems today's teenagers have (which I'll label as "teenage angst", though that's not exactly what he's talking about), then the behavior we see today must be societal and environmental in nature.

    The problem with this argument is that there is some very good research that says that the brains of teenagers are not fully developed until around the age of 20 (see this from NIH, or Bradley and Geidd's excellent book "Yes, your teen is crazy!"), which means that the assertion that there's some biological basis for teenage behavior has some good support.

    Given that this book is a collection of essays, I don't expect the same level of research I would in a full book devoted to the topic, but it's unfortunate to have this sort of oversight.

    Another example is in one of the chapters on wealth, which I think are pretty good overall. Graham's assertion here is that before the industrial revolution, there has been wealth transference, but not a lot of wealth creation, and that wealth has been accumulated through theft (either directly or via taxation). I think this is basically true. His second assertion is that this changes with the application of technology. I agree that technology has brought new wealth (if you measure wealth by standard of living).

    He then uses this as a basis that inequities between rich and poor are not a bad thing, because the creation of wealth is a good thing. But I think he ignores the fact that the old methods of wealth accumulation are still alive and well - the rich are (not surprisingly) interested in staying rich, and are willing to use their influence to make this happen. Given that, I don't think one can make the "Greed Is Good" argument without at least some qualification.

     

     

  • Eric Gunnerson's Compendium

    Best practices talk

    • 15 Comments

    I gave my TechEd talk at 5PM today, entitled “C# Best Practices - What's wrong with this code?”. Rather than take a more lecture-based approach, I tried something more interactive. It was an interesting, if not fully successful, talk. If you attended, please leave me some comments - what did you like, what didn't you like, etc. If you didn't attend, you should see slides on the dev center *eventually*.

    Good:

    • The warmup went well, with laughs where I expected to get laughs, and many of them better than I expected.
    • I got to jump off the stage and go out into the audience, something I also like to do.
    • I got lots of questions afterwards, which is always a good sign.

    Bad:

    • I didn't get as many answers from people as I expected. That might be because the material is too hard, or just not accessible in the time I spent on it.
    • I missed points on a few of my explanations.
    • I had one glaring error in an explanation that was brought to my attention later, which I should have caught. The guidance is still correct, but I hate those kinds of errors.
    • The snippets part was fairly dry, and could have been more interesting.

    Overall, I think it went fairly well, but I haven't talked to anyone in my group, so I don't know what they think.

    Oh, and the worst part was that I had to share the stage with two rack mounted servers that made it really hard for me to hear anything.

    Afterwards, one of the room supervisors told me that the talk was SRO, and that it seemed to be a “younger crowd” than many of the other talks. I'm not sure what to make of that.

    Overall, I'm not sure how happy I am yet.

  • Eric Gunnerson's Compendium

    App Building

    • 15 Comments

    The C# PM team is spending Wednesday-Friday of this week on app building, where we spend time using the product to write real apps. We book a conference room in a building away from our main building, put OOF messages on our email, and then spend our time programming (and reporting the bugs we find). I'm running the build that we're going to be giving out at TechEd at the end of the month, and I'm working on my GPS tracking project. I'll need to back-port it to Everett when I'm done, since the project is destined to be featured in a C# column.

    Today I wrote a Windows Forms control that does graphing. The cool part is that it supports multi-level zooming in on the data. Tomorrow I'll hook it up with my existing app, which uses MapPoint, and then go from there.

    This is the first time I've used a fairly recent Whidbey build, and I'm pretty happy with the way it feels. The refactoring works well (when I remember to use it), and Intellisense is bothering me less (in early Whidbey versions it was a little too aggressive about coming up).

    Finally, the new debugging value inspection is great. I did a fair amount of debugging today, and I didn't have to go to the watch or locals window the whole time.

     

  • Eric Gunnerson's Compendium

    App Building day 3 - data smoothing...

    • 14 Comments

    I worked some more on my GPS app, and I'm looking for some advice from somebody with more experience in dealing with chunky data than I do.

    One of the problems that I have with the GPS data that I get is that it's noisy. First of all, the altitude data for GPS isn't as good as the location data - it acts as if there's a constant bit of noise added to it. The second problem is that the receiver can't always maintain a sync, and if it doesn't, it ends up with a straight-line projection of the last data, and then abruptly re-syncs when it gets a good fix again.

    That means I end up with discontinuities in the altitude data, which messes things up when I try to figure out the gradient of the data, and makes the plot look pretty ugly.

    What I need is a good way to smooth over those sections of bad data, and I'm open for ideas.

    I did realize as I was writing this that I may be able to use the quality of data information that the GPS sends me to decide what data is bad.
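    One common option for this kind of spiky data - offered here purely as a suggestion, not something from the post - is a sliding-window median filter, which knocks down short spikes without blurring real trends as much as a plain moving average does:

```csharp
using System;

class AltitudeSmoother
{
    // Sliding-window median filter: replaces each sample with the
    // median of a window centered on it. The method name and the
    // window size are illustrative choices, not anything canonical.
    public static double[] MedianFilter(double[] samples, int halfWindow)
    {
        double[] result = new double[samples.Length];
        for (int i = 0; i < samples.Length; i++)
        {
            // Clamp the window at the ends of the data.
            int start = Math.Max(0, i - halfWindow);
            int end = Math.Min(samples.Length - 1, i + halfWindow);

            double[] window = new double[end - start + 1];
            Array.Copy(samples, start, window, 0, window.Length);
            Array.Sort(window);
            result[i] = window[window.Length / 2];
        }
        return result;
    }
}
```

    For the sync-loss discontinuities, the fix-quality information the receiver reports could be used to drop the bad spans entirely and interpolate across them, rather than trying to smooth through them.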

  • Eric Gunnerson's Compendium

    Still more on virtual by default...

    • 12 Comments

    In a bid to keep my blog hits high, I've decided to revisit this again.

    The comments (I read all comments, though I don't respond to all of them) have been fairly split.

    There is one group who agrees with me, and another group that is opposed to my perspective. I'd like to address those who disagree with me, as there are a few points I didn't cover last time.

    I do want to state up front that this is an area in which there is room for honest disagreement.

    I think the point that I didn't make very well earlier was that one of the reasons we're somewhat cautious - okay, perhaps “paranoid” is too strong a term, but it has the right “feel” - is that if we had a virtual method on a class but didn't support it, there's a likelihood that it will break in the future. (How likely that is is an interesting side discussion - some may argue that it is likely, others may argue that it's not likely at all.)

    If that break happens, I don't expect the customer response to be, “Oh, it broke, so I guess I shouldn't do that“. I expect it to be more along the lines of “Why didn't Microsoft build this thing correctly in the first place?“, which is why we work very hard to avoid that situation.

    There's one other point I'd like to raise. When you look at a class and there's a method that isn't virtual, you rarely have any idea why it isn't virtual. It could be that the designer just didn't want to support that as a virtual. But it could also be that making it virtual would raise a security issue, or cause your cat's fur to fall out.

  • Eric Gunnerson's Compendium

    TechEd and Rio

    • 11 Comments

    For TechEd attendees, you can request meetings with Microsoft people through the Rio system. I went to look at my schedule, and found out that either nobody had requested a meeting with me, or I couldn't get to my meeting requests.

    Have any of you tried using Rio to do this? Have you been successful?

  • Eric Gunnerson's Compendium

    Join the fridge-o-lution

    • 10 Comments

    Last night, I attended a TechEd bloggers party, held in a pretty nice suite on the 40th floor of the Hyatt. (Not the 4th floor, as many of us had assumed. I realized later that this is because I'm used to the Microsoft buildings that I work in, where the first digit is the floor number. It didn't help that the party was in the Elizabeth Suite, and there's an Elizabeth ballroom on the 4th floor).

    So, there we were, a bunch of bloggers, talking about the sublime expressiveness of Cezanne's painting of Mont Sainte-Victoire...

    and then somehow we got onto the topic of technology.

    There were - as there always are in such a gathering - a few TiVo fans, and we got into discussing the "wired house", and the utility of the internet refrigerator:

    The problem with the internet refrigerator is that it doesn't go far enough. The first obvious thing to add is a way to keep track of the inventory. That can be easily added through a bar code scanner and a few weight sensors, and then your fridge can make sure that you're always stocked with a healthy supply of pickle slices, Miller Lite, and non-dairy whipped topping.

    Do a little more, and you can have a fridge that won't let you eat that last slice of pizza, but will let you make yourself a nice healthy salad (sans dressing, of course).

    All those are great features, of course, but the real benefit comes from solving the “what's for dinner?” problem. By tying in TiVo's “suggestion” feature, your fridge can notice that since you bought a frozen pizza, you might also like other tomato-sauce-based products, and presto, when you come home, you not only can have the pizza, but you can also have a serving of lasagna, or a nice tomato soup.

    But it gets better. Just picture this...

    You're at home on a Friday night with a date, cuddling on the couch and watching the latest episode of Emeril on the Food Network. Just as Emeril finishes off making Emeril's Taco Salad with a Roasted Poblano Buttermilk Dressing and your date leans closer, you hear your fridge say, “You could make that right now!”

    Women dig talking appliances.

  • Eric Gunnerson's Compendium

    Books on my desk

    • 8 Comments

    I have a few books that have shown up on my desk recently.

    The first is “Test-Driven Development in Microsoft .NET”, by James Newkirk and Alexei Vorontsov. They started writing the book while working at ThoughtWorks, though Jim is now working at Microsoft in the Patterns & Practices group. I reviewed early drafts of the book, and Jim was nice enough to drop off a signed copy a week ago. I confess that I haven't opened it again - one of the disadvantages of reviewing books is that you get the feeling that you already know what the book says, despite the fact that the book has undoubtedly changed since you read it. I like the practical approach they've taken in this book, and I think Jim (I don't know Alexei) is largely immune to the more fervent side of the XP movement.

    The second book just showed up today - it's “TCP/IP Sockets In C#”. I'm embarrassed to admit that while I remember helping out on this book, I don't remember exactly what I did, but the authors were nice enough to send me a copy. I have a feeling I'll be delving into this a bit deeper since all the socket code I've written has been very cookbook-ish.

    The third book is Hardcore Java, which may seem like a strange thing for me to be reading, but I wanted to understand some of the best practices around Java so I can better relate them to what we do in C#. I'm reasonably sure that I ordered this book, though I confess I was a bit surprised when I pulled it out of the shipping cardboard.

    Oh, and to round things out, I'm also re-reading “Flight: My life in Mission Control“, Chris Kraft's excellent biography about the flight control side of Mercury, Gemini, and Apollo. I think I like it slightly better than “Failure is not an option” by Gene Kranz.

  • Eric Gunnerson's Compendium

    TechEd from a Java Perspective...

    • 8 Comments

    I've been reading a series of posts by N. Alex Rupp on his experiences at TechEd, and his thoughts on the Microsoft and Java communities.

    I'm especially interested in this because I'm going to JavaOne at the end of the month...

  • Eric Gunnerson's Compendium

    TechEd and tempting the slide gods...

    • 6 Comments

    About 4 or 5 weeks ago, the C# team was working on our TechEd presentations. There's an organized (ish) process to get all the speakers to get their slides done in a timely manner, but we decided to add a bit more rigor on top of that process, so that we wouldn't be “behind the 8 ball” and could avoid “staring down the rhino” on our way to “communicating with the masses”.

    One of the general checkpoints was to make sure our draft slides were turned in so our track owner (Brian) could pass them on to our technical reviewer (Dan), so he could review them. But in this case, Dan had already signed off that our process was fine with him and he would be getting slides earlier, so turning in our drafts was largely a pointless exercise.

    I joked that I was going to turn in a presentation with 28 slides, all of them having pictures of kittens on them. I then thought better of it, saying that if I did that, I would show up at TechEd for my talk, do my introduction, hit the space bar, then hit it again, only to be confronted with...

    Kittens...

    I should have known better than to tempt the demo gods.

    The first problem was when I got an email on 4/12 that said, "Thanks for getting your final slides done early". Which was interesting, since I hadn't gotten my slides done early. My kitten comment was coming back to haunt me. Or, to be more correct, it was coming back to haunt the whole team - Anders' slides were okay, but all the others were the draft ones we had put out weeks earlier. Which means that if I had sent in slides with kittens, they would have been my final slides.

    A few emails later, I thought I had the problem resolved, but when I showed up at the convention center yesterday morning, I checked in with the slide team and there were no slides for my talk. I took them a new copy of them yesterday, but I'm showing up to the talk with my laptop and with my slides on a USB memory stick.

    And I've been trying not to think about...

  • Eric Gunnerson's Compendium

    Canis Novus

    • 6 Comments

    After fighting a spirited but doomed-to-fail delaying action for several years, the two caniphiles in the family finally overcame my defenses on Saturday, and there's a new addition to the household.

    He's part Australian Cattle dog, and part unspecified (some think Staffordshire terrier), and he looks remarkably like the picture below.

    And today, he's at the vet to get “tutored” (there's an old Gary Larson “Far Side” cartoon where one dog is saying to the other, “Guess What! I get to go to the vet today to be tutored!”)

  • Eric Gunnerson's Compendium

    Flyin'

    • 5 Comments

    My long-term readers know that I'm in training for a couple of bicycle rides later this summer (a century in June, and a two-day double century in August). The rest of you haven't had the joy of reading endless descriptions of my cycling exploits, but I'm sure you'll get a chance to do this in the future.

    Anyway, I've been slowly upping my mileage to about 90 miles/week, including a 45-ish mile trip around the top half of Lake Washington, including a ride across Lacey V. Murrow memorial floating bridge (otherwise known as the I-90 bridge). That in itself is an experience, riding across a lake 20 or 30 feet off the water, both for the view and for the risk, given that a big chunk of the bridge sank in the early 1990s due to an impressive display of cooperative incompetence between the state and the renovation contractor. The route around the Lake is really nice on a sunny day, but involves a lot of starting and stopping north of the University of Washington on the Burke-Gilman trail.

    Last Saturday, I decided to vary my route. After riding about 25 miles and feeling good, I decided to ride on the east side of Lake Sammamish (lots of lakes around Seattle), which I hadn't done before. I had one of those rare moments when everything seems to be working correctly, and I rode for about 15 minutes at 21-23 MPH (about 4-5 MPH faster than normal), and then another 15 minutes flying (for me) up a 2.5-mile hill.

    Overall, I did 48.6 miles in just under 3 hours, averaging a little over 16.2 miles per hour.

    Which makes me very slow compared to more serious cyclists, but much faster than I was before.

  • Eric Gunnerson's Compendium

    TechEd and the C# team

    • 4 Comments

    The deadline for the C# team to have our TechEd slides done is this Friday, so we've been working hard on our slides. My talk (DEV320 - C# Best Practices: What's wrong with this code?) has been taking a lot of time, but I think it's finally shaping up, delta one problem.

    The problem relates to the kind of talk that I'm doing. I like to do lots of walkthroughs of my talks so that I feel really comfortable with the content and timing. I can usually hit my time within a couple of minutes.

    But this talk has a large customer-interaction part, and that's something that I can't practice and can't time. That makes me a little bit nervous and fairly excited, as I really like the challenge of doing a good job on a customer interaction talk.

    I've also been devoting some time to reviewing the talks by other C# team members.

    While we're at TechEd, we plan on spending lots of time with customers. There's a scheduling system named “Rio” through which attendees can make requests to talk to MS people about specific subjects. If you want to talk to somebody from the C# team, you can schedule them directly (see below for who owns what), or if you're not sure, drop me a message, and I'll try to point you in the right direction.

    Here's the full set of C# team activities while we're there.

    Mon 5/24 (5:00PM - 6:15PM) - C# Best Practices - What's wrong with this code - Eric (Room 6A)
    Mon 5/24 (6:00PM - 7:00PM) - Ask the Experts - the whole C# team will be there
    Tue 5/25 (6:00PM - 8:00PM) - San Diego .NET User Group - Joe, Anson
    Wed 5/26 (10:15AM - 11:30AM) - Visual C# 2005: Language Enhancements - Anders (Room 31ABC)
    Wed 5/26 (2:00PM - 3:15PM) - Cabana Session with Anders (Cabana 05)
    Thu 5/27 (10:15AM - 11:30AM) - Visual C# 2005: IDE Enhancements - Joe, Anson
    Thu 5/27 (3:15PM - 4:30PM) - Visual Studio - Best Practices for Debugging - Scott, Habib
    Fri 5/28 (10:45AM - 12:00PM) - Visual Studio 2005: New enhancements for debugging - Scott, Habib
    Other talks

  • Eric Gunnerson's Compendium

    App Building - Day 2

    • 3 Comments

    Today was the second day of app building. But first, a few comments on the comments on my last post.

    One set of comments is around what I meant by “real apps”. It's true that when I say “real apps”, I don't mean the sort of apps that our real customers write. And that's one of the problems that we face - we can't simulate what you do every day with our product.

    We do attempt, however, to try out a variety of different features in ways that we think you'll be using them, so we can figure out how the product is shaping up overall.

    The app building is just on the PM side. We have devs that use C# every day (lots of .NET is built with C#), but they don't try to take the broader approach that we do.

    On the intellisense question, I wasn't talking about the March version, but versions earlier than that.

    *****

    I spent the morning fighting with MapPoint to try to get it to run faster when adding lines. It's painfully slow. I explored writing the lines myself as an overlay, which I did get to work, but not so that events could get underneath.

    This afternoon I gave up on that approach, decided to go with a single polyline, and got the app essentially working. It's really fairly cool right now, but needs some more work.

    My IDE experience was pretty good today - no crashes.

  • Eric Gunnerson's Compendium

    See Cyrus' head explode...

    • 0 Comments
    Cyrus, one of the devs on the editor/intellisense/refactoring team (we call it the IDE team, which is the totally wrong name to use with customers), got his blog set up last Friday, and has already written 29 posts.