August, 2003

Posts
  • Eric Gunnerson's Compendium

    Columbia Accident Report Released

    • 0 Comments

    Yesterday, the Columbia Accident Investigation Report was released. In 248 pages, it covers both the accident and its causes in detail, and presents a fascinating bit of forensic science. The investigation that the board and NASA did into the accident is top-notch.

    Unfortunately, it appears that NASA didn't learn from Challenger, as there are eerie parallels between the two accidents. Both stemmed from issues that had been observed for years and were gradually downgraded from "critical safety issue" to "known risk", apparently on the theory that since there hadn't been any serious problems yet, there wouldn't be any in the future.

    As with the O-rings in the SRBs, there was a long history of continual damage to the orbiter due to shedding of the foam. Page 127 has a very telling report - none of the flights were free of damage from the foam (the chart covers lower-surface dings > 1" in diameter). In most flights, there were 10-20 areas damaged, and in 4 flights, there were over 100. Before Columbia, there were 5 known cases of the foam coming off in the same place as on Columbia (there are likely more, since less than half the flights had enough imagery to be sure).

    Even if NASA didn't see this as a safety issue, repairing the damage after every mission (and there is lots more damage at smaller sizes) takes a considerable amount of time and expense. For an organization that wanted to reduce operating costs, not solving that problem was a big issue.

    Overall, it's still a case of NASA trying to do too much with too little, and not being frank about what they could really get done. I don't think NASA as an organization is going to achieve low-cost access to orbit. My money is on guys like Burt Rutan...

     

     

  • Eric Gunnerson's Compendium

    PM Duties: A new awakening

    • 3 Comments

    Last Friday, I worked a bit on the project templates, and then went home. Monday morning, I went on a bike ride, and then went to work. Somewhere in between, I stopped being the PM responsible for the project system and community and started being the C# compiler PM.

    It all started this spring, when one of our PMs decided to leave Microsoft and go back to school. The proximity of his school to his girlfriend made this a bit less of a surprise, but left us somewhat understaffed. So Anson Horton, our current compiler PM, started working on both the compiler and the IDE, which made him a fair bit too busy.

    We interviewed a few candidates for the compiler PM position, but as it requires a unique combination of talents - somebody who is passionate about the C# language and has some idea of what you need to do to build a good language - we didn't find anybody. I spent a little time thinking about it, and then decided I was interested in the position.

    So, as of this Monday, I'm back on the C# design team, this time as a PM instead of the test lead I was when we were first developing the C# language. On Monday afternoon, I found myself at the C# design meeting - same day, same time, and even the same conference room where I spent so much time before. I'm really excited to be back on the design team.

    This will likely change the content of my blog, though I'm not exactly sure how it will change. I'd like to talk more about what really happens in a language design team, though there are some things I won't want to talk about, mostly because a lot of our discussions are about features that aren't finalized, and I don't want people to expect things that we may not do.

  • Eric Gunnerson's Compendium

    Update Windows Forms from another thread

    • 9 Comments

    I wrote this in response to a customer question today, and thought it might be interesting:

    ******

    To update something on a form from another thread, you need to use Invoke() to get to the right thread. If I have a method that's called through an EventHandler delegate, it will look something like:

    public void ButtonClick(object sender, EventArgs e)
    {
    	// update the form here...
    }
        

    If this is called from a different thread, and it tries to update the form, bad things may happen. So, you need to have a way to get to the right thread, and that's what Invoke() does - it arranges for the code to be called on the right thread. The simplest thing to do is to just use Invoke() to call the same function, with a test to see if we're on the right thread. Windows Forms provides the InvokeRequired property to do the testing...

    public void ButtonClick(object sender, EventArgs e)
    {
    	if (InvokeRequired)
    	{
    		Invoke(new EventHandler(ButtonClick), new object[] {sender, e});
    	}
    	else
    	{
    		// update the form here...
    	}
    }
        

    That works well, though it is a bit ugly. When you move to the Compact Framework, things get a bit weird. You can still use Invoke() on the Compact Framework, but it has two strange restrictions:

    1) You can't pass any parameters to the method, because Invoke() only has one parameter, the delegate.

    2) You need to call through an EventHandler delegate.

    You can get around the first one by saving any parameters in a field before the Invoke(), but it's a bit ugly. The second part is weird because EventHandler has two parameters, but since you can't specify any parameter values, there's no way to use them.
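
    Here's a minimal sketch of that field-based workaround - the form, label, and ReportStatus method are names I've made up for illustration:

    using System;
    using System.Windows.Forms;

    public class StatusForm : Form
    {
    	private Label statusLabel = new Label();
    	private string pendingStatus;	// the "parameter" the worker thread wants to pass

    	public StatusForm()
    	{
    		Controls.Add(statusLabel);
    	}

    	// Called from the worker thread.
    	public void ReportStatus(string status)
    	{
    		pendingStatus = status;	// stash the value in a field first
    		Invoke(new EventHandler(UpdateStatus));	// delegate only, no arguments
    	}

    	// Runs on the UI thread. The sender/e parameters go unused, but they're
    	// required because we're calling through an EventHandler delegate.
    	private void UpdateStatus(object sender, EventArgs e)
    	{
    		statusLabel.Text = pendingStatus;
    	}
    }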

  • Eric Gunnerson's Compendium

    Comments on new project templates

    • 9 Comments

    I got a ton of comments on my post on the new project templates, and rather than trying to answer them in the comments, I thought I'd answer them here, so they're at least somewhat coherent to read.

    Oh, and thanks for all the comments.

    One more note: Some of the things that I did in the console app do not apply to all apps. For example, if you're writing a class library, you're going to get /// comments in front of your class and constructor, because I'm going to assume that you want somebody to actually use the library at some point.

    Dude, the tabs are best set at 4

    The templates are all defined with hard tabs, but when they're converted to real code, they'll get massaged into what you want (if you have the option set in Tools->Options).

    You should leave the command-line arguments in there

    I think we can agree that experienced programmers will know what to do here, so it's mostly an issue for inexperienced programmers. An important argument for not having them is that if they are there, then when you are teaching programming you have to explain what they are. Since you haven't yet talked about what a string variable or an array is, you'd prefer to put off that sort of discussion until later. That's pretty much verbatim a request I got from a professor who teaches C#.

    I'm a big fan of the building-block approach - which you already know if you've read my book or columns. Having to talk about things before you want to is a problem.

    This is not an easy decision to make, but in this case, I think simplicity is the more important concern. I will, however, take your comments under advisement.

    The namespace should stay

    Namespaces are mostly about organizing your code into useful components. But in a console app, that really doesn't apply, since it is very unlikely to be used by other applications. The namespace just adds complexity that you don't need.

    One other comment was along the lines of "I didn't know you could do that". Namespaces are just a convenient way to give classes and other types long, hierarchical names.
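
    A tiny example (with made-up names) of what I mean - the namespace is really just a prefix on the class name:

    using System;

    namespace MyCompany.Utilities
    {
    	class Logger
    	{
    		public void Write(string message) { Console.WriteLine(message); }
    	}
    }

    class Demo
    {
    	static void Main()
    	{
    		// "Logger" is just shorthand for its full, hierarchical name.
    		MyCompany.Utilities.Logger log = new MyCompany.Utilities.Logger();
    		log.Write("Namespaces are just long type names.");
    	}
    }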

    The comments / TODOs should stay

    I think there are some comments that can be useful - for example, the SQL templates will have comments that explain what you need to do to implement an aggregate function in C#. Useful guidance there is great.

    But comments are not always useful, and the ones we had in the console app fall into the "not useful" bucket. They don't help novices, because novices don't know what "entry points" are. I guess you can argue that the TODO is helpful because it guides the novice on where to start, but the difficulty of finding the right place to put your code is minimal compared to the difficulty of figuring out how to actually write code that works.

    If you consistently have trouble finding Main(), I respectfully submit that C# programming may not be for you.

    Credit to the team for removing the cruft but, hey, they put it in there too.

    Absolutely. Our bad.

    Editing and Customizing Templates

    The way in which project templates are stored is more than a little ugly, and it's not terribly obvious which file you should edit. This is deliberate. In previous versions, there were concerns about improper modifications to the project templates causing problems, so we made it cryptic to make that a less likely occurrence.

    Any takers on that? Anyone?

    No, the truth is that we didn't spend a lot of time thinking about how users might edit or extend the templates, and therefore, not surprisingly, it's not very approachable.

    In the long term, we're planning on fixing that. "Long term" is code for "don't be surprised if it doesn't show up in the next version".

    In the short term, I'm hoping to write - or get somebody else to write - something that explains how the templates work, how you can modify existing templates, and how to create new ones.

    How much time do templates save?

    In the case of the console template, it's true that you can type that much content fairly quickly. But the template also gets you a live project that has the source file in it. Compared to creating an empty project, adding a new class, adding a Main() to that class (or adding an empty file and typing the text), and then setting the project properties to get what you want, there is some benefit there.

  • Eric Gunnerson's Compendium

    PM Duties Episode IV: The template strikes back...

    • 19 Comments

    One of the things I own for the C# team is the project system. For this, I'm what's known as a relationship PM, which means that I'm responsible for the area even though there's no C# team that does project system work. My job is to figure out what the C# user needs and drive those needs with the VS Core project team.

    There are lots of good things coming, most of which I can't talk about until PDC. One thing I can talk about is the C# project templates, which control what you get when you create a new project or add a new item to an existing project. So, I've been looking at what code you get, and deciding how to change it.

    Here's the VS 2003 version of a console application:

    using System;

    namespace ConsoleApplication1
    {
    	/// <summary>
    	/// Summary description for Class1.
    	/// </summary>
    	class Class1
    	{
    		/// <summary>
    		/// The main entry point for the application.
    		/// </summary>
    		[STAThread]
    		static void Main(string[] args)
    		{
    			//
    			// TODO: Add code to start application here
    			//
    		}
    	}
    }

     There's a lot of cruft in that template. Here are my notes:

    1. The namespace provides little utility at all, as you never refer to this class from outside.
    2. The XML comment for Class1 is useless, for the same reason.
    3. Why "Class1"? How descriptive is that?
    4. The XML comment on Main isn't useful, because Main is private and can't be called.
    5. The XML comment on Main provides no useful information. If you don't know that Main() is the entry point for an app, then you aren't going to be helped by the comment.
    6. The command line args aren't used by most apps.
    7. The TODO comment is really, really useless. It's mind-blowingly useless, like the directions that are on toothpick boxes, or the label on my blow dryer that warns me not to use it when sleeping. I'm trying to picture the scenario. There I am, working on my console application, and it doesn't work. I'm perplexed. What should I do? Do I need to run the debugger? Should I call a co-worker over? Maybe I'll check the task-list first. Oh, here's a TODO comment, and it says that I need to write some CODE to make my application work. Thanks, Visual Studio.

    Here's the Whidbey version:

    using System;
    
    class Program
    {
    	static void Main()
    	{
    
    	}
    }

    It's just a tiny bit cleaner.

    I'm making similar changes to the other templates, getting rid of useless comments and generally simplifying things. Windows Forms projects will now use partial classes, with the user code in one file and the designer code in another file.
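
    Roughly, the split will look something like this - a sketch with hypothetical file contents, not the final template text:

    // Form1.cs - the file the user edits
    public partial class Form1 : System.Windows.Forms.Form
    {
    	public Form1()
    	{
    		InitializeComponent();
    	}
    }

    // Form1.Designer.cs - the file the designer owns
    public partial class Form1
    {
    	private System.Windows.Forms.Button button1;

    	private void InitializeComponent()
    	{
    		this.button1 = new System.Windows.Forms.Button();
    		this.button1.Text = "Click me";
    		this.Controls.Add(this.button1);
    	}
    }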

    Note that these changes will not be present in the PDC bits, but will show up in the beta.

     

  • Eric Gunnerson's Compendium

    Guilty Pleasures...

    • 8 Comments

    One of the benefits of getting older is that you're less worried about looking foolish (though I'll probably hear about this at work tomorrow...)

    Somewhere around a quarter of a century ago, my mother handed me a book that she said I would enjoy, and she was right. Though I have read all of Jane Austen's books, Pride and Prejudice remains my favorite.

    You may ask yourself, "Self, why is Eric so enamored with a late 18th century historical drama?" It's somewhat hard to explain, but Austen's commentary on English society is insightful, and her oblique way of saying things is very engaging. For example:

    To Elizabeth, it appeared that had her family made an agreement to expose themselves as much as they could during the evening, it would have been impossible for them to play their parts with more spirit or finer success...

    Once a year, I try to write humor, and I would consider myself successful if I could craft such a finely-tuned sentence.

    Last night I caught the 1979 BBC adaptation, which is what led to this entry.

  • Eric Gunnerson's Compendium

    Building a music system

    • 6 Comments

    We have a ski place in Skykomish, WA (20 miles from Stevens Pass) that my wife and I have been finishing for the past couple of years (a contractor did the shell to painted drywall; we do everything else). Having music while doing this is a critical factor, so I brought an old stereo (my first receiver, in fact, from 1980) and a 5-disc CD changer up there. That has worked okay, but when you're used to having all your CDs (somewhere around 200) accessible and automatically scheduled with your PC-based system, you kind of get tired of the same 12 CDs. (Yes, I know, I *could* bring up more, but that would require me to plan ahead, and I still wouldn't bring up the right one).

    I did some research into CD jukeboxes and wasn't terribly impressed. They're a bit smaller than a PC, but all they do is music, and they don't implement the system that my home-based one does. So I held off on buying one.

    I've recently been working on a C# version of my system, and it's starting to wake up and look around, and it's almost good enough to use. So I decided to build a small PC to hold my music and run the software.

    I started at NewEgg, who supplied the components for my current office system. I wanted a minimal system with a reasonable-size disk and a network connection. A little research got me a motherboard, case, CD-ROM, memory, processor, fan, and hard disk for $328 to my doorstep. One trip to the computer store got me another IDE cable and a fan that would actually *fit* in the case (it's a micro ATX size). That's a pretty good price for an Athlon 1500, 256M of memory, and 40 gig of disk space.

    So, once I get XP on the system, I'll copy the files over and probably install the current version of my software. One problem with the system is coming up with something that isn't a big ugly PC. The micro ATX case helps a bunch, but what do you do with the monitor and keyboard? My real home system uses IR remote control, which is a possibility, but the new system also supports using a PocketPC as a controller. At work I use a Toshiba e740 with built-in wireless as the test device for remote control, and it works great, but I'm not going to use a $500 PPC to control a $330 music system. I needed something cheaper. I decided to forgo the wireless and bought a reconditioned Toshiba e310 on eBay for $125, and I'll use that in the cradle as a remote control (I hope - I think I can get this to work over USB, but I haven't actually tried it yet).

  • Eric Gunnerson's Compendium

    Code behaving badly

    • 4 Comments

    Wow, two programming/C# entries in a single day...

    Before I joined the C# team, I was the test lead for the C++ compiler for a number of years. We would periodically get customer comments that "the compiler was broken", and upon further investigation, we would usually find that it was a bug in the program. There was usually a good correlation with the programmer's experience - those with more experience normally suspected their own code first, and only after careful research would they consider the compiler (and they were usually right at that point).

    One of the nice things about the C# compiler not having pointers is that it's much harder to accomplish bad things ("Try to imagine all life as you know it stopping instantaneously, and every molecule in your body exploding at the speed of light". Shame on you if you don't recognize the quote). If you're playing the interop game, you're back in the pointer-world of sharp sticks, and you can easily create the otherwise elusive "Execution Engine Error".

    Last week, I upgraded to build 30730 of VS and the runtime. (This means "third year, seventh month, and 30th day", and is known in Microsoft parlance as the "Julian Date", even though it isn't a julian date. This replaced our previous scheme (also not a julian date) that we used on VS 2002 and 2003, which replaced the scheme we used in VS6 (also not a julian date). As far back as I can remember, our numbers have always been called julian dates but never were. An ideal dating system is monotonically increasing by 1 (so you can tell how far apart builds are) and easy to convert to human-readable dates (so you know when the build was created), but that's not really possible, so at least we've finally settled on something where you know when the build was made, and it works for more than a couple of years (previous versions broke badly when confronted with the long dev cycle of VS 2002). It's a testament to the understandability of the previous schemes that I don't remember what they were, but I do know that many people ran little JDate applications on their desktops so they knew what jdate to use for today. But I digress.)
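
    If you want to decode one of these yourself, it's just digit slicing - a toy example, with the assumption (mine, not an official one) that the year digit is relative to 2000:

    int build = 30730;
    int year = 2000 + build / 10000;	// 3 -> 2003, assuming a base year of 2000
    int month = (build / 100) % 100;	// 07
    int day = build % 100;	// 30
    Console.WriteLine("Build {0} was built on {1:d MMMM yyyy}", build, new DateTime(year, month, day));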

    I got the new build installed, and nothing broke (a nice occurrence), rebuilt, and ran my app. It worked fine in most areas, but when I tried to use one function, I got a null reference exception. Of course, I initially thought my code was bad, but a little debugging narrowed the problem down to an innocuous-looking function:

    private void CheckType<T>(DBObject node, List<int> list)
    {
    	if (node is T)
    	{
    		if (node.Checked)
    		{
    			list.Add(node.ID);
    		}
    	}
    }

    In my app, I have a treeview with different node types in it, and I need to get the list of all checked nodes of a given type into a list so I can persist it. This function is called for each node and each type of node, and it fills in the items.

    All the parameters were correct going in, but once inside the function, list was nowhere to be found, and calling list.Add() caused problems. Since this code worked before and the debugger couldn't find list, I started to suspect a code generation problem. Further investigation showed that even if list.Add() was never called, the program would blow up at some future point.

    I just finished a session with one of the CLR guys to try to find the root cause and get a small repro case (small repro cases are the holy grail of tracking code generation issues). He knew that there had been some changes in JITting generic methods when one of the parameters was a MarshalByRef type, and we were able to create a small project that throws an ExecutionError at will. That will allow us to find the problem and get it fixed.

    The moral of the story - and I'm sure if you've read this far you're expecting a moral - is that while it's usually your code that has the problem, sometimes it's the underlying system that has issues, so don't be too trusting...

     

  • Eric Gunnerson's Compendium

    Infrared Remote control for your programs

    • 7 Comments

    I have a home music system that uses Evation's irman to provide remote control of song selection. That lets me listen upstairs and control the music with a simple remote control.

    I'm building a replacement to that system using C# (more on that in a few months), and I can't use the irman because then my main system wouldn't work. It was either order another irman, or get something different.

    A couple of weeks ago, I ordered a Tira from Home Electronics. Tira is like the irman, except that it connects via USB rather than a COM port, and it supports both receiving and transmitting IR codes. That means you could, if you wanted to, build your own macro program that would take a single IR command and both control your computer and other IR-controllable devices. Home electronics also makes the Ira-2, which does mostly what the irman does.

    The Tira showed up yesterday. You get the transceiver module hooked to a 6' (ish) usb cord, and a cute little baby CD with some software on it. Installation is easy - plug in the cable, and point the new hardware wizard at the drivers. It installs both a usb device and a virtual COM port, which makes it easier to control the device.

    It also comes with a copy of Girder, which is a program that lets you remotely control things on your computer. I installed it and got the Tira set up with it, but I was getting confusing results, and I wasn't planning on using Girder anyway, so I dove into the custom API. It's a fairly typical C-style API with an accompanying DLL, so I dusted off my P/Invoke skills, turned on some music, and got to work.

    The interface is really straightforward. You need to init the library, tell it which COM port to listen on, and then register a callback that will be called when there is data available. I got those definitions in, fired it up, hit a button on my TiVo remote, and hit a breakpoint in the callback. That took about 15 minutes total.

    Encouraged by my success, I next worked on decoding the event data. The callback passes as a parameter a pointer to a 13-byte string that identifies what button you pressed (think of it as a digital fingerprint that identifies the key uniquely). I defined that as an IntPtr since, IIRC, the runtime doesn't like marshalling strings in callbacks. A bit of unsafe code let me copy this to a byte array, and we were off and running. Start up the code, hit the remote button, and a nice 26-character string (the hex values of the data) shows up in the console window. Keep doing it, and it works fine... and then stops working. Hmm.

    Add in some code to number the items. Try again. Each time, it writes out 35 items, and then stops. Stop. Think a bit. There's something familiar here, something about delegates and P/Invoke. Ah... When you pass a delegate to an unmanaged function, the runtime has no way of knowing what the unmanaged function does with it, so it assumes that it doesn't store it (the other assumption would mean the delegate never got freed). In this case, that assumption is wrong, and after a short bit of time (35 iterations in my case), the GC merrily collects the delegate, the Tira thread dies when it tries to call it, and things stop working.

    The fix is easy - just store the delegate in a place where the GC can find it (a member variable works well), and things are great. I haven't gotten around to writing a nice C# wrapper around it, but I'll post when I get that done.
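
    To show the shape of the fix, here's a rough sketch. The DLL name, entry points, and signatures below are placeholders - they are not the actual Tira API - and I've used Marshal.Copy instead of the unsafe copy I described above:

    using System;
    using System.Runtime.InteropServices;

    class TiraReceiver
    {
    	// The callback the native code will invoke when IR data arrives.
    	delegate int IRCallback(IntPtr data);

    	// Placeholder declarations - the real entry points have different names.
    	[DllImport("tira.dll")] static extern int tira_init();
    	[DllImport("tira.dll")] static extern int tira_start(int comPort);
    	[DllImport("tira.dll")] static extern int tira_set_handler(IRCallback callback);

    	// Keeping the delegate in a member variable is the important part;
    	// otherwise the GC may collect it while the native side still holds it.
    	IRCallback handler;

    	public void Start(int comPort)
    	{
    		handler = new IRCallback(OnIRData);
    		tira_init();
    		tira_set_handler(handler);
    		tira_start(comPort);
    	}

    	int OnIRData(IntPtr data)
    	{
    		byte[] bytes = new byte[13];
    		Marshal.Copy(data, bytes, 0, 13);	// copy the 13-byte fingerprint
    		Console.WriteLine(BitConverter.ToString(bytes));
    		return 0;
    	}
    }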

    If you want to retransmit, you have to capture through a separate, more complicated interface, since the 13-byte value doesn't give the system enough info to reconstruct the correct IR signal.

    Eric

  • Eric Gunnerson's Compendium

    "It's a simple matter of weight ratios..."

    • 11 Comments

    This morning I left the cage at home, and decided to come to work using a two-wheeled vehicle. Since I'm only about 2 miles from campus, I took the long way round and rode along beautiful Lake Sammamish Blvd, into Redmond, and then up highway 520 towards Microsoft.

    I haven't been riding much the past year or so, and every time I do, I'm reminded why I should do it more. This summer, I sold my ancient Acura Integra and bought a used BMW 328i. The Integra had reasonable power for a small car, but the 328 is a lot faster and still fun to drive. But neither accelerates like my motorcycle.

    One way to measure performance - a bad way, it turns out - is to look at horsepower to weight ratios. If you do that for the vehicles my wife and I own, you'll find the following:

    • Ford Ranger - 27 lb/hp
    • Subaru Outback - 23 lb/hp
    • BMW 328i - 18 lb/hp
    • Honda CBR400RR - 9.6 lb/hp (the wife's bike)
    • Honda VFR750 - 7 lb/hp

    What does this tell us about the relative acceleration of the vehicles? Well, the answer is "not a lot, really", for a number of reasons:

    The first is that torque is the important measure, not horsepower. Torque is what gets you accelerating. Horsepower is only a measure of the work the engine can do. It's a good measure of top speed (assuming equal aerodynamics), but has little bearing otherwise. So why is it the common measure? Beats me - it may be because it has "power" in the title.

    The second factor is one of rotational inertia. Big engines have a lot of rotating mass, and when you accelerate the car, you also need to accelerate that mass. Small engines (more correctly, engines with smaller pistons, connecting rods, cranks, etc.) have less rotating mass, and therefore more of the torque can go into accelerating the vehicle. This is one reason engines with more cylinders perform better (but not the only one).

    Power curves make a huge difference in acceleration. Race engines are tuned to have lots of torque in a very narrow rev band, and they make lots of power there, but not much elsewhere. Their peak power may be higher, but the peakier an engine, the more work it is to extract the power in a useful manner (which, strangely, equates to both "more fun" when you want to work at it and simply "more work" the rest of the time).

    Finally, the rev range of the engine has a huge impact on the amount of acceleration it can produce. If you take two engines, one which makes 30 lb ft of torque at 3000 RPM and one that makes 25 lb ft at 6000 RPM, which one is likely to produce more acceleration? The whole answer lies in gearing - you can take the second engine, gear it down by 50%, and have an engine that produces (ignoring losses) 50 lb ft at 3000 RPM. So, small engines are better because you can spin them faster, which (within reason) gives you a free lunch, so to speak, in the power realm.

    So how do you really compare vehicles? Well, one obvious way is "at the dragstrip", which isn't really a bad measurement, delta the differences between dragstrip use and real use, and the launch characteristics. (If you're curious, the BMW does 15.3 sec @ 83 MPH, and the VFR does 11.4 sec @ 112 MPH.) But a nicer way is to create what's called a thrust graph. For each gear, you take the torque curve and scale it by the gear ratios and the tire size to arrive at pounds of thrust at the rear wheel vs. road speed. You then overlay the graphs for each gear and come up with an overall "thrust graph". If you compare the graphs for two vehicles, you'll get a decent idea of how they stack up.
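
    If you want to play with the idea, the math is simple enough to sketch in a few lines of C#. The numbers here are invented for illustration - they aren't measurements from any of the vehicles above:

    double[] rpm    = { 3000, 4000, 5000, 6000, 7000 };
    double[] torque = {   40,   45,   48,   46,   40 };	// lb-ft at the crank (made up)

    double gearRatio  = 2.0;	// transmission ratio for this gear
    double finalDrive = 3.0;	// final drive ratio
    double tireRadius = 1.0;	// feet
    double efficiency = 0.9;	// drivetrain losses

    for (int i = 0; i < rpm.Length; i++)
    {
    	// Torque times overall gearing, divided by tire radius, gives pounds of thrust.
    	double thrust = torque[i] * gearRatio * finalDrive * efficiency / tireRadius;

    	// Wheel RPM converted to road speed in MPH.
    	double wheelRpm = rpm[i] / (gearRatio * finalDrive);
    	double mph = wheelRpm * 2 * Math.PI * tireRadius * 60 / 5280;

    	Console.WriteLine("{0,5:F0} RPM  {1,5:F1} MPH  {2,6:F0} lb", rpm[i], mph, thrust);
    }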

    There are only a couple of motorcycle mags that do this, and I've never seen it in a car mag.

    Somehow I got from a quick entry about my ride to a dissertation on comparing engines. Not quite sure how that happened...

     

  • Eric Gunnerson's Compendium

    Call for Votes: comp.std.cli & comp.std.csharp

    • 3 Comments

    We are trying to set up newsgroups to discuss the CLI and C# standards.

    These groups would live in the comp hierarchy on USENET, and when you want to do that, you need to follow the accepted practices. We are now in the "call for votes" phase.

    It's considered a violation of ethics to encourage people to vote one way or the other, so please keep that in mind if you refer to or forward this message.

    *****

    Voting is now in progress on the proposed newsgroups comp.std.cli & comp.std.csharp. Interested parties should look for the official Call For Votes, which can be found in news.announce.newgroups.

    You can also access the posts through Google groups at:

    CLI

    C#

     

  • Eric Gunnerson's Compendium

    Bike Fit

    • 8 Comments

    I've been riding my bicycle a fair bit in the past few months, and I've been having some comfort problems. The first hour is fine; the second hour my elbows hurt and my feet and hands fall asleep (and my butt hurts). The third and fourth hours are more of the same.

    I'm planning on doing a metric century (100 km) in September, which will put me on the bike for 6-7 hours, so I needed to address the comfort issue. I made an appointment with Erik Moen, a physical therapist who works at Seattle's Pro Sports Club (many Microsoft people belong to the Bellevue Pro Sports Club). He came highly recommended as "the guy" by a friend of mine who rides seriously.

    So, Tuesday morning I drove into Seattle, and wheeled my bike in for the fit. The nice thing about going to a physical therapist for a bike fit is that he can consider modifications to either the bike or to the body. My expectation was that I would be shopping for a new bike when I was done, or at least some new components. Another big advantage is that it's considered a physical therapy visit, so I didn't have to pay for the session. Ka-Ching!

    The session starts with the usual medical history questions and then a questionnaire about my bike-riding tendencies. Except for marking "spinner" instead of "cruiser", I'm pretty much on the lightweight side of all the questions. My session with Erik then began.

    Erik is a really nice guy, and he started by doing an evaluation of my body mechanics and flexibility. That took about 15 minutes. We then went to one of the studios and put my bike on a trainer for measurements. This starts with static measurements of the bike (seat height, difference between bar and saddle height, reach to the brake hoods, stem length, seat setback, and crank length). While he did this, I watched and generally got in the way.

    Next are the rider-on-bike measurements, which include the trunk angle (37 degrees), the distance between elbows and knees, knee angles, and a few others I've forgotten. My seat was tipped too far forward (moved it back one notch), and my bars were too low (raised them 1 cm and tilted them back). Saddle height was good, as was my cleat placement.

    We then worked on position, to see if I could get the handlebar inline with the stem. He adjusted me on the bike to the position he thought I should be in, and found that overall, things were set up pretty well for me.

    The problem was that I wasn't actually in that position, due to some inflexibility in my hamstrings and back (there's a note about a "probable ham challenge" on my fit sheet, but I don't think that's about lunch).

    His prescription:

    1) New shoes to replace my very old Shimano ones

    2) Stick with SPD cleats, as they're more practical for my use

    3) Insoles (superfeet or biosoft inserts) to make my feet happier

    4) A number of trunk and hamstring stretching exercises to stretch my legs and cure me of the "software slouch".

    5) A recheck in October

    Overall, a very worthwhile hour. Interesting that the bike setup is fine, it's the rider setup that needs some work. I knew I had crappy hamstring flexibility (too much soccer, not enough stretching), but the back part is a new one. I'm going to ask my group to tell me to "sit up straight" if they see me slouching.

    I was going to ride this morning but felt too sick, but I'm hoping to get in a few miles tonight. I'll post again with my impressions of the adjustments.

  • Eric Gunnerson's Compendium

    Hangin' with generics

    • 13 Comments

    I've been using generics in a project that I'm building on Whidbey. They are tremendously convenient when you want to use collection classes. That wasn't really very surprising.

    What was surprising - mostly because I hadn't spent any time reading the generics spec - was how useful generic methods would be. For my app, I sometimes needed to fetch a list of values from a database. I didn't want to write a separate function for each type, and with generic methods, I could write the following:

        
    public List<T> GetValues<T>(string selectStatement)
    {
    	OleDbCommand select = new OleDbCommand(selectStatement, connection);
    	List<T> list = new List<T>();
    	OleDbDataReader reader = select.ExecuteReader();
    	while (reader.Read())
    	{
    		list.Add((T)reader[0]);
    	}
    	return list;
    }

    I think that's pretty cool. I wrote several functions like that to perform database operations in a generic (ha ha!) way. So I think generics are going to be cooler than I expected.

    I should mention one more thing. If the type you use in the generic method is one of the parameters, the compiler can figure out what generic type you're using, and you don't have to specify it. If I wrote:

            public T Process<T>(T param) {...}
        

    I could call it with:

            int i = inst.Process(12);
        

    and the compiler figures out that I want Process<int>.

    Oh, and before you ask me for the spec, you can't have it... yet... It's still churning due to the ECMA process, and we don't want to release it until it's a bit more concrete.

  • Eric Gunnerson's Compendium

    WMP, album art, and Amazon Web Services

    • 4 Comments

    So, I'm working on a music playing application (to be revealed sometime around PDC, or perhaps a bit later if I don't get it working) to play around with Whidbey features. One of the things I want to do is be able to display the album art.

    So, I dust off the docs for WMP - which are a bit spotty in places - and go searching for the album art. For a media item in WMP, you can fetch various attributes, and one of them is WM/CoverArt (or something like that). Seems like just the ticket, but it's not set for any of the items in my collection.

    I go searching for another way to get the art. There's a MoreInfo attribute that you can fetch for all the items, but it points to a non-existent page. I play around a bit more and find that I can fix up the URL to get to the "More Info" page, which has a link to the album art. A quick Regex, and I pull the cover art URL out. I code it up and run it on a small subset of my album selection (about 20 albums). There is no album art for 5 of the 20 albums.

    That's pretty pathetic. It's not like I had a lot of obscure albums there - they were all albums that had been certified at least platinum. Now, it may not have had all the albums, but it was really slow, so I had that going for me.

    A bit of time on Google led me toward Amazon web services. The SDK is free, and using it is free, as long as you don't do more than 1 request per second per application (multiple instances can do more). The Details object has a link to the album art, so I just needed to get that object for the album. So, I started up VS, added the WSDL as a web reference, and went to town.

    The first attempt was to use keyword searching. That worked well if I was looking for something like "Green Day Dookie", but something like "rush rush" gives you 282 matches, some in artist, some in album. The basic problem is that you're just looking at keywords, and you can't specify album and artist directly. Nor can you find that in the Details object - it just has a single property named ProductName.

    The solution was to do an Artist search. This gives me all albums with a specific value in the artist field (once again, a keyword search, not an exact search). You then need to look through all the matches you got back and match against the album you want. This works, and though it's a bit ungainly, at least you don't need to do it that often.

    Here's some code:

    ArtistRequest request = new ArtistRequest();
    request.artist = albumCoverArt.Artist;
    request.devtag = ericsDevtag;
    request.mode = "music";
    request.type = "lite";
    request.page = "1";

    AmazonSearchService search = new AmazonSearchService();

    ProductInfo productInfo = search.ArtistSearchRequest(request);

    That gets the first chunk of data, and you have to make other calls 1 second apart to get the rest. The devtag is given to you when you register with Amazon, and it identifies who's using the service.
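
    Once you have the results back, the matching loop is straightforward. This is a sketch from memory - Details, ProductName, and ImageUrlMedium are the proxy members as I recall them, so treat the names as approximate:

    string FindCoverArtUrl(ProductInfo productInfo, string albumName)
    {
    	foreach (Details details in productInfo.Details)
    	{
    		// The artist search is a keyword search, so compare album titles ourselves.
    		if (string.Compare(details.ProductName, albumName, true) == 0)
    		{
    			return details.ImageUrlMedium;
    		}
    	}
    	return null;	// not on this page; wait a second, fetch the next page, and try again
    }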

  • Eric Gunnerson's Compendium

    Question of the day

    • 7 Comments
    If you put wheat bread into a toaster and take out wheat toast, what happens when you put french bread in a toaster?