January, 2005

Posts
  • Eric Gunnerson's Compendium

    C# vs C++

    • 48 Comments

    [Update: Changed a misplaced "C#" to a "C++" in the tools section. Thanks, Nick]

    At the class I took a while back, the instructor asked me to talk a little bit about the benefits of C++ vs. the benefits of C#, since I had worked in C++, then in C#, and now in C++ again.

    I talked for about 5 minutes on why C# was better, organized around the following themes. Note that I'm speaking about the whole programming environment, not just the language.

    Automatic memory management

    There are several facets to this. When I read or write C++ code, I have to keep part of my attention on the algorithmic aspects of the code and another part on the memory aspects. In C#, I usually don't have to worry about the memory aspects at all. Further, in C++ I need to figure out who owns objects and who cleans them up, and also how to clean up correctly along the error paths.
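
    For example (a minimal sketch - the Employee type here is invented for illustration), in C# I can write allocation-heavy code without deciding who owns what or who frees it, even on an early-return path:

    using System.Collections;

    class Employee {
       public int YearsOfService;
    }

    class MemoryDemo {
       // No delete, no ownership conventions, and the early return can't
       // leak anything - whatever we stop referencing becomes garbage.
       static Employee FindSenior(ArrayList employees) {
          foreach (Employee e in employees) {
             if (e.YearsOfService > 20)
                return e;
          }
          return null;
       }
    }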

    Exceptions

    Without a lot of dedication, using return codes is a bug farm waiting to happen. In the C++ world, some functions return an HRESULT, some return a bool, some return another set of status codes, and some return a number and use an out-of-range value as an error indicator. Oh, and some are void. You not only have to write the correct code at each call site, you have to successfully convert back and forth between the various kinds of error handling.

    You also lose the opportunity to write more functional code. I end up writing something like:

    CString name;
    RTN_ERROR_IF_FAILED(employee.FetchName(name));

    instead of writing:

    string name = employee.FetchName();

    And, of course, exceptions are fail-safe in that you get error reporting without doing anything, rather than having to do everything right to get error reporting.
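
    To make that concrete (a rough sketch - the exception type here is made up), the caller gets error reporting by default rather than by remembering to check a return code:

    using System;

    class MissingNameException : ApplicationException {
       public MissingNameException(string message) : base(message) {}
    }

    class Employee {
       string name;

       public string FetchName() {
          if (name == null)
             throw new MissingNameException("no name record");
          return name;   // the functional style: just return the value
       }
    }

    class Report {
       static void Print(Employee employee) {
          try {
             Console.WriteLine(employee.FetchName());
          }
          catch (MissingNameException e) {
             Console.WriteLine("Couldn't fetch name: " + e.Message);
          }
       }
    }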

    Coherent libraries

    The C++ code I'm writing uses at least 6 different libraries, all of which were written by different groups and have different philosophies in how you use them, how they're organized, how you handle errors, etc. Some are C libraries, some are C++ libraries that are pretty simple, some are C++ libraries that make heavy use of templates and/or other C++ features. They have various levels of docs, all the way from MSDN docs through "read the source" docs to "a single somewhat-outdated word doc".

    I've said in the past that the biggest improvement in productivity with .NET comes from the coherent library structure. Sure, it doesn't cover everything in Win32, but for the stuff it does cover, it's often much much easier to use than the C++ alternative.
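
    As one small example of what I mean (a sketch, nothing more): reading a file goes through the same System.IO types, the same naming style, and the same exception-based error model as the rest of the framework.

    using System;
    using System.IO;

    class ReadFile {
       static void Main(string[] args) {
          // One library, one set of conventions, one error model.
          using (StreamReader reader = new StreamReader(args[0])) {
             Console.WriteLine(reader.ReadToEnd());
          }
       }
    }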

    Compilation Model

    C++ inherited the C compilation model, which was designed in a day when machine constraints were a whole lot tighter; in those days, it made sense.

    For today, however, separate compilation of files, separate header and source files, and linking slow things down quite a bit. The project that I'm working on now takes somewhere on the order of a minute to do a full build and link (that's to build both the output and my unit tests, which requires a redundant link). An analogous amount of C# code would take less than 5 seconds. Now, I can do an incremental make, but the dependency tracking on the build system I use isn't perfect, and this will sometimes get you into a bad state.

    Tools

    Reflection is a great feature, and enables doing a ton of things that are pretty hard to do in the C++ world. I've been using Source Insight recently, and while it's a pretty good IDE, it isn't able to fully parse templates, which means you don't get full autocompletion in that case.
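
    Here's a quick sketch of the kind of thing reflection makes trivial - dumping the public properties of an arbitrary object, which a C++ tool can only approximate by parsing source:

    using System;
    using System.Reflection;

    class PropertyDumper {
       // Works on any object, with no cooperation from the type's author.
       static void Dump(object o) {
          foreach (PropertyInfo p in o.GetType().GetProperties()) {
             Console.WriteLine("{0} = {1}", p.Name, p.GetValue(o, null));
          }
       }

       static void Main() {
          Dump(new Version(1, 1, 4322));
       }
    }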

    Code == Component

    In C#, you get a component automatically. In C++, you may have extra work to do - either with .IDL files or with setting up exports.
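
    Put another way, the little class below is already a component: compile it with csc /target:library and any other .NET code can reference the resulting assembly - no .IDL, no exports table. (Logger is just a made-up example.)

    using System;

    namespace Acme.Diagnostics {
       // The assembly's metadata is the component description; nothing
       // extra is needed to make this consumable from other code.
       public class Logger {
          public void Write(string message) {
             Console.WriteLine("{0}: {1}", DateTime.Now, message);
          }
       }
    }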

    Language Complexity

    C++ templates are really powerful. They can also be really, really obtuse, and I've seen a lot of those obtuse usages over the years. Additionally, things like full operator overloading are great if you want a smart pointer, but are often abused.

    And don't get me started on #define.

    The alternate view

    So, if what I say above is true, the conclusion looks pretty clear - use C#. Not really much surprise considering who came up with the list.

    But there are things that C++ has going for it, and which language is the better choice really depends on the kind of code you're writing. In broad strokes (and in my opinion, of course), if you're doing user applications (either rich client or web), choosing C# is a no-brainer. When you start getting towards lower-level things like services or apps with lots of interop, the decision is less clear.

    So, what do I think you get in C++? A few things spring to mind.

    Absolute control over your memory

    I think you actually need this level of control much less than you may think you need it, but there are times when you do need it.

    Quick boot time

    Spinning up the CLR takes extra time, and it's likely to always take extra time (unless it's already spun up in some other manner). You don't pay this penalty in the C++ world.

    Smaller memory footprint

    Since you aren't loading the CLR infrastructure, you don't pay for its memory footprint.

    Fewer install dependencies

    You can just install and go, and your users don't have to install the proper version of the CLR.

     

    So, that's pretty much what I said in the class. What did I miss?

  • Eric Gunnerson's Compendium

    More on C# and C++

    • 27 Comments

    Thanks for all the comments - I'd like to expand on a few of them.

    Cross-Platform

    When I wrote the last post, I was thinking of cases where you have a choice between languages, so I didn't think of cross-platform support, since if you need to run on a platform where a language isn't present, that pretty much eliminates the language from consideration.

    It's true that C++ is available on far more platforms, and if that's important in your case, C# probably isn't an option for you.

    Templates, template metaprogramming, STL, Boost

    I missed templates as an advantage that C++ currently has, as I forgot that Whidbey isn't there yet for C# programmers. When Whidbey is widespread, C# will have the majority of the features that I'd want related to generic types, though it won't be able to do as much as C++ does.

    In my mind, that's (mostly) a good thing. While there are things that aren't in C# generics that I'd like, I think that, because of the indirection involved, generic types are best enjoyed in moderation, as they're near the limit of what most programmers can easily understand. Which brings us to template metaprogramming. The discussions I've read on this topic list "power" and "optimization" as the big advantages of the technique, and I'd have to agree with that evaluation. But the code that I've looked at makes normal template code look simple and straightforward. So, I'm not sorry that you can't do this with C# generics.
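
    For reference, here's the sort of moderate use I have in mind - a hedged Whidbey-syntax sketch, one type parameter and one constraint, no metaprogramming:

    using System;
    using System.Collections.Generic;

    class GenericsSketch {
       // Assumes a non-empty list; the point is the typed collection and
       // the single constraint, not the algorithm.
       static T Largest<T>(List<T> items) where T : IComparable<T> {
          T largest = items[0];
          foreach (T item in items) {
             if (item.CompareTo(largest) > 0)
                largest = item;
          }
          return largest;
       }
    }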

    Something I do miss is the ability to do Mixins, which would be a nice complement for a language without multiple inheritance. They would be helpful to add in system functionality without burning the base class.

    STL isn't the kind of library that I like to use, as I think it's too baroque. Sure, you can do a *ton* of things with it and easily switch things around, but I've never found that I need to switch things around that often, so it's complexity that I don't use, but still have to deal with. So, for me, no thanks - I'd rather have foreach, which covers about 90% of my loops. Oh, and before I leave this topic, I should mention that the richness of data structures in STL is a lot greater than that in C#, though you should keep your eye out for C5 and PowerCollections when Whidbey shows up.

    In the current C#, foreach only supports one way of iterating. About 3 years ago I wrote an article on some collection wrappers you could use to support other ways of iterating, though at some cost to performance. Unfortunately, I chose to call them "iterators", which, of course, is also the name of a C# 2.0 feature that allows you to make objects iterable more easily and support multiple ways of iterating a collection.
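
    For the curious, here's a hedged sketch of the C# 2.0 feature - yield return lets a class expose its default iteration order plus alternates, each consumable with foreach (Roster is an invented example):

    using System.Collections.Generic;

    class Roster {
       List<string> names = new List<string>();

       // The default iteration order...
       public IEnumerator<string> GetEnumerator() {
          foreach (string name in names)
             yield return name;
       }

       // ...and an alternate one, exposed as a property you can foreach over.
       public IEnumerable<string> Reversed {
          get {
             for (int i = names.Count - 1; i >= 0; i--)
                yield return names[i];
          }
       }
    }

    A caller just writes foreach (string name in roster.Reversed) and never sees the indexing.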

    Boost seems like an obvious C++ advantage, if you're working in an environment where you can use outside libraries.

    Object Lifetime

    There were a lot of comments around deterministic destruction, and there is certainly a big difference between the "programmer owns the allocations" and "the GC owns the allocations" approaches.

    I will admit that when I first started using C#, I missed that feeling that I had full control over what was going on in the system. But over time I found that while I did need to be concerned with scarce resources (db connections, file handles, and other system resources), I didn't really need to spend a lot of attention on memory resources. For scarce resources, "using" works well for me, and I prefer the scoping that "using" gives me over the scope-based lifetime that you get with smart pointer approaches in C++, and I also like that it's more explicit.
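
    Here's roughly what I mean (a sketch - the connection string and table name are made up): the scarce resource gets cleaned up at a spot you can see in the code, exception or not.

    using System;
    using System.Data.SqlClient;

    class UsingSketch {
       static void CountEmployees(string connectionString) {
          // The connection is closed at the end of the block, even if an
          // exception is thrown, and the scope is explicit in the code.
          using (SqlConnection connection = new SqlConnection(connectionString)) {
             connection.Open();
             SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM Employees", connection);
             Console.WriteLine((int) command.ExecuteScalar());
          }
       }
    }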

    This does not mean that you can totally ignore the issues around object allocation in C#, as Rico has said, repeatedly.

    Oh, one other point on object lifetime. Having an environment where there is no automatic scope-based lifetime makes supporting exceptions much cheaper in C#, as there isn't the overhead of tracking which objects are live at any given point, which C++ exception handling requires.

    const

    Which brings us to const. My experience with const is as follows:

    When I used const in my projects, I always ran into situations where I needed a routine that was const to become non-const. That meant either changing that routine - and then updating all of the callers so that they were non-const - or creating non-const versions of existing routines where applicable. Neither of those is a particularly nice and/or fun thing to do, and after trying it for a while, my conclusion was that having const didn't give me enough benefits to make it worth the disadvantages.

    I do agree that const can give some protection against the programmer doing the wrong thing (which, interestingly, is not really in keeping with the general C++ philosophy that programmers should be able to do whatever they want, even if it's wrong (yes, I'm being a bit extreme there)), but since it's merely a convention and not a guarantee (as I can cast const away whenever I want, or just use "mutable"), I don't see a lot of value.

    I've talked to enough people to know that my opinion is not shared by all.

    C# things I missed

    There were a couple of notable things I missed from my C# list.

    Events

    Events are much much more useful than I had originally thought. While you can do a lot of similar things with interfaces, events are great for the sort of loosely-coupled components that I like to create. For me, events are a feature that work exactly the way I want them to.
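
    A small sketch of what I mean by loosely-coupled (Thermostat and Display are invented examples): the publisher raises the event without knowing or caring who is listening.

    using System;

    delegate void TemperatureChangedHandler(double newTemperature);

    class Thermostat {
       public event TemperatureChangedHandler TemperatureChanged;

       public void SetTemperature(double temperature) {
          if (TemperatureChanged != null)
             TemperatureChanged(temperature);
       }
    }

    class Display {
       public void Attach(Thermostat thermostat) {
          // The thermostat never needs a reference back to Display.
          thermostat.TemperatureChanged += new TemperatureChangedHandler(OnChanged);
       }

       void OnChanged(double newTemperature) {
          Console.WriteLine("Now {0} degrees", newTemperature);
       }
    }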

    Data types

    This is a big one that I missed.

    In C#, there is one string type.

    In C++, I'm currently dealing with code that uses:

    • CString (the ATL/WTL type)
    • LPCTSTR
    • LPTSTR
    • WCHAR*
    • TCHAR*
    • BSTR
    • _T("constant")

    and needs to transform strings from one type to another fairly regularly. I also spend time making sure I have the right distinction between byte count and character count when dealing with such types.
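
    In C#, the distinction only shows up when you ask for it explicitly (a small sketch):

    using System;
    using System.Text;

    class StringSketch {
       static void Main() {
          string name = "Jalapeño";                     // always Unicode, always the same type

          int characters = name.Length;                  // character count
          int bytes = Encoding.UTF8.GetByteCount(name);  // byte count, only when you ask for it

          Console.WriteLine("{0} chars, {1} bytes", characters, bytes);
       }
    }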

    Comments?

    If I've missed any that you'd like me to comment on or you have others to talk about, feel free to pile on...

  • Eric Gunnerson's Compendium

    Macros

    • 20 Comments

    In the C# design, there are a number of cases where we decided on more limited functionality than C++ for the sake of simplicity. One of those decisions was the decision to not support macros.

    Thomas wrote that this is something that he really misses.

    I know that I've run across a few cases where macros would have been really convenient...

    But...

    I spent about 4 hours last Friday fighting with some code as part of a restructuring I'm doing, and a lot of that time was spent dealing with the fact that somebody had defined CoCreateInstance to point to a private wrapper around the function. Part of the time was spent finding that out, and the other part in restructuring include files so that my code could build without that support.

    There's a considerable benefit in looking at code and knowing that it means what it says, without any renaming going on under the covers.

    Incidentally, there's also a benefit in compilation model, in that compiling without a preprocessor is simpler and faster.

  • Eric Gunnerson's Compendium

    C#: What's wrong with this code #5 - Discussion

    • 19 Comments

    Thanks for all the comments.

    As a refresher, here's the code that we were looking at:

    enum ResponseType { ReturnValues, InvalidInput }

    string CreateWebText(string userInput, ResponseType operation) {
       switch (operation) {
          case ResponseType.ReturnValues:
             userInput = "<h1>Values</h1>" + FilterOutBadStuff(userInput);
             break;
          case ResponseType.InvalidInput:
             userInput = "<h1>Invalid</h1>" + FilterOutBadStuff(userInput);
             break;
       }
       return userInput;
    }

    First, there were a few comments about not generating HTML yourself, not echoing input back to the user, and not writing something that is already present in ASP.NET.

    Agreed, agreed, agreed. There are lots of attacks that depend on this, and you shouldn't even do it for debugging.

    There was considerable discussion about whether the variable "userInput" should be reused. In most cases, I think that it's best to come up with a new name, but there are cases where it makes more sense to reuse the name. I don't feel strongly about this case.

    Finally, on to the big thing - or what I define as the big thing, at least - the handling of the enum.

    Enums in C# do double duty - they serve both as sets of named values and as bit fields. That means an enum variable can hold any value that is valid for the underlying type. In this case, no underlying type is specified, so it's an int. In other words, you can write:

    ResponseType responseType = (ResponseType) 157;

    and things will work just fine.

    So, we need a way to validate the value (assuming, of course, that we're going to keep this routine. We wouldn't in practice, but humor me...)

    Here's a modification that does that:

    string CreateWebText(string userInput, ResponseType operation) {
       if (!Enum.IsDefined(typeof(ResponseType), operation))
          throw new InvalidArgumentException(...);

       switch (operation) {
          case ResponseType.ReturnValues:
             userInput = "<h1>Values</h1>" + FilterOutBadStuff(userInput);
             break;
          case ResponseType.InvalidInput:
             userInput = "<h1>Invalid</h1>" + FilterOutBadStuff(userInput);
             break;
       }
       return userInput;
    }

    And that works just fine.

    Or does it?

     

  • Eric Gunnerson's Compendium

    Fred and his 5.2 Madone

    • 9 Comments
     

    Last fall, I wrote a post about being an aspiring Fred, where a "Fred" is someone with more bike than rider. Apologies to anybody named "Fred" - I'm not the one who coined the term.

    I had gotten tired of my LeMond Tourmalet, whose decidedly midrange components have seen better days, and whose main claim to fame is "heavy".

    I rode bikes made of steel and titanium, but it was the carbon fiber Trek that I fell in love with. It certainly doesn't have as lively a feel as the metallic bikes that I rode, but it doesn't feel mushy either. Just sort of muted, in marked contrast to the Litespeed TI bike I rode (their entry-level bike, to be fair), which was a bit springy. I was either going to buy the 5000 or the 5200, the chief difference being that the 5200 comes with full Ultegra (Shimano's second-best line of bicycle components), and the 5000 comes with some Ultegra, some 105.

    My first Fred post got a comment from SeanB, where he pointed me to Trek's ProjectOne website. On this site, you can choose a model, a paint job, and then customize parts of the bike. You can choose color, saddle, wheels, component group (to varying degrees based on the bike), seatpost, bars, etc. When you order, you can specify a stem and crank length as well.

    I first test rode the 5200 in 58cm (the frame size), doing some hills outside of Gregg's. I found two things - that I loved the feel, and that the 58 was too small for me. They ordered in a 60, I rode that, and it was pretty much perfect. I decided to go the Project One route, did my customizing, and had them order the bike. As part of their model year switch, Trek is changing their line, and the 5200 is being replaced by the 5.2 Madone, which is just like the 5200, except it has the fin behind the seat post (for improved aero above 23 MPH, I have heard). Oh, and it's about $150 more expensive, though that also corresponds to the change from a 9-speed cassette in the rear to the new Ultegra 10-speed cassette, so it's not just the frame change. Coupled with the triple up front, that gives me 30 different gear combinations (brief aside - you can't really use all 30. First of all, there's lots of overlap, and second, if you ran the large ring on the front and the large on the back, the chain angle could lead to noise, wear, or even chain breakage. But more speeds give you more chances to find the exact one you want).

     

    The paint job is the Project One "Deep South" motif - bright red with some yellow accents. I'd considered getting "Pave Flambe", but it was a little too muted for my taste. Notice the flowing curves of the carbon fiber frame.

    The bike arrived in good condition, except that it came with a carbon seatpost (I wanted aluminum because I sometimes run a seatpost rack, and carbon seatposts don't hold up to that), and Gregg's had put a 2" x 3" silver sticker right on the bottom of the downtube. The seatpost will get re-ordered, and the sticker came off fairly easily.

    Unfortunately, it's been cold and wet here, so I haven't been able to go on a real ride yet, but I did go out on Saturday for 5 or 6 miles. I'm not sure how fast it is, since I don't have a computer yet, but it's definitely faster on the flats, and felt really nice on the one hill that I threw at it.

    Oh, and here's a picture of "Fred" next to his bike. The shorts are normally black/silver, but the retro-reflective fabric really shows up in the flash. Oh, and what were they thinking with the blue bar tape? I can handle the saddle being blue, but blue bars on a red bike?

  • Eric Gunnerson's Compendium

    C#: What's wrong with this code #5

    • 36 Comments

    [Update: I messed up some code. Corrected version now]

    Greetings from the frigid city of Redmond. Keep in mind, of course, that in the Puget Sound region, "frigid" is any temperature below freezing.

    Thanks for all the feedback on previous bad code examples. I've created a new category for these posts, so you can avoid my other posts if you wish.

    This snippet is a routine that is used for web debugging.

    enum ResponseType { ReturnValues, InvalidInput }

    string CreateWebText(string userInput, ResponseType operation) {
       switch (operation) {
          case ResponseType.ReturnValues:
             userInput = "<h1>Values</h1>" + FilterOutBadStuff(userInput);
             break;
          case ResponseType.InvalidInput:
             userInput = "<h1>Invalid</h1>" + FilterOutBadStuff(userInput);
             break;
       }
       return userInput;
    }

    What's wrong with this code?

    Hint: There are both obvious and subtle issues lurking...

  • Eric Gunnerson's Compendium

    Dissecting a C# Application: Inside SharpDevelop

    • 11 Comments
    Dissecting a C# Application: Inside SharpDevelop is free at Apress
  • Eric Gunnerson's Compendium

    Reggae Rock Songs...

    • 17 Comments

    Last winter, I was talking with a friend about music, and he asked if I had any reggae songs. My first response was "no", but after I thought about it, I came up with:

    The Scorpions - Is there anybody there? (Lovedrive)

    Definitely a strong reggae beat there. There were two others in my collection that I've forgotten.

    What ones can you add?

  • Eric Gunnerson's Compendium

    C# Coding Guidelines

    • 4 Comments
    Brad publishes some of the C# coding guidelines that are used inside MS.
  • Eric Gunnerson's Compendium

    New toy...

    • 31 Comments

    I'm going to pick up a new purchase tonight. What will it be???

    What does a south-of-40 guy in Bellevue buy in the dead of winter?

    Any guesses?

  • Eric Gunnerson's Compendium

    The infinite cat project

    • 7 Comments

    Today, somebody sent around a link to a video of cats doing funny things. Then somebody else sent around this link.

    The Infinite Cat Project

  • Eric Gunnerson's Compendium

    Naked in Baghdad

    • 5 Comments

     

    I just finished reading "Naked in Baghdad", written by NPR correspondent Anne Garrels. Anne spent time in Baghdad on and off (the visas were only good for 10 days, so that the Iraqis could charge more money), and was there during the initial attack and when the US troops entered the city.

    It's a really interesting read, and gives some good insight into what life was like in the city before and after the invasion.

  • Eric Gunnerson's Compendium

    True Lies - the GNN Book

    • 4 Comments

    I happened to pick this one up at the library when I was there last.

    (aside - it's pretty weird to be in the library these days. Not only do they sell food and drinks, they work on the honor system, so you check out your books yourself and then just walk out.)

    This book is written by Anthony Lappe and Stephen Marshal of the Guerilla News Network.

    I'm not sure I believe everything in this book, but there is certainly some thought-provoking information at the very least.

     

  • Eric Gunnerson's Compendium

    Anders on TheServerSide.Net

    • 7 Comments

    Anders on TheServerSide.Net

    from Dan...

  • Eric Gunnerson's Compendium

    C# PUM blogs...

    • 5 Comments

    PUM == Product Unit Manager, or the person in charge of a whole group.

    Scott Wiltamuth is the PUM of the C# team, and he's recently started blogging. Please read what he's written and send him some feedback about what you'd like him to write about.

  • Eric Gunnerson's Compendium

    Intelligent Agent Nirvana

    • 13 Comments

    At the dinner last week, we got into an interesting discussion about intelligent agents.

    The discussion is around whether intelligent agents will be able to gather the information that we want to see automatically, or whether human intervention is necessary.

    I should probably note here that I'm simplifying the discussion, to keep things short and to make me look better.

    My assertion is that intelligent agents not only aren't there yet, but are unlikely to be there (wherever "there" may be) in the foreseeable future. Now, in making that pronouncement, I am aware that the track record of people saying that things are impossible - such as the crazy notion of "heavier than air" flight - isn't exactly stellar.

    So, why don't I think that intelligent agents are going to work - at least for me? Well, a few reasons.

    The first is my skepticism around anything that requires AI. Way back in the early 80s, there was lots of press around AI systems, with a ton of money being spent both by DARPA and the Japanese, and no real results. I think that's a good demonstration that "AI is hard", and I don't expect any breakthroughs in that area.

    Another challenge is that I really don't know what I want to know. I continue to find offbeat information in blogs that I didn't know that I wanted to know, so I don't see how I can expect an agent to filter in that manner.

    Finally - and related to the last point - I think that coming up with a categorization system that works well is likely to be very, very difficult. If you've ever fought with Google trying to find the one page in 100,000 on a specific topic that you looked at a few weeks ago, then you understand what I'm talking about.

    I expect that a human filter will remain a necessity to get good information for quite some time.

    So, what do you think? Will agents make human editors obsolete?

    FYI, here's a brief history of AI, and a Wikipedia article.

  • Eric Gunnerson's Compendium

    PC Upgrade

    • 9 Comments

    My daughter has been using our hand-me-down Gateway computer. It's been fine for what she's been doing, but it doesn't have enough graphical oomph to play Zoo Tycoon 2. I thought about just putting a new graphics card in it, but since it's only a 500 MHz cpu, an upgrade seemed to be in order.

    Off to NewEgg. I've built two computers from scratch with their parts, and didn't expect any problems. Here's what I bought:

    • 13-135-160 - ECS K7VTA3 V6.0 motherboard (VIA KT333) - 1 @ $33.00
    • 14-121-506 - ASUS Radeon R9200SE/T/128M TV video card - 1 @ $48.00
    • 19-104-201 - AMD Sempron 1.5GHz CPU (SDA2200DUT3D) - 1 @ $48.00
    • 20-141-302 - Kingston 256MB DDR333 PC-2700 memory - 1 @ $34.23
    • 35-124-003 - Kingwin KCU-7015 CPU fan (Intel/AMD)
    The total was $185 and change, including shipping.

    The Sempron is AMD's replacement for the Duron line. We used the existing disk and case.

    Sam helped a bunch with pulling the old components out of the case and putting the new mobo in. Unfortunately, doing a motherboard swap on a non-generic case is a bit of a pain, as it doesn't have an I/O cutout for the new plate. [Update: David asked whether there was an I/O piece with the mobo. There was, but the Gateway case didn't have the cutout - just a big piece of metal that had the appropriate holes (for the old mobo) in it]. It took me about 30 minutes with the dremel and 6 abrasive blades to cut out the slot, and then the mobo went into the case fine. Hookup was easy, except for the front switches and lights. All the wires were in a single 2x6 connector rather than the separate connectors you get with a generic case. I didn't have any luck relocating pins, nor did I have any spares (connectors I had, but none of the pins that go in them), so I settled for hacking the power switch wires into a ribbon cable connector I had lying around.

    Got that done, hooked up the disks, turned on the power, turned off the power, fixed the master/slave setting on the disks (they were on separate channels on the old computer), rebooted, installed XP, used the board jumpers to set the FSB speed (who uses jumpers for that these days?), got all the SPs and drivers installed, and the thing is up and running quite nicely.

    A pretty nice upgrade for the price.

  • Eric Gunnerson's Compendium

    Another new C# PM blogger

    • 4 Comments
    Raj Pai, my previous boss, has started blogging.
  • Eric Gunnerson's Compendium

    How is the C# team doing?

    • 3 Comments
    Shaykat is working on the C# community effort now, and puts up a great post asking for feedback.
  • Eric Gunnerson's Compendium

    Software Dinner

    • 12 Comments

    I went to the Joel on Software dinner last night at Crossroads mall.

    At least, I think it was for Joel, though it was hard to get close enough to be sure given the mass of people around him. If you were there to listen to Joel, you might have been disappointed as only those near him could hear him.

    People I ran into included Alan (aka Yag), Alex, Dare (aka Carnage4Life), Rory (aka Neopolean), and, of course, Robert (aka "the Scobleizer"). 

    Looks like I need to come up with some sort of alias...

    It was a pretty good event, but about triple the size of most blogger meetups at Crossroads, and it had a weird dynamic because some of the people were bloggers, and some were fans of Joel. Not that there's anything wrong with that, but if you're an active blogger and you talk with people who only read one blog, you probably won't be spending your time talking about blogs.

  • Eric Gunnerson's Compendium

    Rock is dead...

    • 11 Comments

    One likes to believe in the freedom of music
    But glittering prizes and endless compromises
    Shatter the illusions of integrity...

    One can argue that rock and roll has never been pure - that it's always been perverted by the record industry and the radio industry. The early years were marred by payola, and there has always been tension between the desires of management and the on-air staff, but when I was growing up, the on-air staff on rock stations - and typically, some of the management - always kept a certain edge, a certain rawness.

    KISW was the premier rock station in the Seattle area for most of the past 25 years. They competed early with KZOK, but KZOK decided to move to classic rock, and KISW was back on top. Until the mid-80s, when a bunch of staff defected to start up KXRX, which walked all over KISW for a few years and then dissolved.

    So, KISW has been part of my music experience for a long time. I don't listen as much as I used to, but I still listen from time to time. On the way home from the club tonight, I switched on the radio just as American Idiot came on...

    Don't want to be an American idiot.
    Don't want a nation under the new mania.
    And can you hear the sound of hysteria?
    The subliminal mind "bleep" America.

    ???

    Well maybe I'm the "bleep" America.
    I'm not a part of a redneck agenda.
    Now everybody do the propaganda.
    And sing along in the age of paranoia.

    At that point I turned to a different station.

    Rock stations have been pretending that songs don't have dirty words in them for years. That's part of that "edge" that I was talking about before. But apparently those days are gone. The second edit is particularly bothersome. First, the word isn't on the classic list of words you can't use. Second, it's more than a little ironic that they chose this song to censor, given what the song is about.

    Sigh...

    Long live rock, I need it every night,
    Long live rock, come on and join the line,
    Long live rock, be it dead or alive.

  • Eric Gunnerson's Compendium

    Excellent

    • 4 Comments

    Note that you'll have to speak the title using a Jeff Spicoli voice (as performed by Sean Penn in Fast Times...). Or perhaps from Bill and Ted...

    I'm spending the week becoming a more excellent developer, in a five-day course for developers. "Excellence" has been a big hit since Tom Peters' 1982 smash hit In Search of Excellence (which was followed by "A Passion for Excellence" in 1985, and then, finally, by "A messy divorce from Excellence" in late 1989).

    Classes like this can be hit or miss - they'll cover a lot of ground that you've seen before, but this one has a lot of audience participation, and it always brings up a few topics of interest.

    Interestingly, I sat next to a guy that used to work in the same building as I did back at Boeing Computer Services in the late 80's, though we never met.

  • Eric Gunnerson's Compendium

    Wanted: Snow

    • 12 Comments

    Washington ski area looking to purchase 10,000 acre-feet of snow to supplement our meager natural supply. Seller must be willing to provide delivery and installation.

    ----------

    I mean, seriously, can we work some sort of trade here? That two feet of snow that I read about in the paper this morning is just causing problems back east, and it sure would be a good start around here. I'm sure I could get some smoked salmon or Geoduck for you...

    This is by far the worst ski year I can remember. Some of our ski areas saw 5 or 6 inches of warm rain last Monday, and the majority of our ski areas are now closed (Crystal is pretending to be open, but one lift doesn't count in my book. Even Mt. Baker, the North American snowfall record holder at 1140 inches in a year, is closed).

    The poor snowpack is also of concern for the coming summer, and we will probably be under water rationing. Most of the municipal water supplies in western Washington depend on snowmelt to fill our reservoirs, and our snowpack is currently at 16% of average.

  • Eric Gunnerson's Compendium

    What to study in College...

    • 3 Comments
    You're probably already reading this, but Joel has a great post on what to study if you're in college...
  • Eric Gunnerson's Compendium

    Weird fact of the day...

    • 3 Comments

    There are over 19,000 articles in the Esperanto edition of Wikipedia.
