March, 2006

Posts
  • Michaeljon Miller

    Things to do to the CRM RSS connector

    • 2 Comments

    There are two things that need to get done with the RSS connector, and soon. First, I keep seeing issues with the connector's default server location. I keep forgetting that CRM installs itself at a non-standard port (5555) on SBS, which means the 'localhost' default in the service and metadata proxies isn't going to work there.

     

    Here's my recommendation, which I'll try to get into an updated RSS package sometime soon. First, edit the web.config file and add a new key to the appSettings section:

     

    <appSettings>
       <add key="CRMServerLocation" value="servername:portnumber" />
    </appSettings>

     

    Then, modify the constructors for CrmService and MetadataService to use the configuration setting instead of the hard-coded value.

     

    public CrmService() {
        string server = System.Configuration.ConfigurationSettings.AppSettings["CRMServerLocation"];

        if (server == null || server == "")
            server = "localhost";

        this.Url = "http://" + server + "/mscrmservices/2006/crmservice.asmx";
    }

    public MetadataService() {
        string server = System.Configuration.ConfigurationSettings.AppSettings["CRMServerLocation"];

        if (server == null || server == "")
            server = "localhost";

        this.Url = "http://" + server + "/mscrmservices/2006/metadataservice.asmx";
    }

     

    The next thing that needs to happen, though it isn't as critical yet, is changing the way the feed list is loaded. Currently, the generator loops over all the configured entities and, for each one, asks the platform for all configured and available queries. That's brutal; on fast hardware it doesn't appear to be a problem, but the cost is still there.

     

    The solution here is to run a single query, once, over all the saved queries and parse the results into a per-entity list. This would seriously cut down the database hits and cross-network traffic. I was working on this just before we released the code, but I hadn't written it in a way that was understandable.
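
    If you want to try it, here's a minimal sketch of the shape I had in mind. It assumes the connector's CrmService proxy, the 3.0 resultset/result shape, and System.Xml and System.Collections; savedquery and its attribute names are from memory, so verify them against your metadata (user queries would get the same treatment through their own entity).

    Hashtable queriesByEntity = new Hashtable();

    // One round trip for all system queries instead of one per entity type.
    string fetchXml =
        "<fetch mapping='logical'>" +
          "<entity name='savedquery'>" +
            "<attribute name='name' />" +
            "<attribute name='returnedtypecode' />" +
            "<attribute name='fetchxml' />" +
          "</entity>" +
        "</fetch>";

    XmlDocument results = new XmlDocument();
    results.LoadXml(new CrmService().Fetch(fetchXml));

    // Bucket each query by the entity it returns so that building the
    // feed list becomes a simple lookup instead of another platform call.
    foreach (XmlNode result in results.SelectNodes("//result")) {
        string entity = result.SelectSingleNode("returnedtypecode").InnerText;
        if (queriesByEntity[entity] == null)
            queriesByEntity[entity] = new ArrayList();
        ((ArrayList)queriesByEntity[entity]).Add(result);
    }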

     

    If anyone makes this change, post the code somewhere or mail it to me. I'll do what I can to get the changes into the next connector update.

  • Michaeljon Miller

    When Outside isn't really that far away

    • 0 Comments

    Seems that my previous post about going outside was taken a little more literally than I had expected. As many of you might have learned over the last few days at Convergence, I haven't left Microsoft. I have left the CRM team after many, many years, but I'm staying inside of MBS.

     

    If you met with Steve Silverberg at Convergence then you have an idea of the kinds of things I'm working on. If you didn't meet with him, or haven't heard from another partner, then hang in there just a bit longer. I will announce what we're working on and talk about how partners and customers can get involved.

  • Michaeljon Miller

    Announcing the RSS connector for MS-CRM 3.0

    • 6 Comments

    We've finally released the RSS connector for MS-CRM. I've mentioned this tool a few times. It has been a long release, mired in a few documentation, legal, and technical issues. But that's not your concern; you probably just want to download this thing, install it, and make things happen. Well, here's the backgrounder on what makes the connector tick. The MSDN article that comes with the connector download covers the basic information. Look for a longer whitepaper from our UE team in the next few weeks.

     

    We want to hear how you've used, modified, or extended the connector. If you use it, let me know. This is the last bit of code that I built for the CRM team (well, it's the last bit that I released; I was still working on the add-entity framework and address book right up to RTM) and I'd like to see what happens with it. (And no, this isn't typical of the code quality I usually write. This was a prototype first and a public release second; if we had set out to release it as a product it would be very different.) We're releasing this under a different model (see the EULA) and we're very interested in its life once it leaves here.

     

    Basics of the connector

    The RSS connector for CRM is built on top of the advanced find and web service Fetch functionality. For the most part it directly executes the requested query and returns the results as RSS-formatted XML. However, a few changes are made to the base query (if you're wondering, all queries are stored as serialized <fetch> requests, which means the connector gets to mess with XML).
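
    If you've never looked at one, a stored query is just a little XML document along these lines (this one is illustrative, not pulled from a real system):

    <fetch mapping="logical">
       <entity name="account">
          <attribute name="name" />
          <attribute name="modifiedon" />
          <filter>
             <condition attribute="statecode" operator="eq" value="0" />
          </filter>
       </entity>
    </fetch>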

     

    The first thing the connector does is load the actual <fetch> definition for the requested user or system query. Next, it creates an array of the columns specified in the query's grid; these are used to specify the simple list extension attributes for IE7 sorting and grouping. The one change from the grid columns is that the connector adds the modifiedon attribute if it's part of the underlying entity's definition.

     

    Once the connector has cached away the display attributes, it modifies the in-memory copy of the query so that all attributes are available. The query definition also has all of its selection criteria removed, so that the feed data is as broad as possible given the caller's security attributes.
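
    In code, the rewrite amounts to a little XML surgery on that in-memory copy. Something like this sketch, assuming the query has been loaded into an XmlDocument called fetchDoc (the shipped connector differs in its details):

    XmlNode entityNode = fetchDoc.SelectSingleNode("/fetch/entity");

    // Snapshot the column list and selection criteria so we can remove
    // them safely while iterating.
    ArrayList toRemove = new ArrayList();
    foreach (XmlNode node in entityNode.SelectNodes("attribute | filter"))
        toRemove.Add(node);
    foreach (XmlNode node in toRemove)
        entityNode.RemoveChild(node);

    // Then ask for everything the caller is allowed to see.
    entityNode.AppendChild(fetchDoc.CreateElement("all-attributes"));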

     

    User queries vs. system queries

    Under the covers, system queries and user queries are structurally the same. They are stored in different tables in the database (don't ask; it was a decision that couldn't be undone by the time it was noticed), but they have the same columns and the same semantics. The primary difference is the security model: system queries are effectively public and user queries are effectively private. A secondary difference is that they had different APIs during the TAP and alpha releases, because they just happen to be written in two different languages (again, don't ask).

     

    This has the nice side-effect that the RSS connector simply reads the query definition through the appropriate entity. It's really just a <fetch> with different entity names and some minor decorations.

     

    When the RSS connector displays the nice HTML-based list of available feeds, it does so in two sections: the top contains user-specific feeds and the bottom contains all accessible system feeds. The connector also generates an HTML <link> element for each user query. This tells RSS-aware browsers and readers that there are feeds published on the page. The connector only does this for user queries; otherwise the list would be unreasonably long.
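
    Those <link> elements follow the standard feed auto-discovery convention. The title and href below are placeholders, not the connector's actual URL shape:

    <link rel="alternate" type="application/rss+xml"
          title="My Active Accounts" href="http://crmserver:5555/..." />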

     

    How the connector selects attributes for display

    Because the connector modifies the query definition to force selection of all attributes (this isn't done just for display; it also supports instance delivery, but more on that in a minute), it's able to present a rich view of the instance data to the RSS aggregator. In WriteItemData the connector loops over the entity's attribute list in a semi-intelligent order: it writes the primary field, any audit attributes, any state or status attributes, ownership data, any "description" attributes, and then all the rest. Nearly all attributes retrieved from the platform are displayed; the few that have no public view and no reasonable display label are skipped.

     

    Riding on coattails - using the list extensions

    The connector uses the query's grid columns to select the set of attributes used in the list extensions elements. Technically this is a de-selection, because the connector rewrites the query to remove the attribute list and add an <all-attributes> clause to the <fetch>. In the foreach loop in WriteCrmAttributes there's a check to see whether the "current" attribute is in the cell list; if it's not, it's skipped. The data is written in a manner that makes the list extension display useful to the user and executable by the extension processor. That is, all "codes" and other internal details are tossed away and nice display values are used instead.
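
    The check is about as simple as it sounds. Roughly this, where cellNames stands in for the cached grid-column list and WriteListExtensionValue is a hypothetical helper, not the connector's actual method:

    foreach (string attributeName in allAttributeNames) {
        // De-selection: attributes that aren't in the query's grid don't
        // participate in the list extensions output.
        if (!cellNames.Contains(attributeName))
            continue;

        WriteListExtensionValue(writer, attributeName);
    }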

     

    The lightweight metadata cache and the service proxy

    Two things held up an earlier release of the connector: technologies that I used to make it happen but which aren't supported outside the CRM team. These are the 1.2 COM proxy (which is finally gone in the upcoming CRM release - I hated that thing because I had to code it over my wife's birthday a few years back, and that got me in a lot of trouble) and the internal metadata cache assemblies.

     

    I didn't want to freak out our development team, so I had to use the same metadata interfaces that everyone else uses. The problem is that the MD web service delivers too much data, too slowly, for my needs (speaking of slow, one optimization I'd like to see in the connector is to read the queries in one batch instead of on a per-entity-type basis). To get around the metadata problem I rolled a very lightweight, purpose-built cache that uses the web service to read the metadata and keeps it around in a static. There are a ton of problems with this approach: there's another copy of the cache floating around when CRM is already memory-hungry, and the cache isn't aware of customization changes (i.e. Publish), so it can get out of sync. I didn't consider either of these a show-stopper for this add-on, but the PM in charge of programmability does, and he's doing something about it for V.next.
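
    The shape of that cache is roughly what you'd expect. Here's a sketch; the Metadata type and the RetrieveMetadata call stand in for whatever your metadata proxy actually exposes, so treat the names as placeholders:

    public class LightweightMetadataCache {
        // One copy per W3WP process; cheap to read, expensive to build.
        private static volatile Metadata cache;
        private static object cacheLock = new object();

        public static Metadata Current {
            get {
                if (cache == null) {
                    lock (cacheLock) {
                        if (cache == null) {
                            // One big read through the public web service.
                            // Nothing here notices Publish, so the cache
                            // can go stale until the process recycles.
                            cache = new MetadataService().RetrieveMetadata();
                        }
                    }
                }
                return cache;
            }
        }
    }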

     

    One thing I did that might be a little surprising: I asked for the WSDL and then hand-edited it down to its absolute basic bits for this solution. I didn't want a 650Kb proxy loaded into the connector, and I didn't want the connector to pay the late-compilation and reflection hit when W3WP loaded the proxy. The connector only uses the Fetch method and the SOAP header, which means I was able to strip out all the other types and method definitions (sorry Kevin and Arash). And no, I'm not using this as an apology for the web service shape in V3; it's a good thing. I didn't have to do the same thing with the metadata proxy because it's fairly small and the connector needs a lot of the definitions from it.
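
    For a sense of scale, the trimmed proxy boils down to something like this. The SOAP action, namespace, and header type below are from memory, so treat them as placeholders rather than the shipped file:

    public class CallerId : System.Web.Services.Protocols.SoapHeader {
        public Guid CallerGuid;
    }

    public class CrmService : System.Web.Services.Protocols.SoapHttpClientProtocol {
        // The one SOAP header the connector carries.
        public CallerId CallerIdValue;

        [System.Web.Services.Protocols.SoapHeader("CallerIdValue")]
        [System.Web.Services.Protocols.SoapDocumentMethod(
            "http://schemas.microsoft.com/crm/2006/WebServices/Fetch")]
        public string Fetch(string fetchXml) {
            // The only method left standing from the 650Kb generated proxy.
            return (string)this.Invoke("Fetch", new object[] { fetchXml })[0];
        }
    }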

     

    If you've gotten this far, make sure you read my entry on using the offline client hosting process, otherwise known as Cassini. I used the connector to verify that I could make the offline client web services work.

     

    Optional non-IE7 "list extension" behavior

    When the RSS feeds are displayed to the user, there are two RSS icons and a text-based hyperlink. The two icons represent the "simple" RSS feed and the RSS feed with the complete instance data. The text link shows a down-level representation of the IE7 RSS viewer. That bit of code is a very early prototype put together by the IE and RSS teams to show what the IE7 experience might look like. I lifted the code from those teams for the PDC demo and just never got around to removing it. Someone better versed in cross-browser AJAX might be able to make it work in other browsers. For now, the link can be ignored (I'd recommend replacing it with the "real" RSS link and letting the browser figure it out).

     

    Delivering a complete CRM instance in the <item> data

    The RSS connector has the capability to deliver, as part of the item data, the XML-serialized representation of a complete entity instance. It does this to enable a set of scenarios supported by Really Simple Sharing and by some hub-and-spoke delivery models that we're looking at. When this option is enabled, the <channel> element contains the underlying entity's XSD (generated by a different process than the WSDL uses). When a smart RSS aggregator loads a feed with the CRM namespace, it knows that the entity definition and entire entity instances are available to it. This means you can tunnel selected CRM instance data over RSS without exposing the CRM web services; RSS provides the pipe through which the data moves. We've come up with dozens of applications for this delivery mechanism and will start building some software based on this model over the summer. (This is the project that I've left the CRM team to work on, and I couldn't be more excited about it.)
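
    On the wire the result looks roughly like this. The namespace URI and element shapes are placeholders for illustration, not the connector's actual output:

    <item>
       <title>Contoso Ltd.</title>
       <crm:account xmlns:crm="http://.../crm">
          <crm:name>Contoso Ltd.</crm:name>
          <crm:accountid>...</crm:accountid>
       </crm:account>
    </item>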

     

    Wrapping things up

    The rest of the code is just infrastructure used to turn CRM data into RSS. It's missing support for HTTP 304 and ETags; I'm hoping that someone will add that and drop me an update so I can reverse-integrate it into the code. I'm assuming that the connector will fall under the "unsupported sample code" umbrella, which means there isn't a formal support infrastructure in place for it. However, if you post a comment to this entry, I'll see that it gets to someone on the CRM team.
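
    If you're that someone, here's one possible shape for the conditional-GET support, using plain ASP.NET and assuming lastModified holds the feed's newest modifiedon value. The ETag scheme is my suggestion, not anything already in the code:

    // Hypothetical scheme: derive the ETag from the feed's newest
    // modifiedon value so it changes exactly when the data does.
    string etag = "\"" + lastModified.Ticks + "\"";

    if (Request.Headers["If-None-Match"] == etag) {
        // Nothing has changed since the aggregator's last poll.
        Response.StatusCode = 304;
        Response.SuppressContent = true;
        return;
    }

    Response.AppendHeader("ETag", etag);
    // ... write the feed as usual ...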

     

    Building and installing the connector is easy. I'm assuming that the MSDN document talks about this, but if it doesn't, here's the short and sweet version. With the connector code is a small CMD script that, if executed from a VS2003 command window, will compile the assembly and copy it to the bin directory. My demo installation uses the ISV extensions to add a "Web feeds" item to the menu, which points at the RSS feed display and a convenient OPML page. There's a 16x16 PNG file that fits nicely in the menu and just happens to match the IE7 and Firefox RSS icons.

     

    More things to read

    RSS and CRM - a little history

    Where is the RSS connector for CRM 3.0

    “Democratizing” Business Logic and Data

    Simple List Extensions

    Really Simple Sharing

    Using the CRM SDK offline

    Microsoft Dynamics CRM RSS Connector

  • Michaeljon Miller

    Inside MS-CRM goes Outside

    • 4 Comments

    It all started with an email to a few guys working on a replacement lead management solution for MSN. The point of that email was that we could change the way software was built and create a new model for our partners around linking software to services in the clouds. Wow, now that I look back on it, that seems like a long time ago. Funny thing is that reading that email today brings back lots of memories of being very excited about being on the brink of something huge. When I read that email again last week while dusting off my office I realized that the excitement is still there. It's just shifted a bit for me.

     

    The MS-CRM team has grown and changed over the last (nearly) seven years and I'm glad I was a part of it. I think the team set out to build something and after a few fits and starts actually outdid itself. We learned a lot on the way - both good and bad. I've grown and changed a lot over those same years. I've filled three roles on the CRM team: architect, developer, and overall pain in the butt. To any of those folks who dealt with me while I was on a rampage I apologize.

     

    My decision to leave the team really didn't come as easily as a lot of people might think. There's a lot of cool work to be done on the product and I wanted to be part of that. However, I leave the product in very capable and caring hands. I trust them to do the right thing and I trust that they'll probably bounce their ideas off me on occasion just to see what the old guy says.

     

    You know, it's kind of funny. I actually thought that MS-CRM would be the last team I'd work on at Microsoft. I really believe that the product has a future and I think the environment in which it sits today will start to adopt some of the principles that we put into the product. There were a few times where I figured this would be my last Microsoft job because I wasn't going to find anything else cool to work on. Yeah, I know, it sounds weird what with all the things that Microsoft does, but I couldn't find anything else I wanted to work on.

     

    For the foreseeable future… or the next year, whichever comes first… I'm going to be working on hybrid line of business applications. One of the things I'd like to do is go back to the vision in that original email and see if we can tie all the goodness that is MS-CRM with a bunch more goodness in the clouds. So, I guess I'm going to start looking at MS-CRM as an ISV… from the Outside.

     

    This is going to be pretty damned cool.

  • Michaeljon Miller

    Using the CRM SDK offline

    • 12 Comments

    I've been meaning to write something about using the CRM SDK in an offline state, and I've been meaning to write it for a few years now. I guess I never had the right prodding, but recent newsgroup posts show that there are people interested in this, and that they're stuck.

     

    So, I started doing a little playing around to see what might happen. The first thing I noticed is that, as expected, if the client isn't in an offline state you can't work with the local web server. There's code deep in the platform security layer that flat-out stops the calls. OK, that's easy enough to deal with - let's put the client in an offline state for a while and see what breaks next.

     

    I needed an "application" to test with, and I just happened to have my RSS feed generator bits handy and hot off the press. They're really simple and use a very narrow set of CRM SWS (what we call the web service) methods. In fact, the generator only uses Fetch() to do all of its magic (oh yeah, and it uses a ton of metadata, but that's another posting). Well, as many of you have noticed, you can't get reasonable WSDL from the offline SWS, because the module that generates our WSDL (which happens dynamically, if you're wondering) isn't on the client. There's just no need for it there.

     

    I pulled the WSDL from the server endpoint and hand-tweaked it so it had just the API set that I needed. This isn't strictly necessary, but given the size of the generated code and the number of classes, there's a significant start-up performance hit as the CLR reflects over all those types. Anyway, all I needed was Fetch(), so I removed everything else and compiled the resulting CS file into a client proxy assembly.
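
    For reference, the round trip is just the standard .NET 1.1 tooling; the server name and port here are placeholders:

    wsdl.exe /out:CrmSdk.cs http://crmserver:5555/mscrmservices/2006/crmservice.asmx?wsdl
    rem ... hand-edit CrmSdk.cs down to Fetch() and the SOAP header ...
    csc /target:library /out:CrmSdk.dll CrmSdk.cs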

     

    After installing everything I thought I'd need to run my application offline, I noticed that there was a problem hitting the SWS in Cassini, particularly around executing queries. The thing to remember in this case is that queries are old V1.x functionality and are implemented in native C++. That means the SWS needs its own proxy to get at those C++ bits. That's where the COM proxy comes in (warning: the COM proxy has already been removed from the next release's build environment, so don't assume you can use it in any supported way for anything).

     

    You might have noticed that the COM proxy isn't anywhere on the client machine (although there is another, client-specific COM proxy, but that's not the one we want for this exercise). Go to your install CD, or grovel the COM proxy from somewhere on your server, and copy it to the res/web/bin directory on the client. Then, and this is important, GAC it so it's accessible from the Cassini process.
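
    From a command window that amounts to something like the following; the assembly file name below is a placeholder, so use the COM proxy's real file name from your server:

    copy ComProxy.dll "<client install>\res\web\bin"
    gacutil /i "<client install>\res\web\bin\ComProxy.dll"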

     

    That's all I needed to do to get arbitrary query support in a custom application on an offline client. I haven't expanded to arbitrary reads through other messages, but I'm assuming that they should all work. I also haven't done anything with create / update / delete yet, because those requests must end up in the playback queue and the COM proxy doesn't do this work. If I remember correctly, this happens somewhere in the RC proxy or in Cassini itself (it would make the most sense for this to work as an HTTP handler inside Cassini, since we want to capture SOAP requests for later playback).

     

    Anyway, I hope that unblocks a few creative people and gets them moving in a direction that helps. I'd love to start seeing some add-on code running in an offline state. Granted, things like callouts and workflow won't work offline, so don't even bother trying to make them work.

     

    If anyone comes up with a cool offline add-on I'd like to hear about it.
