This will be the first of a two-part blog post about real-world adventures in BAM deployment. As time has proven over and over, permissions and environment variables tend to trip up even the most careful planners when deploying artifacts to non-development environments.

I developed a BAM solution for a customer that has been deployed to production for the past four months, with several updates to the activity definition along the way. Until recently, though, none of my views had included an OLAP cube. When a requirement finally came in for such a cube, I created it, deployed it in my local environment, and everything looked great. However, when it came time to promote it to the next environment on the way to prod, the bm update-all command failed with:
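For context, the deployment step in question was the standard BAM management utility (bm.exe) command, along these lines. The definition file name here is just a placeholder for your own exported activity/view definition:

```bat
rem Run from the BizTalk Tracking folder, typically something like
rem "C:\Program Files (x86)\Microsoft BizTalk Server\Tracking"
bm.exe update-all -DefinitionFile:MyActivityDefinition.xml
```

update-all deploys any new artifacts in the definition file and updates existing ones in place, which is why it is the natural command to run when promoting an updated definition between environments.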

"OLE DB error:  OLE DB or ODBC error:  Cannot open database "BAMStarSchema" requested by the login.  The login failed.; 42000"

"Errors in the high-level relational engine.  A connection could not be made to the data source with the DataSourceID of 'bam_%viewname%', Name of 'bam_%viewname%'."

These were followed by many other errors about "Errors in the OLAP storage engine" while it tried to configure the various dimensions that made up my cube.

What I figured out was that our operations folks had (correctly) configured the Analysis Services instance to run under a separate service account, i.e. one different from the service account running the SQL Server instance. For the Analysis Services instance to talk back to the BAM databases, that service account needs permissions on those databases. So I created a script that added the Analysis Services service account as a login on the SQL instance. Then, for each of the BAMPrimaryImport, BAMArchive, and BAMStarSchema databases, it created a user for that login and added the user to the db_owner role for the database. Once the Analysis Services service account had permissions to these databases, voila - the update-all command ran successfully!
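For reference, the script looked roughly like the following. This is a sketch rather than the exact script I ran: `DOMAIN\SSASSvc` is a placeholder for your Analysis Services service account, and I'm using sp_addrolemember here since the newer ALTER ROLE ... ADD MEMBER syntax only exists on SQL Server 2012 and later:

```sql
-- Placeholder: replace DOMAIN\SSASSvc with your Analysis Services
-- service account before running.
USE [master];
GO
CREATE LOGIN [DOMAIN\SSASSvc] FROM WINDOWS;
GO

-- Grant the login db_owner on each of the three BAM databases.
USE [BAMPrimaryImport];
GO
CREATE USER [DOMAIN\SSASSvc] FOR LOGIN [DOMAIN\SSASSvc];
EXEC sp_addrolemember 'db_owner', 'DOMAIN\SSASSvc';
GO

USE [BAMArchive];
GO
CREATE USER [DOMAIN\SSASSvc] FOR LOGIN [DOMAIN\SSASSvc];
EXEC sp_addrolemember 'db_owner', 'DOMAIN\SSASSvc';
GO

USE [BAMStarSchema];
GO
CREATE USER [DOMAIN\SSASSvc] FOR LOGIN [DOMAIN\SSASSvc];
EXEC sp_addrolemember 'db_owner', 'DOMAIN\SSASSvc';
GO
```

db_owner is what I used to get unblocked; if your security policy demands least privilege, a tighter set of database permissions may well suffice, but that's worth testing in a lower environment first.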