Cube Synchronization through Analysis Services DDL Tasks in SSIS
http://blog.decentrix.com/cube-synchronization-through-analysis-services-ddl-tasks-in-ssis/
One of the most common things we do in the Business Intelligence (BI) world is to process and synchronize OLAP cubes. There are several methods for synchronizing, each with its own benefits, but we normally take the XMLA script generated by the Synchronize wizard and run it in an SSIS package called by a scheduled job. Recently, a sync job on a client's server started failing because of a lack of disk space.

The cube on the server was 220 GB, and there was about 200 GB of free space on the disk. The problem is that during synchronization, all of the cube files from the source cube are copied to the secondary server so they can be compared, deleted, and merged, leaving a final set of files representing the synced version of the cube. This can be problematic because the method requires enough disk space to hold two complete cubes (440 GB in our case). While this is not much space in the grand scheme of Business Intelligence, we were dealing with older hardware that was used for production reporting and couldn't wait for additional disk to be added.
Our solution was to drop the Analysis Services database prior to syncing so that we would only have one set of files to work with. According to the Analysis Services Synchronization Best Practices article from SQLCAT, some tests showed that syncing was actually faster when the cube was dropped first. Dropping the database was also attractive because we had been dealing with locking issues during the normal sync: users running queries through Excel workbooks would lock the database and prevent the sync from finishing, in some cases for up to 4 hours.
To test this scenario, we set up an SSIS package with two Analysis Services Execute DDL tasks and one data connection.


Drop Database Task:
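The original task configuration screenshot is not reproduced here, but the XMLA for this step looks roughly like the sketch below. The database ID `AdventureWorksDW` is a placeholder; substitute the ID of your own Analysis Services database.

```xml
<!-- Sketch of a Delete command for an Analysis Services Execute DDL task.
     "AdventureWorksDW" is a placeholder database ID. -->
<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>AdventureWorksDW</DatabaseID>
  </Object>
</Delete>
```

Note that `DatabaseID` is the object's ID, which can differ from its display name if the database has been renamed.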

Sync Cube Task:
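Again, the screenshot is omitted, but a Synchronize script generated by the wizard looks roughly like this sketch. The source server name and database ID are placeholders, and the security and compression options shown are the wizard defaults rather than required values.

```xml
<!-- Sketch of a Synchronize command; "SourceServer" and
     "AdventureWorksDW" are placeholders. -->
<Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=SourceServer;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>AdventureWorksDW</DatabaseID>
    </Object>
  </Source>
  <SynchronizeSecurity>SkipMembership</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>
```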

The first step executed successfully, but the Sync Cube step failed with an error.

A quick internet search turned up quite a few forum threads where people were hitting the same problem with no apparent solution: their DDL tasks failed when the target database did not exist. We decided to try modifying the data connection, because the error actually makes sense: the connection manager can't find the database, since we just dropped it in the previous step. We modified the connection to point at the entire instance by removing the Initial Catalog, tested the connection, and everything looked good.
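For illustration, the change amounts to removing the Initial Catalog property from the connection manager's connection string, so the connection targets the instance rather than a specific database. The server and database names below are placeholders:

```xml
<!-- Before: connection bound to a specific database -->
<!-- Data Source=TargetServer;Provider=MSOLAP;Initial Catalog=AdventureWorksDW; -->

<!-- After: connection bound to the instance only -->
<!-- Data Source=TargetServer;Provider=MSOLAP; -->
```

With no Initial Catalog, the connection validates even while the database is absent between the drop and the sync.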

We executed the package and this time, after about an hour, we had a successful cube sync.

