The size of the subcube used in a deviation detection definition affects how much computing resource your server needs to complete a definition run. Definitions with very large subcubes can take several hours to complete, so check the size of your cubes before you run a deviation detection definition.
To keep long runs from competing with daytime workloads, you can schedule definition runs after regular business hours by using the omrundef program that is provided with OLAP Miner. See Mining run program for information about scheduling definition runs.
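On UNIX systems, one way to run a definition after hours is a cron entry that invokes omrundef. The installation path, the definition name, and the omrundef arguments shown here are illustrative assumptions, not the documented syntax; see Mining run program for the actual parameters.

```shell
# Crontab sketch: run a deviation detection definition at 11 PM, Monday-Friday.
# Path, definition name, and omrundef options are assumptions for illustration.
0 23 * * 1-5 /usr/lpp/olapminer/bin/omrundef MyDeviationDef >> /var/log/omrundef.log 2>&1
```

Appending output to a log file lets you review the run the next morning without keeping a session open.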
Use OLAP Miner on a computer with a configuration that is similar to that of the OLAP server component.
Ensure that the OLAP server component is tuned for optimum performance. See the OLAP Setup and User's Guide for information about tuning DB2 OLAP Server.
Set the retrieval buffer size to 2 KB, which is lower than the OLAP server default setting, for each database that you want to mine. You must set the retrieval buffer size for each cube separately; it cannot be set globally. This setting can significantly reduce the computing time for deviation detection definitions. For information about setting the retrieval buffer size, see the OLAP Database Administrator's Guide, Volume 2, Version 7.
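As a sketch of what this per-database change can look like, the following uses a MaxL-style statement run through the MaxL shell. Whether MaxL is available depends on your server release, and the server name, credentials, and the Sample.Basic application and database names are assumptions; consult the OLAP Database Administrator's Guide for the procedure that applies to your installation.

```shell
# Sketch: set the retrieval buffer to 2 KB for one cube via the MaxL shell.
# Repeat the alter statement for every cube you plan to mine --
# the setting is per database and cannot be applied globally.
# Server name, login, and Sample.Basic are placeholder assumptions.
essmsh <<'EOF'
login admin password on olapserver;
alter database Sample.Basic set retrieve_buffer_size 2;
logout;
EOF
```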