Recently had an issue when processing dimensions in BPC. The process would complete without errors, but the OK button was disabled. The only way to exit was killing the application. Turns out the recycle limit on the DefaultAppPool was causing this.
Even though the memory limit was set so the app pool would only recycle at 60% of server memory, and even though memory usage on the server never got close to 60%, the process wouldn't finish. I turned off recycling based on memory usage and processing completed fine.
Thanks to Gert Andries van den Berg on the SAP BPC forums for making sense of all this. In a nutshell, writing records to either the WB or FAC2 table updates the cube; there's no need to build an SSIS package with a DumpLoad task, since it can be done with plain SQL. The FAC2 table seems to be the destination of choice, though I've yet to determine whether running an optimization is required before the data appears in reports.
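As a sketch, a direct SQL write might look like the following. All table and column names here are assumptions for an application named Finance: BPC generates its fact tables per application, so check your own database for the actual table name and dimension columns before trying this.

```sql
-- Hypothetical sketch: the table name and dimension columns are assumptions.
-- Verify against your application's actual FAC2 table before running.
INSERT INTO dbo.tblFAC2Finance
    ([ACCOUNT], [CATEGORY], [ENTITY], [TIME], [SIGNEDDATA])
VALUES
    ('Revenue', 'ACTUAL', 'US01', '2009.OCT', 1000.00);
```

Keep in mind that FAC2 backs a MOLAP partition, so rows written this way will not show up in the cube until that partition is processed; rows written to the WB table (a ROLAP partition) appear immediately.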
From Gert Andries van den Berg.
As per the tuning doc:
WB – real time data input (ROLAP partition)
This is the most current data sent to the system. Data submitted via BPC for Excel sends and Investigator browser sends is placed in real-time storage.
FAC2 – short term and Data Manager imports (MOLAP partition)
This is data that is not real-time data, but is also not in long-term storage yet. When you load data via Data Manager (automatic data load from external data sources), it loads the data to short-term storage so that the loaded data does not affect system performance. Only the cube partition associated with this table is processed, so the system is not taken offline.
Fact – long term history (MOLAP partition)
This is the main data storage. All data eventually resides in long-term storage. Data that is not accessed very often remains in long-term storage so that the system maintains performance.
This structure allows SAP BPC to maintain the same performance over time even when there is a large increase in data volumes.
Periodically clearing real-time data greatly improves the performance of the system, and an "Optimization" process is required for this (it can be scheduled automatically based on given parameters, such as a record-count threshold).
Lite Optimization:
Clears real-time data storage (WRITEBACK) and moves it to short-term data storage (FAC2). This option doesn't take the system offline and can be scheduled during normal business activity.
Incremental Optimization:
Clears both real-time and short-term data storage (WB and FAC2) and moves both to long-term data storage (FACT). This option does not take the system offline, but it is best run during off-peak periods of activity.
Full Process Optimization:
Clears both real-time and short-term data storage and processes the dimensions.
This option takes the system offline and takes longer to run than the incremental optimization.
It is best run scheduled at down-time periods – for example after a month-end close.
The Compress Database option is available to rationalize the Fact Tables. “Compress” sums multiple entries for the same CurrentView into one entry so that data storage space is minimized. Compressed databases also process more quickly.
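Conceptually, Compress behaves like a GROUP BY over a fact table: rows that share the same dimension members are summed into a single row. The query below is only an illustration of the idea, not what BPC actually executes, and the table and column names are placeholders:

```sql
-- Illustration only: collapse duplicate CurrentView rows into one summed row.
SELECT [ACCOUNT], [CATEGORY], [ENTITY], [TIME],
       SUM([SIGNEDDATA]) AS [SIGNEDDATA]
FROM   dbo.tblFactFinance
GROUP BY [ACCOUNT], [CATEGORY], [ENTITY], [TIME];
```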
More info on this topic from Sorin Radulescu:
First you have to be aware about structure of BPC cubes:
Each cube has 3 partitions:
1. fact – MOLAP
2. fac2 – MOLAP
3. WB – ROLAP
When you insert records into the WB table, because WB is a ROLAP partition, you see the impact of that insert in the cube in real time.
If you insert records into either of the MOLAP partitions without processing the partition, you will not see those records in the cube.
I think you now have a clear picture of the BPC cube and understand the difference between MOLAP and ROLAP partitions.
Lite Optimize is necessary just to keep the number of records in the WB table under control.
In SSAS, if a ROLAP partition has more than 100,000 records, retrieving data from the cube becomes very slow, especially while users are inserting into the WB table at the same time.
So Lite Optimize is usually scheduled every 15 minutes with a threshold of 20,000 records.
That means every 15 minutes the DTSX package checks whether the WB table has more than 20,000 records.
If it does, it runs the optimization; if not, it does nothing.
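The scheduled check described above amounts to something like this T-SQL sketch. The WB table name is an assumption for an application named Finance, and the real package presumably launches the optimization rather than printing a message:

```sql
-- Hypothetical version of the every-15-minutes threshold check.
DECLARE @wbCount INT;
SELECT @wbCount = COUNT(*) FROM dbo.tblFactWBFinance;

IF @wbCount > 20000
    PRINT 'Threshold exceeded: run Lite Optimize';
ELSE
    PRINT 'Under threshold: nothing to do';
```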
LITE Optimize process
It performs the following steps:
1. Copy records from WB to FAC2 and mark the copied records in WB
2. Create a temporary partition and process it for just the records moved from the WB table
3. When the partition has finished processing, the system does the following in a single transaction:
– merge the FAC2 partition with the temporary partition
– delete the marked records from WB
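The copy-and-mark step and the final delete can be pictured with the T-SQL below. This is purely illustrative, not BPC's actual implementation: the real package also creates and processes a temporary SSAS partition between these steps (which plain T-SQL cannot show), and the table names and the marker column are assumptions.

```sql
-- Illustrative only; names and the [SOURCE] marker column are assumptions.
BEGIN TRANSACTION;

-- Step 1: mark the current write-back rows, then copy them to FAC2.
UPDATE dbo.tblFactWBFinance SET [SOURCE] = 1;

INSERT INTO dbo.tblFAC2Finance ([ACCOUNT], [CATEGORY], [ENTITY], [TIME], [SIGNEDDATA])
SELECT [ACCOUNT], [CATEGORY], [ENTITY], [TIME], [SIGNEDDATA]
FROM   dbo.tblFactWBFinance
WHERE  [SOURCE] = 1;

-- Final step: delete the marked records from WB (after the partition processes).
DELETE FROM dbo.tblFactWBFinance WHERE [SOURCE] = 1;

COMMIT TRANSACTION;
```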
In order for BPC SSIS tasks to be available for use in SSIS packages, the OSoft task DLL files must be placed in Drive:\Program Files\Microsoft SQL Server\100\DTS\Tasks for SQL Server 2008.
I find there are a few reasons for an import into BPC to not complete, getting stuck on the DumpLoad task:
- The first reason it stops at the DumpLoad task is that there is a lot of data and it simply takes that long. You can always abort the import and try importing a smaller set of data.
- The OutlookSoft SendGovernor Service is not started. This is usually the result of a server reboot. To check this out:
- Start > Run
- Type services.msc and press Enter
- Scroll down to the OutlookSoft SendGovernor Service
- Start the service if it is not running
- The other reason has to do with the dbo.lck[Application Name] table. If the application name is Finance, there may be rows left in the dbo.lckFinance table. This can happen if an import errors out. To solve this:
- Verify no imports are occurring
- Check whether there are any rows in the table: SELECT * FROM dbo.lckFinance
- If there are, remove them: DELETE FROM dbo.lckFinance
That should solve a hung import.
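Putting the last two checks together (assuming the application is named Finance, as in the example above):

```sql
-- After verifying no import is running, check for leftover lock rows.
SELECT * FROM dbo.lckFinance;

-- If rows came back, clear them to release the hung import's lock.
DELETE FROM dbo.lckFinance;
```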
Displaying teams in Data Manager is simply based on the current user’s Team access. Change their Team membership using BPC Administration.
Security is broken down into 4 aspects:
- Users
- The basic component of security: user ID, email, IM, etc.
- Teams
- A basic grouping of users
- You can assign Task and Member Access Profiles to Teams
- Task Profiles
- Application-related security
- User/Team activities and roles are specified here
- There is a finite number of Tasks configured in BPC. See Task Profile Descriptions in the BPC Administration Help
- Member Access Profiles
- Dimension- and member-related security
- For example, you can specify which periods are ready to receive forecasts
To change the rollup for an account in BPC, do the following:
- Change PARENTH1 on the account to the new rollup
- Process the dimension (do a Full Process)
- Make the application set available:
- Click on the Noble node
- Click on Set application set status
- Select Available
- Click Update application set status
- Refresh dimension members on the client:
- eTools -> Refresh Dimension Members
This will probably work for any dimension, not just accounts.
OPEN INPUT TEMPLATE FOR EDITING
1. eTools -> Open Dynamic Templates
2. Input Schedules -> 09 Input Templates
3. Open ‘Forecast Input Template 2009.xlt’
UPDATE FORECAST TEMPLATE TO REFLECT NEW MONTH OF ACTUALS
1. Change Cells A4 and BA6 to the new FCST Month
MODIFY EVGTS FORMULAS
1. If we are modifying to prepare for October Month End, then we will copy Cells Q15:Q233 to R15:R233
2. Update Cells R12 and R13 to say ACT and OCT in Black Font
MODIFY EVSEND FORMULAS
1. If we are modifying to prepare for October Month End, then we will copy Cells BI15:BI233 to BJ15:BJ233
1. Run the BPC Admin Tool
2. Click Manage Security
3. Expand Member Access Profiles
4. Select Category Upload Switch
5. Click Modify member access profile
6. Click on the BPC Access tab
7. Find the Member row (i.e. AprFcst) and change its Access to Read & Write
8. Click Next
9. Click Apply