Recently I gave two presentations to the CEO Executive Forum at their semi-annual meeting, held this time in Detroit. The first was on Connectivity and Collaboration, the second on Metrics, Reports and Dashboards. Both focus on Office 365 and Power BI, including SharePoint and middleware.
Connectivity and Collaboration
For Connectivity and Collaboration, we focused on how, as humans, we need to make connections between things, and between things and people. I then described some of the ways this is possible with Office 365, SharePoint, OneNote, Teams and middleware.
We worked on answering questions like:
Can we automate our business processes in Office 365?
What is the best way to manage documents?
How do I reduce data entry?
Metrics, Reports and Dashboards
During Metrics, Reports and Dashboards I took time to focus on the different types of numbers to pay attention to, compared metrics to KPIs, and talked about what a dashboard should contain. There were some demonstrations, including Power BI.
We worked on these questions:
What business metrics do I care about?
What numbers should I pay attention to?
I want to make the PowerPoint presentations available to the attendees of the conference and anyone else who finds the topics interesting.
Below are the two PowerPoint presentations you can download:
Connectivity and Collaboration
Metrics, Reports and Dashboards
Feel free to email or comment with questions or feedback.
This article was written to help me sort out SharePoint, PowerPivot, Power View and Analysis Services. Read on to discover the questions that arose as I embraced this technology set and the answers I’ve uncovered.
So what are the options for working with Power View within SharePoint with MDX as the source cube? Let me frame this a bit: I am referring to SharePoint 2013 with SQL Server 2012 SP1 or SQL Server 2014. It appears that PowerPivot becomes the data source for Power View reports; that is, a PowerPivot workbook saved within a Document Library. So it seems feasible that building a PowerPivot report pulling from an MDX cube, then storing it in a Document Library, should work. The first test is the Document Library, PowerPivot and MDX cube test. If this works the same way as using a tabular cube, then we are one step closer to a complete solution.
Continue reading “SharePoint, PowerPivot, Power View and Multidimensional Analysis Services”
I had trouble getting the OLEDB Provider from IBM Client Access to work. It was pulling data, but due to CCSID issues, many columns didn’t translate properly. To resolve this, I had to use a T-SQL query so I could set Force Translate=0, when adding the LinkedServer. Continue reading “Adding LinkedServer to AS400/DB2”
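For reference, the linked-server setup looked roughly like the sketch below. The server name, data source, and login values are placeholders, not my actual environment; the key part is passing Force Translate=0 in the provider string:

```sql
-- Add a linked server to the AS400/DB2 box using the IBM Client Access
-- OLE DB provider. Force Translate=0 in the provider string works around
-- the CCSID translation issue. Names and credentials are placeholders.
EXEC sp_addlinkedserver
    @server     = N'AS400',             -- linked server name (your choice)
    @srvproduct = N'DB2UDB',
    @provider   = N'IBMDA400',          -- IBM Client Access OLE DB provider
    @datasrc    = N'MYAS400HOST',       -- AS400 host name or IP
    @provstr    = N'Force Translate=0';

-- Map a local SQL login to an AS400 user (placeholders)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AS400',
    @useself     = 'FALSE',
    @rmtuser     = N'AS400USER',
    @rmtpassword = N'password';
```

Once the linked server exists, four-part-name or OPENQUERY selects against the AS400 returned correctly translated columns.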
Recently, while doing a BPC install in a Network Load Balanced (NLB) environment, I ran into an error when testing Reporting Services on the second application server. When clicking on the Report Manager URL I would receive the error:
The report server installation is not initialized
It turns out I needed to go back to the first server, run Reporting Services Configuration Manager, select Scale-out Deployment, and then join the second server.
This pair of SQL statements will delete all the rows from all tables. The script:
- Disables referential integrity
- DELETEs or TRUNCATEs each table
- Re-enables referential integrity
- Reseeds identity columns
-- disable referential integrity
EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

-- DELETE tables that are referenced by a foreign key (TRUNCATE would fail
-- on those), TRUNCATE the rest
EXEC sp_MSForEachTable '
IF OBJECTPROPERTY(object_id(''?''), ''TableHasForeignRef'') = 1
    DELETE FROM ?
ELSE
    TRUNCATE TABLE ?
'

-- enable referential integrity again
EXEC sp_MSForEachTable 'ALTER TABLE ? CHECK CONSTRAINT ALL'

-- This will reseed each table (skip this if you don't want your identity seeds reset)
EXEC sp_MSForEachTable '
IF OBJECTPROPERTY(object_id(''?''), ''TableHasIdentity'') = 1
    DBCC CHECKIDENT (''?'', RESEED, 0)
'
Thanks to Mauro Cardarelli’s post
Based on this SQL Server Central post: a processor with more cores is better for Data Warehouses / Decision Support Systems (smells like OLAP). And at this time an Intel processor is better for a transactional system (OLTP).
BPC 7.5 SP04 or later:
Windows Server 2008 R2 Standard or Enterprise Edition with or without Hyper-V
Windows Server 2008 Standard or Enterprise Edition with or without Hyper-V, or
Windows Server 2008 Data Center with or without Hyper-V
Windows Server 2003 Standard or Enterprise Edition SP2
Windows Server 2003 Standard configured on the SQL database or OLAP server components only
Windows Server 2003 R2 Standard or Enterprise Edition SP2
BPC 7.0 and 7.5 support:
SQL 2008: SP2 with Cumulative Update 1 for SP2.
SQL 2008 R2: Cumulative Update 5.
Thanks to Gert Andries van den Berg on the SAP BPC forums for making sense of all this. In a nutshell, writing records to either the WB or FAC2 table updates the cube. There is no need to build an SSIS package with a Dumpload task; this can be done with SQL. Writing to the FAC2 table seems to be the destination of choice, though I've yet to determine whether running optimization is required to see the data in reports.
From Gert Andries van den Berg.
As per the tuning doc:
WB – real time data input (ROLAP partition)
This is data that is the most current data sent to the system. Data sent by BPC for Excel data sends and Investigator browser data sends is placed in real-time storage.
FAC2 – short term and Data Manager imports (MOLAP partition)
This is data that is not real-time data, but is also not in long-term storage yet. When you load data via Data Manager (automatic data load from external data sources), it loads the data to short-term storage so that the loaded data does not affect system performance. Only the cube partition associated with this table is processed, so the system is not taken offline.
Fact – long term history (MOLAP partition)
This is the main data storage. All data eventually resides in long-term storage. Data that is not accessed very often remains in long-term storage so that the system maintains performance
This structure allows SAP BPC to maintain the same performance over time even when there is a large increase in data volumes.
Periodically clearing real-time data greatly improves the performance of the system, and an "Optimization" process is required to do it (this can be scheduled automatically based on parameters such as a number-of-records threshold).
Lite Optimization:
Clears real-time data storage (WB/WRITEBACK) and moves it to short-term data storage (FAC2). This option doesn't take the system offline, and can be scheduled during normal business activity.
Incremental Optimization:
Clears both real-time and short-term data storage (WB and FAC2) and moves both to long-term data storage (FACT). This option will not take the system offline, but it is best run during off-peak periods of activity.
Full Process Optimization:
Clears both real-time and short-term data storage and processes the dimensions.
This option takes the system offline and takes longer to run than the incremental optimization.
It is best run scheduled at down-time periods – for example after a month-end close.
The Compress Database option is available to rationalize the Fact Tables. “Compress” sums multiple entries for the same CurrentView into one entry so that data storage space is minimized. Compressed databases also process more quickly.
More info on this topic from Sorin Radulescu:
First, you have to be aware of the structure of the BPC cubes:
Each cube has 3 partitions:
1. fact – MOLAP
2. fac2 – MOLAP
3. WB – ROLAP
When you insert records into the WB table, because WB is a ROLAP partition, you will see the impact of that insert in the cube in real time.
If you insert records into either of the MOLAP partitions without processing the partition, you will not be able to see those records in the cube.
I think now you have a clear picture of the BPC cube and understand the difference between MOLAP and ROLAP partitions.
Lite Optimize is necessary just to keep the number of records in the WB table under control.
For SSAS, if a ROLAP partition has more than 100,000 records, retrieving data from the cube will be very slow while users are inserting into the WB table at the same time.
So Lite Optimize is usually scheduled every 15 minutes, running when the number of records is over 20,000.
That means every 15 minutes the dtsx package checks whether WB has 20,000 records.
If yes, it runs the process; if not, it does nothing.
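That scheduled check amounts to something like the following sketch. The WB table name is an assumption, again using a hypothetical application named "Finance":

```sql
-- Hypothetical sketch of the threshold check the scheduled package performs:
-- if the write-back table has crossed ~20,000 rows, kick off Lite Optimize.
IF (SELECT COUNT(*) FROM tblFactWBFinance) > 20000
BEGIN
    PRINT 'Threshold exceeded - run the Lite Optimize package';
    -- the actual package would launch the Lite Optimize dtsx here
END
```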
LITE Optimize process
It performs the following steps:
1. Copy records from WB to FAC2 and mark the copied records in WB as moved to FAC2
2. Create a temporary partition and process it, just for the records moved from the WB table
3. When the partition has finished processing, the system then does the following in a transaction:
– merge the FAC2 partition with the temp partition
– delete the marked records from WB
By using the ASP.NET ReportViewer control on a web page exposed to the internet, it is possible to display Reporting Services reports. The trick is in turning on Anonymous Access on the ReportServer Virtual Directory and creating the anonymous user as a Reporting Services user and granting that user permission to view the reports you want to share.
In the ReportViewer control, configure the ReportViewer Tasks as follows:
- Choose Report:
- Report Server Url: the path to the ReportServer Virtual Directory
- Report Path: the path to the report beginning with ‘/’
On the web server hosting Reporting Services configure the ReportServer Virtual directory as follows:
- Turn on Anonymous Access
- Remember the Anonymous Access user
In Report Manager do the following:
- Configure the Anonymous Access user as a Reporting Services user
- Grant that user access to the necessary folders and/or reports
Deploy your web page with the ReportViewer control to your internet server and you are good to go.