Using the Exalytics Summary Advisor and Oracle BI Apps 7.9.6.4
Peter Scott is currently mid-way through a series of posts on our blog on replicating parts of the BI Apps data warehouse into TimesTen, as a potential way of running all or part of the BI Apps "in-memory". We'll have to wait and see how Pete's experiments work out, but what prompted that study into replicating whole parts of the BI Apps data warehouse into TimesTen was the fact that, when we last checked, the Exalytics Summary Advisor "blew up" when we tried to run it against the BI Apps repository. Using OBIEE 11.1.1.6 (the initial Exalytics-compatible release of OBIEE 11g), the Summary Advisor just wouldn't run, because the BI Apps repository was too complex, and broke too many Exalytics RPD-validity rules, for the aggregate recommendation and generation process to complete.
But that was back with OBIEE 11.1.1.6, and we're now up to the 11.1.1.6.11 patchset for this particular release, so whilst Pete checks out the "brute force" method I thought it'd be worth revisiting the Summary Advisor, with my BI Apps installation bumped up to version 7.9.6.4 and the Exalytics environment patched up to 11.1.1.6.9, the latest patch release we've applied to that environment. To get ready for the test, I'd loaded a minimal amount of EBS 12.1.3 source data into the BI Apps data warehouse tables, mostly around the Accounts Receivable and Sales subject areas, giving us a set of dashboards like those in the screenshot below to use for testing.
I then created a relatively small TimesTen database for my initial testing (2GB perm size, 1GB temp size), as our test and dev environment is only a subset of the full Exalytics environment. For this initial testing I also left the timeout and other settings at their default values, with the idea that I'd experiment with them as testing proceeded.
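For anyone recreating this, that sizing is just a couple of first-connection attributes in the TimesTen sys.odbc.ini file. Here's a minimal sketch of the sort of DSN definition I mean, with the driver and datastore paths as placeholders for wherever your TimesTen installation lives, and SQLQueryTimeout included as an example of the kind of timeout setting you might come back and raise later:

[tt_aggr_store]
Driver=/path/to/TimesTen/lib/libtten.so
DataStore=/path/to/datastores/tt_aggr_store
DatabaseCharacterSet=AL32UTF8
PermSize=2048
TempSize=1024
SQLQueryTimeout=300

PermSize and TempSize are specified in megabytes, so 2048 and 1024 give the 2GB/1GB sizing mentioned above.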
Running the Summary Advisor against a BI Apps repository is the same as against any other RPD; first select Tools > Utilities, then select Oracle BI Summary Advisor from the list of available BI Administration tool utilities (note that the Summary Advisor menu option only appears if you're working on an Exalytics server).
Then you go through the process of narrowing down the set of table sources that the Summary Advisor's aggregate recommendation process will consider, first by selecting the date range for user queries, and then by setting a threshold for the accumulated query time on a particular logical table source. For my example, I left the start and end dates open, but discarded any logical table sources that had been queried for less than 15 seconds in total, thereby removing those table sources that weren't really used that much in queries.
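The Summary Advisor drives its recommendations from the summary statistics tables that Exalytics maintains, but if you want a rough feel for where query time is going before you run it, the standard usage tracking table is worth a look. A quick sketch, assuming usage tracking is enabled and writing to the usual S_NQ_ACCT table:

select subject_area_name,
       sum(total_time_sec) as total_secs,
       count(*)            as queries
from   s_nq_acct
group  by subject_area_name
order  by 2 desc;

This is per subject area rather than per logical table source, which is the level the Summary Advisor's accumulated-time threshold actually works at, but it's a handy sanity check on which areas are generating the load.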
Next, I selected the TimesTen database as the target for the recommended aggregates, fine-tuned a few further settings around when the Summary Advisor should stop trying out new aggregate options, and then finally reviewed the aggregates that the Summary Advisor had proposed, based on usage patterns recorded in the various usage tracking and summary statistics tables maintained by Exalytics. In this case, the Summary Advisor recommended five aggregates, each of which would involve a summary fact table and one or more shortened dimension tables.
I left all of the candidate aggregate table options selected, and then generated the script file that the BI Server uses to create the aggregate tables, load them up with summary data, and then register them in the BI Apps repository. In this instance, the script file contained logical SQL statements to first drop any previous aggregate tables, then create the new ones, as shown below.
delete aggregates
"tt_aggr_store".."EXALYTICS"."ag_320723981",
"tt_aggr_store".."EXALYTICS"."ag_1385967556";
create aggregates
"ag_3423183827"
for "Core"."Fact - Fins - GL Revenue Posted"("Revenue","P&L Amount","Credit Amount","Debit Amount","Credit Local Amount","Debit Local Amount","Credit Account Amount","Debit Account Amount","Revenue Document Amount","Revenue Local Amount","Revenue Amount","Revenue Transaction Quantity","Period Start Date","Period End Date")
at levels ("Core"."Ledger"."Detail", "Core"."Date - Fiscal Calendar"."Fiscal Year")
using connection pool "tt_aggr_store"."Connection Pool"
in "tt_aggr_store".."EXALYTICS",
"ag_3448667885"
for "Core"."Fact - Fins - GL Revenue Posted"("Revenue","P&L Amount","Credit Amount","Debit Amount","Credit Local Amount","Debit Local Amount","Credit Account Amount","Debit Account Amount","Revenue Document Amount","Revenue Local Amount","Revenue Amount","Revenue Transaction Quantity","Period Start Date","Period End Date")
at levels ("Core"."Profit Center"."Profit Center Detail", "Core"."Ledger"."Detail", "Core"."Date - Fiscal Calendar"."Fiscal Period")
using connection pool "tt_aggr_store"."Connection Pool"
in "tt_aggr_store".."EXALYTICS",
"ag_3350139264"
for "Core"."Fact - Fins - GL Revenue Posted"("Revenue","P&L Amount","Credit Amount","Debit Amount","Credit Local Amount","Debit Local Amount","Credit Account Amount","Debit Account Amount","Revenue Document Amount","Revenue Local Amount","Revenue Amount","Revenue Transaction Quantity","Period Start Date","Period End Date")
at levels ("Core"."Date - Fiscal Calendar"."Fiscal Year")
using connection pool "tt_aggr_store"."Connection Pool"
in "tt_aggr_store".."EXALYTICS",
"ag_630606962"
for "Core"."Fact - Fins - GL Revenue Posted"("Revenue","P&L Amount","Credit Amount","Debit Amount","Credit Local Amount","Debit Local Amount","Credit Account Amount","Debit Account Amount","Revenue Document Amount","Revenue Local Amount","Revenue Amount","Revenue Transaction Quantity","Period Start Date","Period End Date")
at levels ("Core"."Profit Center"."Profit Center Detail", "Core"."Ledger"."Detail", "Core"."Date - Fiscal Calendar"."Fiscal Day Detail")
using connection pool "tt_aggr_store"."Connection Pool"
in "tt_aggr_store".."EXALYTICS",
"ag_4212509594"
for "Core"."Fact - Fins - GL Revenue Posted"("Revenue","P&L Amount","Credit Amount","Debit Amount","Credit Local Amount","Debit Local Amount","Credit Account Amount","Debit Account Amount","Revenue Document Amount","Revenue Local Amount","Revenue Amount","Revenue Transaction Quantity","Period Start Date","Period End Date")
at levels ("Core"."Profit Center"."Profit Center Detail", "Core"."Ledger"."Detail", "Core"."Date - Fiscal Calendar"."Fiscal Year")
using connection pool "tt_aggr_store"."Connection Pool"
in "tt_aggr_store".."EXALYTICS";
So now we've got the aggregate creation script, it's a case of running it through the nqcmd.exe BI Server ODBC command-line client and waiting for the results … which, in the first instance, failed. Looking through the various log files, it turned out the issue was the W_MCAL_QTR_D table, which contained two rows with the same ROW_WID value, and this caused the aggregate creation process to fail.

This is an issue you're likely to hit a few times if using the Summary Advisor against the BI Apps RPD; the RPD physical layer defines this column as a unique key, but in reality there's no unique key constraint on the underlying table, and the load process had in fact loaded two rows with that key value into the table. So expect to do a bit of data cleansing and validation before letting the Summary Advisor loose on your BI Apps repository; I'd imagine issues like this will get sorted in future releases of the BI Apps so that Exalytics "just works" without having to fix the data like this.
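Duplicates like the W_MCAL_QTR_D one are easy enough to track down with a query along these lines against the warehouse schema:

select row_wid, count(*)
from   w_mcal_qtr_d
group  by row_wid
having count(*) > 1;

And for reference, running the generated script itself is just a case of pointing nqcmd at the BI Server DSN; a sketch, with the DSN name, credentials and file names as placeholders you'd swap for your own:

nqcmd -d AnalyticsWeb -u weblogic -p Password01 -s create_agg_exalytics.sql -o create_agg_output.log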
Once the underlying data was fixed up, though, the Summary Advisor script ran as expected, giving us the output below (I'd added an extra "delete aggregates" command at the start of the file to clear up some previous attempts, hence the three commands that completed successfully).
Moving back over to the BI Administration tool I could see the new TimesTen in-memory tables registered in the Physical layer, and over in the Business Model and Mapping Layer I could see those tables mapped in as new logical table sources.
Running a quick "View Data" on the new TimesTen tables showed they contained the aggregated EBS data as expected, now held in TimesTen for Exalytics in-memory tables.
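If you want to check from the TimesTen side as well, ttIsql does the job; a sketch, assuming a TimesTen DSN named tt_aggr_store to match the connection pool above, and using the first aggregate table name from the script (the exact case and quoting of the table name may vary, as the create aggregates script quotes it in lower case):

ttIsql tt_aggr_store

Command> select count(*) from "EXALYTICS"."ag_3423183827";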
So, after a bit of data cleansing and a few changes to the TimesTen settings to increase the query timeout for user queries, it now works, which also shows the value of re-testing something that didn't work a few patch releases ago. I suspect that, in practice, running this against a fully-loaded BI Apps environment will uncover a few more data issues, so my next step is to pull together a larger BI Apps data warehouse dataset and test it out against a number of modules and a wider range of query patterns. But - encouraging so far, and it'll be interesting to compare the results with the findings from Pete's experiments.