By Paul Hausser, Envisn, Inc.
The title of this article has to be taken in the context of how Cognos administrators would like to use Cognos audit data. Many would like to use the activity data in multiple ways to better manage their environment and to know what's of value to their users as measured by actual usage. So our focus here is on understanding what's available out of the box compared with the ultimate vision of what may be possible.
Purpose and Structure
While one may never know all the reasons for their creation and inclusion in the product, it appears that the audit tables exist primarily for debugging and ongoing customer support. The 21 Cognos audit tables are:
- COGIPF_ACTION
- COGIPF_AGENTBUILD
- COGIPF_AGENTRUN
- COGIPF_ANNOTATIONSERVICE
- COGIPF_HUMANTASKSERVICE
- COGIPF_HUMANTASKSERVICE_DETAIL
- COGIPF_MIGRATION
- COGIPF_NATIVEQUERY
- COGIPF_PARAMETER
- COGIPF_POWERPLAY_DIM_USAGE
- COGIPF_POWERPLAY_LEVEL_USAGE
- COGIPF_POWERPLAY_MEASURE_USAGE
- COGIPF_RUNJOBSTEP
- COGIPF_SYSPROPS
- COGIPF_THRESHOLD_VIOLATIONS
- COGIPF_USERLOGON
- COGIPF_RUNREPORT
- COGIPF_RUNJOB
- COGIPF_POWERPLAY
- COGIPF_EDITQUERY
- COGIPF_VIEWREPORT
The focus of these 21 Cognos audit tables falls into the following categories:
- Tables 1 through 9 are concerned with troubleshooting, administration and development.
- Tables 10 through 12 provide detail on the use of dimensions, levels and measures within PowerPlay.
- Tables 13 through 15 deal with job steps, system properties and threshold violations, monitored on an ongoing basis.
- Table 16 keeps track of user logon data.
- Tables 17 through 21 are focused on activity data for different classes of objects within Cognos. These five tables along with table 16 are the primary ones most Cognos administrators will be concerned with since they relate to tracking and monitoring usage within their environment.
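Even before any decoding, these activity tables can answer simple usage questions directly. The sketch below is a minimal example, assuming an ODBC connection to the audit database and a COGIPF_RUNREPORT table with COGIPF_REPORTPATH and COGIPF_LOCALTIMESTAMP columns; column names vary by Cognos version, so verify them against your own schema before relying on it.

```python
# Minimal sketch: count report executions per report over the last 30 days.
# Assumes an ODBC DSN named "cognos_audit" and the column names shown below;
# check both against your own audit database.
import pyodbc

SQL = """
SELECT COGIPF_REPORTPATH, COUNT(*) AS run_count
FROM   COGIPF_RUNREPORT
WHERE  COGIPF_LOCALTIMESTAMP >= DATEADD(day, -30, GETDATE())  -- SQL Server syntax; adjust per DBMS
GROUP BY COGIPF_REPORTPATH
ORDER BY run_count DESC
"""

def top_reports(dsn="cognos_audit", limit=20):
    """Return the most frequently run reports as (path, count) tuples."""
    conn = pyodbc.connect(f"DSN={dsn}")
    try:
        rows = conn.cursor().execute(SQL).fetchall()
    finally:
        conn.close()
    return [(path, count) for path, count in rows[:limit]]

if __name__ == "__main__":
    for path, count in top_reports():
        print(f"{count:6d}  {path}")
```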
Audit Tables Need to be Decoded
The raw event data from these activity tables is not very useful as is. The reason is that there are a number of things you need to know about the tables, and they don't come with any instructions.
If accuracy, precision and broadly dimensioned data are your ultimate goal then you’re taking on a bit of work, but the payoff from this can be huge. The major steps here include the following:
- Determine how the tables relate to each other. While some of this can be seen at a basic level, the rest involves decoding the tables and the data in them.
- Perform activity, both interactive and batch, in Cognos Connection and the various studios so that it is logged into the tables. This activity should cover all possible permutations that can occur.
- Then determine how all of this activity gets logged into the audit tables.
- Interpret the log data to determine what it means and how it can be used.
- Create a rule-based, automated system for reading the logs, dealing with data anomalies, normalizing the data for a star schema and storing it in newly defined tables (a minimal sketch of this step follows the list).
- Identify the reference data that needs to be used with the log data to maximize its value. Performing the steps above will give you insight into what reference data will be useful. See Figure 1 for how this all comes together into a Framework Manager Model.
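To make the rule-based step above more concrete, here is a minimal sketch of what such a load routine might look like, assuming raw audit rows have already been fetched as dictionaries. The fact and dimension shapes, the column names and the cleanup rules are all illustrative assumptions, not a description of any particular implementation.

```python
# Minimal sketch of a rule-based load step: raw audit rows in, star-schema
# facts out. Table shapes and column names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReportRunFact:
    report_key: int        # surrogate key into a report dimension
    user_key: int          # surrogate key into a user dimension
    run_timestamp: datetime
    runtime_ms: int

def get_or_create_key(dimension: dict, natural_key: str) -> int:
    """Assign a stable surrogate key for each distinct dimension member."""
    return dimension.setdefault(natural_key, len(dimension) + 1)

def normalize(raw_rows, report_dim, user_dim):
    """Apply simple cleanup rules and map each raw log row to a fact row."""
    for row in raw_rows:
        path = (row.get("COGIPF_REPORTPATH") or "").strip()   # rule: trim padding
        user = (row.get("COGIPF_USERNAME") or "unknown").strip()
        if not path:                                           # rule: drop unusable rows
            continue
        yield ReportRunFact(
            report_key=get_or_create_key(report_dim, path),
            user_key=get_or_create_key(user_dim, user),
            run_timestamp=row["COGIPF_LOCALTIMESTAMP"],
            runtime_ms=int(row.get("COGIPF_RUNTIME") or 0),
        )
```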
Really?
If all of this seems like a lot of work it’s because it is. But once you’ve done it successfully there are no limits to how the audit data can be used.
An obvious question to ask is, "Why not just use the audit data as is, along with the sample reports that come with Cognos 10?" You can do this, but at some point you may find that the data does not hang together. Some of the issues with the raw audit data and the sample reports include:
- Data anomalies – Anomalies abound in the audit tables. Some records need to be trimmed to get usable data, and in other cases what's actually provided is not what the table column says it is, e.g., PACKAGEPATH sometimes actually contains the model path (see the sketch after this list). You can't build a useful reporting system on usage when key elements can have different meanings.
- Narrow dimensionality – The standard dimensions provided limit the sample reports from Cognos to some very basic views of the data. By decoding the audit data and combining different data items it's possible to create new dimensions; examples would be user, package and content hierarchies.
- Depth – The sample reports are essentially list reports, which limits the depth of the analyses you may want to do. For example, it's difficult to create crosstab reports with the basic audit data.
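As one concrete illustration of the anomaly handling mentioned in the first bullet, the sketch below assumes the misreported values end in a /model[...] segment of a Cognos search path; that assumption, and the example values, are purely illustrative, so adjust the rule to whatever your own audit data actually shows.

```python
# Minimal sketch of one cleanup rule for the PACKAGEPATH anomaly described
# above. Assumes a "package path" that is really a model path ends in a
# /model[...] segment; verify against your own data.
import re

MODEL_SEGMENT = re.compile(r"/model\[[^\]]*\]\s*$")

def clean_package_path(raw_value: str) -> str:
    """Return the package portion of a path, stripping a trailing model segment."""
    return MODEL_SEGMENT.sub("", raw_value.strip())

# Example (hypothetical values):
# clean_package_path("/content/package[@name='GO Sales']/model[@name='model']")
#   -> "/content/package[@name='GO Sales']"
```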
Reference data is in many ways the secret sauce here. The right reference data can unlock multiple creative ways to use the Cognos audit tables, and it provides context for understanding the audit data. An example is reference data on inactive or deleted objects. The audit data persists in the audit tables after an object has been deleted from the Content Store; having a list of inactive objects enables you to filter those objects out of your usage analysis and narrow your focus.
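A minimal sketch of that filtering step is shown below. It assumes you have already built a reference list of object paths that still exist in the Content Store (for example via the Cognos SDK or a content inventory tool); how that list is produced is outside the scope of the sketch.

```python
# Minimal sketch: use a reference list of objects still present in the Content
# Store to drop audit rows for deleted objects. Building active_paths is a
# separate step and not shown here.
def filter_active(audit_rows, active_paths):
    """Keep only audit rows whose report path still exists in the Content Store."""
    active = set(active_paths)
    return [row for row in audit_rows if row["COGIPF_REPORTPATH"] in active]
```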
The Art of the Possible
We discussed above the need to understand how data gets logged around certain activities and what data gets logged at that time. This may seem like a tedious exercise, but it's the only way you can build the rule-based system that gives you the precision you need. It also makes it possible to create additional dimensions and measures for such things as average run times, average session time, view count, etc. The examples in Figures 2–4 below show how some of these can be used.
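For illustration, the sketch below derives two of the measures mentioned above, average run time and view count, from the normalized fact rows of the earlier sketch; the field names remain illustrative assumptions rather than a fixed schema.

```python
# Minimal sketch of two derived measures: average run time per report and
# view count per report, computed from normalized fact rows.
from collections import defaultdict
from statistics import mean

def average_runtime_ms(run_facts):
    """Average execution time (ms) per report surrogate key."""
    times = defaultdict(list)
    for fact in run_facts:
        times[fact.report_key].append(fact.runtime_ms)
    return {key: mean(values) for key, values in times.items()}

def view_count(view_facts):
    """Number of views per report key, e.g. from COGIPF_VIEWREPORT-derived facts."""
    counts = defaultdict(int)
    for fact in view_facts:
        counts[fact.report_key] += 1
    return dict(counts)
```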
In our next blog we will focus on the Cognos Audit Extensions and their use.
© 2014 Envisn, Inc. – All rights reserved - Cognos Tools