OneStream Security – ‘Child Groups and Users’

Recently I have been talking a lot to clients and colleagues about Security and it seems to me that the nesting of security groups is not widely understood. By ‘nesting of security groups’ I am talking about the ‘Child Groups and Users’ section in the Security module:

I admit that I struggled with this bit of security for a long time. I knew what to add to the section; I just didn’t understand why. Putting the parent entity group in the child section did not seem logical. And when the OneStream Security training course itself refers to this as ‘counterintuitive’, I know I am not the only one who was confused. In this blog I hope to explain the ‘Child Groups and Users’ section so that it is easy to understand.

The nesting of security groups is all about inheritance of security: a user is assigned to one group and automatically inherits access from other security groups. This is usually linked to a hierarchy, and the most common example is the Entity hierarchy, where a user with access to a parent Entity also has access to all of its children. The key to understanding ‘Child Groups’ is that this inheritance can flow up a hierarchy as well as down. The labels in the Security module suggest that the child inherits the access of the parent, but the only reference/example in the OneStream documentation works the other way around: the parent inherits the access of the child. Let me explain with a couple of examples.

For Workflows, the user must have access from the root member of the hierarchy down to base level. If the parent Workflow security groups include the security groups of the child Workflows as ‘Child Groups’, the user only needs to be assigned to the base security group and access from the root is included automatically. The parent’s access is inherited by the members of the child group: if you have access to the child, you also have access to the parent. Here ‘Child Groups and Users’ makes sense.

For Entities, the access tree works the other way around: if a user has access to a parent, they will (probably) also want access to its children. In this case the parent security groups are added as ‘Child Groups’. The child’s access is inherited by the members of the parent group: if you have access to the parent, you also have access to the child. Here ‘Child Groups and Users’ does not seem to make sense.

In the diagram above, Europe will need access to France and Germany for Entities, but for Workflows it will need access to Total Geography. While this might clarify the setup, it does little to help with understanding and remembering. The difference between the assignment of child and parent groups is that users can be assigned as a child but not as a parent; in effect, the child groups are equivalent to the user access. For Workflow security, user access is assigned to the base item and then pushed up the hierarchy. Another way to look at it: you start at the top member and pull up the security access from the children, i.e. Total Geography has Europe as a child. Entity security is the other way round: user access is assigned to a parent and then pulled down to the base, i.e. Europe has Total Geography as a child because Total Geography should also have access to Europe.
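The two directions can be sketched as a tiny membership model (the group names and the code below are invented for illustration; this is just the logic of ‘Child Groups’, not anything OneStream exposes):

```python
# Each group lists its 'Child Groups and Users'. A user effectively belongs
# to a group if they can be reached by walking child links: being in a
# child group grants membership of the group that lists it as a child.
child_groups = {
    # Workflow groups: parents list their hierarchy children as Child
    # Groups, so access assigned at the base bubbles up to the root.
    "WF_TotalGeography": ["WF_Europe"],
    "WF_Europe": ["WF_France", "WF_Germany"],
    # Entity groups: the other way round - groups list their hierarchy
    # PARENT as a Child Group, so parent access flows down to the base.
    "E_France": ["E_Europe"],
    "E_Germany": ["E_Europe"],
    "E_Europe": ["E_TotalGeography"],
}

def effective_groups(assigned):
    """All groups the user ends up in: every group whose Child Groups
    (directly or transitively) include a group the user is assigned to."""
    result = set(assigned)
    changed = True
    while changed:
        changed = False
        for parent, children in child_groups.items():
            if parent not in result and result.intersection(children):
                result.add(parent)
                changed = True
    return result

print(sorted(effective_groups({"WF_France"})))
# ['WF_Europe', 'WF_France', 'WF_TotalGeography']
print(sorted(effective_groups({"E_TotalGeography"})))
# ['E_Europe', 'E_France', 'E_Germany', 'E_TotalGeography']
```

Assigning the base Workflow group pulls in the whole path up to the root, while assigning the parent Entity group pulls in all of its descendants.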

There are many ways of looking at security: you can look at it from a child, member, or parent point of view, and you can think of a member as giving security or taking it. In my experience of running courses on security, most people have one preferred way of looking at how security works. I will talk about two ways: ‘Give & Take’ and the ‘Can See Circle’.

Give and Take

One way of looking at the security groups is that the current security group gives access to the Child groups and it takes from the parent groups. For Workflows, Europe gives the security to France and Germany while taking access from Total Geography. For Entities, Europe gives the security to Total Geography and takes it from France and Germany.

Can See Circle

This is my preferred way of looking at security; it works for me but you might like another way. Start at the child groups: they can see the member and the member can then see the parents. (I know it’s not a circle but it has a nice ring to it).

OneStream – Error messages with debug information

One of the frustrations of coding VB.NET in OneStream is that, while the error messages contain a lot of information, only the first line or so is useful (and that is putting a positive spin on it). Consequently we tend to add debug information which is printed to the error log and then commented out or removed when the script goes live. Sometimes you will see a debug flag to control whether these messages are printed to the log. I think that is a bit messy: first of all you get a load of messages in the log, and secondly, if the script is already in production, there might be audit problems, you might not have access, and so on.

With all that in mind, I decided to see if I could find a better way of presenting more information when an error occurs. I wanted a method that gave me more information about what was happening when the error occurred, did not fill up the error log with messages, was easy to manage and did not have a large impact on performance.

The way I now code a script is:
1. Create a variable that will hold the debug information, e.g. Dim CurrentPosition As String. This variable must be declared BEFORE the Try block.
2. Update this variable throughout the script, e.g. CurrentPosition = "Getting the XFFolderEx object…"
3. Write it to the error log, e.g. BRApi.ErrorLog.LogMessage(si, CurrentPosition)

The next step is to add this ‘CurrentPosition’ information to any errors that are thrown. I do that by creating a new instance of the exception object and adding the CurrentPosition variable to the end of the exception message:

Catch ex As Exception
    Dim MHEx As Exception = Activator.CreateInstance(ex.GetType(), ex.Message & Environment.NewLine & "ProcName ERROR - Position: " & CurrentPosition & Environment.NewLine, ex)
    Throw ErrorHandler.LogWrite(si, New XFException(si, MHEx))

When the script is working, I comment out all the lines that print to the error log, leaving the variable assignments in place.
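For comparison, the same wrap-and-rethrow idea can be sketched in plain Python (invented names; constructing the same exception type directly is what Activator.CreateInstance achieves in the VB.NET version):

```python
# Keep a position variable up to date and, when an exception escapes,
# re-raise the same exception type with the position appended.
def run_report():
    current_position = "Start"
    try:
        current_position = "Opening the source file"
        # ... real work would go here ...
        current_position = "Parsing row 42"
        int("not a number")        # simulated failure
    except Exception as ex:
        # Re-raise the same exception type with the position appended,
        # keeping the original as the cause (like InnerException in .NET).
        raise type(ex)(str(ex) + " | Position: " + current_position) from ex
```

The error that reaches the log now carries the last position reached, without any extra log messages along the way.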

Here is a simple example. This piece of code will throw an error because the loop multiplies varNoDaysSmall past the maximum value an Int16 variable can hold:

Dim varNoDaysSmall As Int16 = 10
Dim varNoDaysLarge As Integer = 1000

For tempNumber As Int16 = 1 To 10
  varNoDaysSmall *= tempNumber
  varNoDaysLarge *= tempNumber
Next

This is the error you get (running from an Extender type Rule):

Now add the CurrentPosition debug information and the updated exception message:

For tempNumber As Int16 = 1 To 10
  CurrentPosition = String.Format("varNoDaysSmall value [{0}] tempNumber [{1}]", varNoDaysSmall.ToString, tempNumber.ToString)
  varNoDaysSmall *= tempNumber
  CurrentPosition = String.Format("varNoDaysLarge value [{0}] tempNumber [{1}]", varNoDaysLarge.ToString, tempNumber.ToString)
  varNoDaysLarge *= tempNumber
Next

And this is the error message you get:

This is just a simple example of getting debug information into an error message. No extra messages in the error log, very little overhead, and it is simple to implement. I hope you find it useful – over the past few months it has saved me a huge amount of time when debugging errors.

Here is the full demo script:

Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
  Dim CurrentPosition As String = "Start"
  Try
    Dim varNoDaysSmall As Int16 = 10
    Dim varNoDaysLarge As Integer = 1000

    For tempNumber As Int16 = 1 To 10
      CurrentPosition = String.Format("varNoDaysSmall value [{0}]", varNoDaysSmall.ToString)
      varNoDaysSmall *= tempNumber
      CurrentPosition = String.Format("varNoDaysLarge value [{0}]", varNoDaysLarge.ToString)
      varNoDaysLarge *= tempNumber
    Next

  Catch ex As Exception
    Dim MHEx As Exception = Activator.CreateInstance(ex.GetType(), ex.Message & Environment.NewLine & "DemoError ERROR - Position: " & CurrentPosition & Environment.NewLine, ex)
    Throw ErrorHandler.LogWrite(si, New XFException(si, MHEx))

  End Try
End Function

Editing FDMEE Reports

This blog entry is a quick introduction to editing reports in FDMEE. As an example I will walk through the steps to add the POV lock information to the existing FDMEE Process Monitor All Categories Report. You will notice that I am editing an existing report which is relatively straightforward. Creating a completely new report is more complicated.

BI Publisher

FDMEE Reports are edited with BI Publisher which is a free add-in for Microsoft Word. You can download it from here: BI Publisher downloads. You want Oracle BI Publisher Desktop. Once it is installed you will have an extra menu (‘BI Publisher’) in Word.

NOTE: you must select either 32-bit or 64-bit depending on your version of Word, NOT your operating system.

Overview

FDMEE Reports have two components: the Query Definition (which produces the data) and the Report Definition (which formats the data). We need to change the Query Definition to include the database column that contains the POV lock information, and then add this new column to the Report Definition.

Query Definition

The Lock information is held in the intLockState column in the table TLOGPROCESS. We can get the SQL syntax from the SQL statement in the original FDM report:

(CASE
WHEN tLogProcess.intLockState IS NULL THEN 'OPEN'
WHEN tLogProcess.intLockState < 60 THEN 'OPEN'
ELSE 'LOCK' END) AS LockStatus

This is going to produce either OPEN or LOCK for each POV. To add this to the Query Definition in FDMEE, select the Setup tab, then Reports | Query Definition, and select the Process Monitor All Categories query:

R_QueryDef1

In the Select Clause box in the Details section, find the word FROM and put it on a new line by placing your cursor in front of it and pressing Enter. Add a comma to the end of tLogProcess.ProcessEndTime and then paste in the code above:

R_QueryDef2

Click Save and then click Generate XML to create an XML file that contains the data produced by the Query Definition:

R_QueryDef3

Save the generated XML file. If you want to have a look at what is contained in this XML file, open it with Excel. Select ‘As an XML table’ if you are prompted. You should see the LockStatus column at the end. This XML file will be used by BI Publisher as a data source.

Report Definition

NOTE: I have not been on a BI Publisher course and I am by no means an expert. My knowledge comes from using many other similar reporting solutions and from playing around with the software itself. If there is a better way of doing things, please feel free to correct me.

Download the FDMEE Report template – Process Monitor All Category.rtf – and open it in Word. From the BI Publisher menu click ‘Sample XML’ and select the generated XML file:

R_Word1

Click on the ‘Field’ button in the Insert tab to check that the XML has imported correctly. You will see the LockStatus field at the bottom:

R_Word2

We now want to add this field to the table to the right of the PARTNAME – RULE_NAME section. I have found the best way to do this is to add borders to the table so it is easy to work out where to click. Right-click on the white space to the right of PARTNAME – RULE_NAME and insert a column to the right. Click in this new column and then click the ‘Field’ button on the Insert section (see above) and select LOCKSTATUS.

R_Report

Now click on an output type in the Preview ribbon:

R_Preview

And you have the POV lock status displayed on the report:

R_Finished

NOTE: the data in the generated XML file is for all periods so there will probably be duplicate lines for the same location. When you run the report from FDMEE these duplicate lines will not be on the report.

 

FDMEE Global user

The Global user in FDMEE enables you to have a dimension member in the target that is written to by the FDMEE load process but is read-only for the user. FDM has the same functionality. This means that the user cannot change, in the target, the data they have loaded from FDM/FDMEE, so you can prove that the data in the target is the same as the data loaded into FDM/FDMEE. There is, however, a problem with this approach: there is no record of who actually loaded the data, because the logs in the target contain the global user name, not the FDM/FDMEE user. For some of my clients this is unacceptable, as they have to show who loaded the data and this information is lost.

NOTE: The example I will be going through is for an HFM target and I am sure something similar is possible for other targets.

Getting the logged on user

A simple and easy to implement fix for this problem is to write the logged on user to cell text in HFM. Before the export file is loaded into HFM (so use the event script AftExportToDat or BefLoad), read each line and save the entity (you only want to save the entity once). Then at the end of the export file add a !Descriptions section and add a cell text line per entity in the file.

Script steps:
Create a dictionary for the output
Open the export file for reading
For each line in the export file:
–  Save the line to the output dictionary
–  If this is the first data line save the year, period, scenario etc
–  Save the entity to a list if it’s not in the list
Close the export file
Save !Descriptions to the output dictionary
For each entity in the entity list:
–  Save a cell text line for each entity to the output dictionary using the saved year, period, scenario and for the selected account plus custom dimensions
Open the export file for writing
For each entry in the output dictionary:
–  Write the line to the file
Close the export file

In this example the selected account is PL111100 and the cell text label is [Default]:

Global-DatFile

In HFM you can display this on a form something like this:

Global-HFMForm

You now have a customisable log of who loaded the data from FDMEE.

Ad Hoc query from HsGetValue

I have recently been on the edge of an FCCS implementation, and one of the areas that is overlooked when compared to HFM is the use of functions in Smart View. For an HFM implementation it is standard to use Smart View functions at some point. The output can be configured to be exactly what the user wants, and this can then be automated using Excel macros or VBA. Some of these function sheets can be quite big, but generally the response time with HFM as the source application is very good. When Essbase is the source (and FCCS uses Essbase for its storage), response times are excellent for ad hoc queries but poor for the functions. This means that using Smart View functions to create large reports is not recommended, as they take a long time to refresh. So the problem is that the users want to use functions in Smart View, but the system architects are saying not to use them because they are too slow. In this article I will go through a possible solution to the problem: ad hoc drill-downs from a Smart View function cell.

This article looks at some advanced VBA topics so I am assuming you are already reasonably competent in VBA for Excel. I will not be going into all the detail of how to do this as a full explanation of all the techniques would be too long. I will however point out where you can get more information or the relevant Excel function to use. You can download an example spreadsheet at the end so you don’t have to read the explanations.

Overview

Smart View functions are used to create static reports which can then be formatted as required. The problem is that each function is a separate call to the source. Some of the detail on the report could be removed to improve response time but the user needs a quick and easy way to access the data that has been removed. One solution is to create a macro that automatically generates an ad hoc query based on the parameters in the Smart View function. The idea is that the user double-clicks on a cell that contains an HsGetValue function and the macro creates an ad hoc sheet based on the parameters for the HsGetValue function.

I have always used a particular method for HsGetValue parameters: the dimensions are referenced individually (i.e. not a single string with a semi-colon separator), fixed dimensions are referenced using a named range (usually beginning with POV) and ‘grid’ dimensions use the Dimension# prefix plus the cell reference. This makes identifying the ad hoc column and row dimensions much easier for the macro:

AH_GetValue

Excel Event scripting

If you are familiar with FDM or FDMEE you will probably already know the concept of event scripting i.e. a script that executes in response to an event. In Excel the number of events is much larger and there are separate events for the workbook and the worksheet. For an explanation of how to add extended and robust event scripting to Excel see Chip Pearson’s excellent website:

http://www.cpearson.com/Excel/AppEvent.aspx

The following screenshot shows the starting point:

AH_Start

1: Double-click ThisWorkBook

2: Create a private variable called ExcelEvents as the type clsExcelEvents

3: In the Workbook_Open procedure instantiate the ExcelEvents object

4: Create the class module clsExcelEvents (Insert | Class module)

Now double-click clsExcelEvents. Create the variable XLApp (or whatever you want to call it) using the keyword WithEvents and as the type Application. Then instantiate it in the Class_Initialize procedure:

AH_clsExcel

You have added extended event scripting to your Excel workbook. Select XLApp in the Object drop-down list and you will see a list of Procedures (Events) in the drop-down list to the right:

AH_XLApp

This shows you the events you can intercept and execute code for.

Event SheetBeforeDoubleClick

The event SheetBeforeDoubleClick is the one we want. Every time the user double-clicks a cell this procedure is executed. When the user double-clicks a cell with a HsGetValue formula, we want to pick up the dimension information and create an ad hoc sheet using those dimensions. This is where we want to put our code for creating the ad hoc sheet:

AH_BeforeDoubleClick

The user can double-click anywhere, so you need to check that the cell the user clicked contains a formula (ActiveCell.Formula) and that the formula is HsGetValue. If it is an HsGetValue cell, you can start getting the metadata. This is an example of the format of the HsGetValue formula:

AH_GetValue

This makes the placing of the dimensions for the ad hoc grid simple: any dimension on the same row as the clicked cell is placed in the row section of the ad hoc query; everything else goes to the columns. So split the formula parameters at the commas; the first parameter is the connection and the rest are dimensions. Check each dimension: if it contains # then it is on the grid. For those with #, check the row reference; if it is the same as the current row, save it for the row members. Everything else is saved for the column members.
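In outline, the splitting logic might look like this (shown in Python rather than the workbook’s VBA; the formula text is an invented example and the parsing assumes no commas inside the arguments):

```python
import re

def split_hsgetvalue(formula, clicked_row):
    # Take everything between the outer parentheses and split at commas:
    # the first argument is the connection, the rest are dimensions.
    inner = formula[formula.index("(") + 1 : formula.rindex(")")]
    args = [a.strip() for a in inner.split(",")]
    connection, dims = args[0], args[1:]
    row_members, col_members = [], []
    for dim in dims:
        if "#" in dim:
            # e.g. "Entity#"&$B12 is a grid dimension; find its row number
            match = re.search(r"\$?[A-Z]+\$?(\d+)", dim.split("#", 1)[1])
            if match and int(match.group(1)) == clicked_row:
                row_members.append(dim)
                continue
        col_members.append(dim)
    return connection, row_members, col_members

conn, rows, cols = split_hsgetvalue(
    '=HsGetValue(conn,"Entity#"&$B12,"Account#"&C$3,POVScenario,POVYear)',
    clicked_row=12)
# conn is 'conn'; only the Entity reference sits on row 12, so it becomes
# the row member and everything else goes to the columns
```

The row/column split falls out of the cell references alone; the fixed POV named ranges never contain # and so always land in the columns.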

The key part here is that you do not want to save the reference; you want the value in the referenced cell. You get that with the Excel VBA function Evaluate:

AH_Evaluate

In the formula, the connection is referenced using the named range ‘conn’. In the above screenshot, the value in the named range is evaluated so sConnection now contains FCCSDEV (or whatever my connection name is). You do the same with the dimensions; you are just interested in the member name and whether it’s going into the columns or rows:

AH_DimLoop

At the end of processing the formula, you want four pieces of information: the connection name, the name of the new ad hoc sheet, the members for the column headers and finally the members for the row. Create the new sheet and on that new sheet add the column headers and row members. Once you have done that you need to make it an ad hoc sheet:

AH_MakeAdHoc

In order for these Smart View functions to work you need to add the file smartview.bas to your project. You can find it in the bin folder under the SmartView installation folder (default is C:\Oracle\SmartView\bin).

And the final touch is to make sure that the double-click option is set for this new sheet:

AH_SetOption

That’s it. Now whenever you double-click a cell which contains an HsGetValue formula, an ad hoc sheet is automatically generated using the dimension members referenced in the HsGetValue formula.

Example spreadsheet

You can download an example spreadsheet here. You will need to review the Control sheet to check the custom dimension names in column A. If you want to use this against HFM change cell A5 to Year#:

AH_ControlSheet

HFM Type Library enumeration

If you don’t know what enumeration is then I doubt you will be interested in reading any further. This blog entry is very geeky and short. It will be of interest only to those wanting to access HFM through a custom method such as SQL statements. Here are a couple of examples where I have used this information in just the last few months. In my previous blog there is a SQL statement with some large CASE clauses to convert the activity codes and module IDs into meaningful descriptions. I have also just written a script that extracts the Process Control information to Excel and converts the action and process level IDs into the same descriptions that HFM uses. This blog looks at how I got that conversion information.

HFMConstants.DLL

The information on what these codes and IDs mean is held in the file HFMConstants.DLL. You can access it by adding HFMConstants.DLL as a reference in Excel (Alt+F11, then Tools | References and select HFMConstants.DLL; then F2 and select HFMConstants from the drop-down box). But that still means a lot of typing for things such as the activity codes and modules.

Type Library enumeration

An alternative is to enumerate the information in the type library (or TypeEnum). This will extract all the information in a type library file (such as HFMConstants.DLL) which you can then quickly format for use in a SQL statement. I wrote a little program a long time ago to enumerate the type library but that was a 16-bit program and it doesn’t work on a 64-bit OS (not a big surprise). So I have cobbled together some VBA code in Excel that enumerates the information and copies it to the clipboard.

You can download the Excel spreadsheet here. You MUST change the .xlsx extension to .xlsm before it will open. And for those of you who can’t wait, here is the listing it produces for HFMConstants.DLL. It is a text file.

If I have added the output for HFMConstants as a text file why bother with the Excel spreadsheet that produced it? That’s because it can be extremely useful for other programs. If you put directory\Excel.exe (where directory is the directory where Excel.exe resides) as the source file, it will enumerate the Excel type libraries. Really useful if you want to write some VBA programs.

Task Audit Extract alternative

In some of my earlier articles I have talked about managing the Task Audit information in HFM. The main point of those articles is that the Audit Extract Utility (AEU) no longer meets the requirements for a robust and complete task audit. The AEU is still fine for extracting the data audit information, just not the task audit. To recap, the two main problems with the AEU are that the detailed logs of application changes (i.e. metadata, member list and rules loads) are not extracted, and that the last logon information is deleted. In this article I will go through an alternative to the AEU for managing the Task Audit data. It is customisable, so any application changes and logon information are preserved and you can ignore some events such as Logoff.

Process Overview

The key to this new Task Audit extract process is a SQL statement which selects and formats the data. This SQL statement is then automated by calling from a batch file or from a trigger/job in the relational database. The automation process collects the output from the SQL statement and writes it to a file.

Note: I have only tested this on SQL Server. For Oracle databases, the REPLACE function should work. You will need to use something like the to_date function instead of MONTH in the WHERE clause.

SQL Statement

This is a basic SQL statement to extract the data. If you look in the log file that the AEU creates, you will see something similar to this:

SELECT U.sUserName, a.ActivityCode, a.ServerName, a.AppName,
a.StartTime, a.EndTime, a.strModuleName
FROM @APPL@_TASK_AUDIT AS a INNER JOIN
HSV_ACTIVITY_USERS AS U WITH (NOLOCK) ON a.ActivityUserID = U.lUserID

Note: you need to change @APPL@ to the name of your HFM application.

This gives you data something like this:

AEU_RawData

While this has extracted the data, we need to make a few changes to make it more readable:

  1. change the ActivityCode to the activity description (e.g. Logon, Logoff)
  2. format the StartTime and/or EndTime to an actual date
  3. change the strModuleName to the module description.

There is a link at the bottom of the page to download the SQL statement.

StartTime/EndTime

The StartTime and EndTime columns are stored as numbers, so converting them to dates depends on your database. For SQL Server you can use the DATEADD function:

DATEADD(day, -2, a.StartTime)

The display of dates in SQL Server needs to be adjusted by 2 days because the stored values are OLE Automation (Excel-style) serial dates, which count days from 30 December 1899, whereas SQL Server counts from 1 January 1900. For Oracle you use to_date. This will convert numbers such as 42907.877962963 into something like 2017-06-21 21:04:16.000.
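A quick way to convince yourself of the two-day offset (a Python check, not part of the extract itself): OLE Automation serial dates count days from 30 December 1899, so serial 2 is SQL Server’s day zero, 1 January 1900.

```python
from datetime import datetime, timedelta

# OLE Automation (Excel-style) serial dates count days from 1899-12-30;
# SQL Server's implicit cast counts from 1900-01-01, hence the -2 days.
def ole_serial_to_datetime(serial):
    return datetime(1899, 12, 30) + timedelta(days=serial)

print(ole_serial_to_datetime(42907.877962963))  # 2017-06-21 21:04:16 (approx)
```

The DATEADD(day, -2, …) in the SQL statement is doing exactly this epoch correction.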

You can now play around with the display of the StartTime and EndTime to get the format you want. I prefer to split out the date and time. This will display the start time as YYYYMMDD (zero-padded):

CONVERT(VARCHAR(4), DATEPART(yyyy, DATEADD(dd, -2, a.StartTime)))
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(month, DATEADD(dd, -2, a.StartTime))), 2)
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(dd, DATEADD(dd, -2, a.StartTime))), 2)

And this displays the time element of the start time as HHMM:

RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(hour, DATEADD(dd, -2, a.StartTime))), 2)
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(mi, DATEADD(dd, -2, a.StartTime))), 2)

ActivityCode

There is no table in the HFM database that contains the descriptions for the ActivityCode column, so we will have to use a CASE expression. You could create a table in your database to hold the activity codes and their descriptions if you preferred. The CASE syntax is like this:

CASE a.ActivityCode
WHEN  0 THEN 'Idle'
WHEN  1 THEN 'Rules Load'
WHEN  2 THEN 'Rules Scan'
WHEN  3 THEN 'Rules Extract'
……
END AS Activity,

ModuleName

The module name is similar to ActivityCode in that there is no table containing the descriptions, but it has an extra complication: all of the records have #@RID@# at the beginning. We could keep that piece of text in the CASE statement, but I prefer to remove it:

CASE REPLACE(a.strModuleName, '#@RID@#', '')
WHEN  0 THEN 'ADMINISTRATION'
WHEN  1 THEN 'CELL_HISTORY'
WHEN  2 THEN 'CLOSE_APPLICATION'
WHEN  3 THEN 'CONSOLIDATION'
……
ELSE a.strModuleName
END AS Module

WHERE clause

We don’t want to extract all the Task Audit data every month so we need a WHERE clause to restrict what data is extracted. This piece of code will only include data that has a start time within the month four calendar months ago. So if we are in November now, it will only include records for July.

WHERE DATEDIFF(MONTH, DATEADD(day, -2, a.StartTime), GETDATE()) = 4

You could also include exclusions on the ActivityCode (e.g. Logoff is 31) if you wanted.

Automation

Save your SQL statement to a file and call it using the sqlcmd statement e.g.:

sqlcmd -S %Server% -E -d %DB% -i %Script% -o "%Output%" -s "|" -W

The items between the % symbols are batch variables e.g.

set Script=getActivity.sql

That will extract the Task Audit data to a file. You will then probably add a statement to delete the records from Task Audit for the same month. And for the delete statement you will add some exceptions such as

WHERE ActivityCode NOT IN (1, 21, 23, 26, 29, 75)

This will keep all the application changes and logon information.

You can download the full SQL statement here. It has the extension .key and is a text file:

Download GetActivity SQL statement

Conclusion

With this SQL statement, you can have an automated process to extract and manage the Task Audit data. The data that is extracted and deleted is customisable. This means that you can keep your Task Audit data to a minimum while keeping a full year’s data for the important audit records. Information on all application changes is kept, as well as the logon information for all users.

Last Logon in HFM

Have you ever wondered if any users have stopped using HFM and forgotten to tell you? You could trawl through the audit logs of who has logged on and match that against a checklist of your users. It might take a while. Or you could write a script that queries the database.

This code will only work for SQL Server. You will need to change @APPLNAME@ to the name of your HFM application.

SELECT U.sUserName AS UserName,
MIN(DATEDIFF(DAY, A.StartTime, GETDATE()) + 2) AS NumberDaysSinceLastLogon
FROM @APPLNAME@_TASK_AUDIT AS A INNER JOIN
HSV_ACTIVITY_USERS AS U ON A.ActivityUserID = U.lUserID
WHERE (A.ActivityCode = 29)
GROUP BY U.sUserName
ORDER BY U.sUserName

It will produce two columns: UserName and the number of days since the last logon. This includes when a user has logged on via Smart View.

Save this piece of code to a file (e.g. GetLastLogon.sql) and reference that in the variable %Script% like this:

REM Name of the script to execute
set Script=GetLastLogon.sql
REM Name of the file for the output of the records
set OutputFile=LastLogon.csv

then call it from a batch file like this:

REM Parameters:
REM   -s "," : comma is the delimiter used in the output file
REM   -W     : suppress white space
REM This is the syntax if you are using a SQL User and password
REM sqlcmd -S %SQLServer% -U %SQLUser% -P %SQLPW% -d %SQLDB% -i %Script% -o "%OutputFile%" -s "," -W
REM This is the syntax if you are using a trusted user
sqlcmd -S %SQLServer% -E -d %SQLDB% -i %Script% -o "%OutputFile%" -s "," -W

Either set up variables for the other values between the % symbols (i.e. SQLServer, SQLDB, etc.) or change the values directly. You can now schedule this to run once a month as part of your admin tasks.

Remember that this will only work for the periods in the task audit table. If you have purged the task audit table then you have lost the information on who has logged on for the purged periods. I normally recommend keeping at least 3 months of audit data. If you run this script before purging the audit tables then you will be checking the previous 4 months which is enough to identify who is no longer using HFM.

Task Audit in HFM part 2

The auditing of application changes in HFM has improved recently with the introduction of the metadata load changes feature. This is an excellent enhancement, but it does have its problems. In my previous blog I talked about how the Audit Extract Utility has not been updated to manage the metadata load changes audit log and how to mitigate the problems. In this blog I will look at the metadata load changes audit records in more detail. Note that the SQL used in the scripts works on SQL Server only; it will not work on Oracle databases.

Metadata Load Changes

The Metadata Load Changes feature is accessed from the Navigate menu: Administer | Consolidation Administration | Audit | Tasks. There is a log for each metadata load and a matching log for the Metadata Load Differences. Against both logs there is a link icon. When you click on the icon, it prompts you to save the log as a text file with the default name Attachment.txt.

What should be in the audit

Firstly I want to talk about what I think an audit log should contain and how it should be presented. In short, an audit log must tell you who made what changes, at what time, and why. And the audit must be complete (i.e. no significant gaps), with no means of changing it (i.e. not a text or Excel file).

The Task Audit view through Consolidation Administration gives us some of this information (who and when). And if you then go into each of the links you will get the ‘what has changed’. Compare that list with your change request logs and you will get the ‘why’.

Ideally you should also be able to answer questions such as 'show me the metadata changes that have been made to the account ABC'. You could do it, but it would take time to go through the Task Audit logs and save each one with a meaningful file name.

How does HFM audit compare?

On the face of it, the auditing in HFM shapes up really well. It's only when you look at it in more detail that you realise there are some problems. A significant one is the output of the audit logs. You have to select each log one at a time, save it and change the default filename, Attachment.txt. It's a text file, so there's nothing to stop you from changing the contents. The log doesn't contain who made the change or when, so you have to either edit the file or change the filename to carry that information. This is very poor from an audit perspective.

But there’s more. If you look in the HFM Admin guide and check which characters are invalid for the member label, you will notice that the ampersand (&) is not in the list. So, for example, an account called R&D is valid. The ampersand is, however, invalid in descriptions and in Cell Text labels. But then, in the Loading Metadata section, the manual says:

Note: Do not use the ampersand character ( & ) in a metadata file. If you do, an error occurs.

I can tell you that the error (whatever it is) does not show itself when you load metadata, and I have seen applications happily using ampersands in member labels. I can also tell you that it does cause an error in the task audit logs. If there is an ampersand in a member label, when you click the link in Consolidation Administration the audit log will display everything up to, but not including, the ampersand. Any changes after the ampersand are not in the log. If the ampersand is in something like a user-defined field, it does not cause a problem. I don't know whether any other characters cause the same problem. The safest route, then, is not to use the ampersand anywhere, and to treat any other non-alphanumeric characters with great care.

In summary then, the HFM Task Audit captures all the changes (probably), in text file format and you will have to edit the files if you want to save them offline. ‘Room for improvement’ as my better school reports used to say.

An alternative

The Task Audit detail information in the database is complete regardless of whether there is an ampersand. The ampersand only causes a problem when you try to download it. So you can potentially write a script to extract the Task Audit detail. And while you are at it, add the missing user and date/time information and save it as a PDF. And it just so happens, I have done exactly that.

Script overview

The output is going to be a PDF file for each application element load (i.e. metadata, rules, member lists and security) with the username, date and time in the file contents and the filename. The first step is to get a list of the logs which have an attachment; the second step is to read that list and create the attachment for each log entry.

The Task Audit logs are held in two tables: APPNAME_TASK_AUDIT and APPNAME_TASK_AUDIT_ATT.

This is the first table (TASK_AUDIT):

And the second (attachments) table (TASK_AUDIT_ATT):

The first table contains the information you see in Consolidation Administration; there is a single database record for each log entry. The second table holds the information that is used to create the attachments; there are one or more records for each log entry. We want a combination of data from these two tables (plus data from the user table to resolve the user name). The script reads the first table and picks off the username, date, time and attachment id. It then reads the attachment table, exports the records for a single log entry and saves the output with the username, date, time and audit type in the file name.

This is the SQL statement that is used as the basis for creating the exported audit logs:

SELECT CONVERT(VARCHAR(4), DATEPART(yyyy, DATEADD(dd, -2, A.StartTime))) + RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(month, DATEADD(dd,-2, A.StartTime))), 2) +
RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(dd, DATEADD(dd, -2, A.StartTime))), 2) AS ThisDate,
RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(hour, DATEADD(dd, -2, A.StartTime))), 2) +
RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(mi, DATEADD(dd, -2, A.StartTime))), 2) AS ThisTime,
A.strGUID,
CASE A.ActivityCode
WHEN 1 THEN 'Rules'
WHEN 21 THEN 'Metadata'
WHEN 23 THEN 'Member List'
WHEN 26 THEN 'Security'
WHEN 75 THEN 'Metadata Load Diffs'
END AS Activity,
CASE
  WHEN CHARINDEX('\', U.sUserName) > 0 AND CHARINDEX('@', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName,
                   CHARINDEX('\', U.sUserName) + 1,
                   CHARINDEX('@', U.sUserName) - CHARINDEX('\', U.sUserName) - 1)
  WHEN CHARINDEX('\', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName,
                   CHARINDEX('\', U.sUserName) + 1,
                   LEN(U.sUserName) - CHARINDEX('\', U.sUserName))
  WHEN CHARINDEX('@', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName, 1, CHARINDEX('@', U.sUserName) - 1)
  ELSE U.sUserName
END AS UserName,
CASE A.ActivityCode
WHEN 75 THEN '.XML'
ELSE '.TXT'
END AS FileExtension
FROM %APPNAME%_TASK_AUDIT AS A INNER JOIN
HSV_ACTIVITY_USERS AS U ON A.ActivityUserID = U.lUserID
WHERE (A.ActivityCode IN (1, 21, 23, 26, 75))
ORDER BY A.StartTime

Click here to download.

The attachment data in the database is held in XML format for the Metadata Load Differences log and in text file format for all the other logs. The script removes any domain names from the username as the \ character causes a problem when creating the output file.
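For clarity, the same clean-up can be expressed as a small Python helper (the function name is mine; the logic mirrors the CASE expression in the SQL above):

```python
def clean_username(raw):
    """Strip a Windows domain prefix (DOMAIN\\user) and an email-style
    suffix (user@corp.com) from an HFM activity user name, so the name
    is safe to use in a file name."""
    name = raw.rsplit("\\", 1)[-1]   # drop anything up to the last backslash
    name = name.split("@", 1)[0]     # drop anything from the first @ onwards
    return name
```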

Change %APPNAME% to the name of your HFM application and save the script to a file (e.g. SQLScript.sql). You then use sqlcmd to execute the script and put the output in a temporary file something like this:

sqlcmd -S SQLServer -E -d SQLDatabase -i SQLScript.sql -h-1 -o "Dir\OutFile.txt" -s "|" -W

This will create a file with an audit record on each line:

TA2_HeaderFile

The first field contains the audit record date, then the time, the audit record id, the type of audit record, the person who executed it and finally the file type: six columns in all, separated by |. Now you read each record and query the attachments table based on the third column, the id. You can do this with VBScript or a batch file. Here it is as a batch file:

REM Each line has 6 columns of data. Put them into A,B,C,D etc
for /F "tokens=1,2,3,4,5,6 delims=|" %%A in (Dir\OutFile.txt) do (
echo %%A %%B %%C %%D %%E %%F
if "%%A"=="" (
ECHO Found a blank line...
goto :eof
)
if "%%A" NEQ "" (
REM Create a script file to export a single log record
echo Creating script for guid %%C... >> log.txt
echo bcp "SELECT CONVERT(char(2000), attachment) FROM %APPNAME%_TASK_AUDIT_ATT WHERE guid = '%%C' ORDER BY idx" queryout "%DIR%\%%A_%%B_%%D_%%E%%F" -T -c -C RAW -d SQLDatabase > execme.bat
call execme.bat
)
)
:eof
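If you prefer scripting to batch files, the per-line logic can be sketched in Python. This only covers parsing the |-delimited header file and building the output file name; the database call (bcp or an ODBC query) is omitted, and the function name is hypothetical:

```python
def parse_audit_line(line):
    """Split one |-delimited line from OutFile.txt into its six fields
    and build the output file name for the extracted attachment.

    Field order matches the SQL earlier in the post: date, time, guid,
    activity type, user name, file extension.
    """
    fields = line.rstrip("\n").split("|")
    if len(fields) != 6 or not fields[0]:
        return None  # skip blank or malformed lines
    this_date, this_time, guid, activity, user, ext = fields
    filename = "{}_{}_{}_{}{}".format(this_date, this_time, activity, user, ext)
    return {"guid": guid, "filename": filename}
```

The guid is then used to query the attachments table, exactly as the batch file does with bcp.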

The attachment itself is stored in the attachment column as binary data, or a BLOB (binary large object). Each record holds 2000 characters, and the column idx is incremented so that attachments longer than 2000 characters can be stitched back together.
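The stitching itself is straightforward. Here is a Python sketch (function name mine) which assumes the chunks have already been fetched and decoded to text; the last block is padded, so trailing padding is stripped after reassembly:

```python
def stitch_attachment(chunks):
    """Reassemble an attachment from its 2000-character blocks.

    `chunks` is a list of (idx, text) rows from the _TASK_AUDIT_ATT
    table for one guid; idx gives the order. The final block is padded
    out to 2000 characters, so trailing NULs/spaces are stripped once
    the pieces are joined.
    """
    ordered = [text for idx, text in sorted(chunks)]
    return "".join(ordered).rstrip("\x00 ")
```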

This will produce a separate text file for each log that has an attachment. The file will be in UTF-16 format (Unicode) and there will be padding characters at the end of the file because the text file is being created in blocks of 2,000 characters. The file name will contain the date, time, username and log file type. For some people this is enough. Other people want to have these files printed to PDF.

Clean up the text files

The padding characters at the end of the file are confusing and clumsy so they need to be cleaned up. Here is some VBScript code that will do that (it might need tweaking if the server language is not English):

Const ForReading = 1
Set fso = CreateObject("Scripting.FileSystemObject")

' Check the first two bytes for a UTF-16 LE byte order mark (FF FE)
Set Stream = fso.OpenTextFile(sSourceFile, ForReading, False)
intAsc1Chr = Asc(Stream.read(1))
intAsc2Chr = Asc(Stream.read(1))
Stream.Close
If intAsc1Chr = 255 And intAsc2Chr = 254 Then
OpenAsUnicode = True
Else
OpenAsUnicode = False
End If
Set Stream = fso.OpenTextFile(sSourceFile, ForReading, False, OpenAsUnicode)
' Read file one char at a time and check it is within the ASCII range.
Do While Not Stream.AtEndOfStream
sChar = Stream.read(1)
If OpenAsUnicode = True Then
nCharASCII = AscW(sChar)
Else
nCharASCII = Asc(sChar)
End If
If nCharASCII >= 32 And nCharASCII <= 126 Then
' ASCII character so add it to the output
sText = sText & sChar
ElseIf nCharASCII = 9 Or nCharASCII = 10 Then
' Tab or new line so add to the output
sText = sText & sChar
End If
Loop

Click here to download.

And then print the text to a new file using the username, time, date and filetype to identify what the file contains.
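For comparison, the same clean-up is compact in Python. This mirrors the VBScript above (keep printable ASCII plus tab and newline); `raw` is the decoded file contents, and in Python the UTF-16 detection can be left to the codec by opening the file with `encoding="utf-16"` when the BOM is present:

```python
def clean_text(raw):
    """Keep only printable ASCII (32-126) plus tab (9) and newline (10),
    dropping the padding and any other control characters."""
    keep = []
    for ch in raw:
        code = ord(ch)
        if 32 <= code <= 126 or code in (9, 10):
            keep.append(ch)
    return "".join(keep)
```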

Process the XML file

The Metadata Changes file is in XML format which is not easy to read so you will need to process it into human readable form. You might also want to put a header in the new file using the username, time and date from the other fields in the row.

This is the code I use for converting the XML. You are reading the OutFile.txt to get the name of the XML file as well as the user and date information. These are passed through to this script to be added to the header. It creates a file which is very similar to the output from HFM with a few improvements:

Set NewFile = fso.CreateTextFile(sOutputFile, True, False)
NewFile.WriteLine("Loaded By: " & sUser)
NewFile.WriteLine("Date: " & sFileDate)
NewFile.WriteLine("Time: " & sFileTime)
NewFile.WriteLine("Type: Metadata Load Differences")
NewFile.WriteLine("")
Set xmlDoc = CreateObject("Microsoft.XMLDOM")
xmlDoc.async = "false"
xmlDoc.Load sCleanedXMLFile
Set objRoot = xmlDoc.DocumentElement
For Each childNode In objRoot.ChildNodes
If childNode.BaseName = "DIM" Then
Set attrib = childNode.Attributes
For Each i In attrib
NewFile.WriteLine("Dimension: **  " & i.Value & "  **")
Next
Else
NewFile.WriteLine("Node: " & childNode.BaseName)
End If
ProcessNode childNode, 0, "" ' start the recursion; Counter 0 means root-level attributes are skipped
NewFile.WriteLine("")
Next
NewFile.Close

Function ProcessNode(childNode, Counter, prevTagName)
Dim attrib
Dim i
Dim Nodes
Dim tagName
Dim member
Dim attribName
Dim prevSetting
Dim newSetting
Counter = Counter + 1
If childNode.Attributes Is Nothing Then
Else
If childNode.Attributes.Length > 0 And Counter > 1 Then
Set attrib = childNode.Attributes
For Each i In attrib
select case lcase(i.BaseName)
case "att"
attribname = i.Value
case "old"
prevSetting = i.Value
case "new"
newSetting = i.Value
end select
Next
If prevTagName = "NCHG" Then
NewFile.WriteLine("    Changed aggregation weight from " & prevSetting & " to " & newSetting)
Else
NewFile.WriteLine("  Changed " & attribName & " for member " & member & " from " & prevSetting & " to " & newSetting)
End If
End If
End If
If childNode.ChildNodes.Length > 0 Then
Set Nodes = childNode.ChildNodes
For Each i In Nodes
On Error Resume Next
tagName = vbNullString
tagName = i.tagName
member = vbNullString
member = i.Text
On Error GoTo 0
If len(member) > 0 then
Select Case tagName
Case "MADD"
NewFile.WriteLine("  Added member: " & i.Text)
Case "MCHG"
NewFile.WriteLine("  Changed member: " & i.Text)
Case "MDEL"
NewFile.WriteLine("  Deleted member: " & i.Text)
Case "NADD"
NewFile.WriteLine("  Added parent/child: " & i.Text)
Case "NCHG"
NewFile.WriteLine("  Changed parent/child: " & i.Text)
Case "NDEL"
NewFile.WriteLine("  Deleted parent/child: " & i.Text)
Case Else
'NewFile.WriteLine("  Unknown: " & tagName)
End Select
End If
ProcessNode i, Counter, tagName
Next
End If
End Function

Click here to download.

Print to PDF

The final step is printing to PDF, which normally causes a lot of problems. However, you can simply open the new text file in Word and save it as a PDF. For this to work you need to run your script from a computer which has Microsoft Office installed. I found a script that does exactly that here:

http://mydailyjava.blogspot.co.uk/2013/05/converting-microsoft-doc-or-docx-files.html

That’s it. Now there is a process which can be automated that will extract all audit logs and print them to PDF. The PDF file names contain who did what and when. In the logs themselves the usernames and times have also been added.

HFM Task Audit management

The Task Audit tables in HFM store information on who has done what and when in the application. There are separate tables for each application in the database. These tables get very big if they are not maintained properly and there are a number of ways of managing them. In essence the management process needs to archive (or delete) old records. If the audit table contains too many records, checking the audit logs in the workspace is very slow and sometimes it can even stop HFM from working.

There are two ways of archiving the audit tables: to text file or to a database table.

Text file

Archiving to text file is the most common method. The utility HFMAuditExtractUtility is used to extract and truncate the database tables, but there are problems with this. The first is the sheer amount of data that is stored in the Task Audit table (it is even worse in the Data Audit table). The utility splits the data into blocks of 64,000 rows, and it is not uncommon for there to be five or more such files for a month. Over a year you can accumulate a lot of these extract files, which means they lose relevance and usefulness because they are difficult to query. The reason for archiving this information is in case there are any audit questions, and these usually arise the year after. Typical questions are how many times metadata has been loaded during the year and who loaded it. When there are millions of records spread across various text files, it is not easy to answer them.

Database tables

Another method of archiving the audit table is to store the old information in a database. This has the significant advantage that it is much easier to find the information you want. The disadvantage is that this is a custom development, and it takes longer than you might think to format the data in a manner that is easily understandable and usable.

The database copy can be done either as a trigger or a scheduled event (e.g. monthly). You can set up a trigger so that whenever a record is added to the audit table, it is automatically copied to another (custom) table. Or you can set up an agent of some sort to run once a month to copy the records for the previous month. I prefer the agent method because the trigger method means that you have to amend the HFM database itself. The agent method can be run from a separate database so you are not touching HFM (and thereby removing any possibility of breaking your support agreement).
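The agent's monthly copy is conceptually simple. Here is a minimal sketch using Python and sqlite3 purely for illustration; the real tables live in SQL Server, and the table names (TASK_AUDIT, TASK_AUDIT_ARCHIVE) and the EndTime column follow the naming used in this post, without the application prefix:

```python
import sqlite3

def archive_old_records(conn, cutoff):
    """Copy audit rows older than `cutoff` (an ISO date string) into an
    archive table, then delete them from the live table. The archive
    table is created on first use with the same columns as the source."""
    cur = conn.cursor()
    # Create an empty archive table with the same shape as TASK_AUDIT
    cur.execute("""CREATE TABLE IF NOT EXISTS TASK_AUDIT_ARCHIVE
                   AS SELECT * FROM TASK_AUDIT WHERE 0""")
    cur.execute("INSERT INTO TASK_AUDIT_ARCHIVE SELECT * FROM TASK_AUDIT WHERE EndTime < ?", (cutoff,))
    cur.execute("DELETE FROM TASK_AUDIT WHERE EndTime < ?", (cutoff,))
    conn.commit()
```

In practice you would run the equivalent T-SQL from a scheduled job in a separate database, so the HFM schema itself is never touched.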

HFMAuditExtractUtility

As I mentioned before, this utility extracts and truncates the Task Audit and Data Audit tables. This utility is not available for HFM version 11.1.2.4 but the utility from version 11.1.2.3 does work. There is however a fundamental problem with the utility in 11.1.2.4. The Task Audit process in 11.1.2.4 saves the log files for application element loads (Rules, Metadata, Member Lists and Security) against the Task Audit record. From Consolidation Administration you can click the chain icon to download the log file.

TA_TaskAuditLog

These log files are held in a separate database table which is not touched by the Audit Extract Utility, so if you use the utility, it does not extract the log files. Once the audit table has been cleared, the ability to download the logs is lost.

Scripts

So what can you do? You want to clean out the task audit table but keep some of the records, or extract the log details somehow. A simple way of accomplishing this is to use the Audit Extract Utility to extract only, and then delete the records you don't want from the task audit table. The idea is that the task audit table will contain all records for, say, the previous three months, and for older periods only those records which have an associated log. That means you will have an audit of all loads of application elements, and for all other events only the most recent. There should not be too many application element loads, because you are unlikely to update your production application more than once a month.

The SQL script that will do this is as follows. This will only work for SQL Server because Oracle does not have the DATEDIFF function and you will need to use SYSDATE instead of GETDATE (my client uses a SQL Server database):

DELETE FROM %HFMAPPNAME%_TASK_AUDIT
WHERE ActivityCode NOT IN (1, 21, 23, 26, 75)
AND (DATEDIFF(MONTH, EndTime, GETDATE()) > %NOMONTHS%)

%HFMAPPNAME% needs to be replaced with the name of your HFM application. %NOMONTHS% is the number of months of task audit data that you want to keep, e.g. 3.

‘ActivityCode NOT IN (1, 21, 23, 26, 75)’ means that the load of Rules (1), Metadata (21), Member Lists (23) and Security (26) and the Metadata Load Differences (75) will be retained. The list of these activity codes is contained in the file HFMAuditExtractSchemaDefs.xml:

TA_Codes

This XML file has not been updated for 11.1.2.4, however, and the new activity code 75 is not in it. I worked that one out by looking at the database table, because I can't find any reference to these activity codes anywhere.
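For reference, the mapping worked out above can be captured in a small lookup. This is a convenience of mine, not an official Oracle list (code 75 in particular is inferred from the database, as just described):

```python
# Activity codes retained by the DELETE script above.
ACTIVITY_CODES = {
    1: "Rules",
    21: "Metadata",
    23: "Member List",
    26: "Security",
    75: "Metadata Load Differences",  # inferred; not in HFMAuditExtractSchemaDefs.xml
}

def retained(code):
    """True if a task audit record with this activity code is kept by the purge."""
    return code in ACTIVITY_CODES
```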

You can either automate this delete command in the database or via a batch file using sqlcmd (SQL Server) or sqlplus (Oracle). This will retain in the Task Audit table all the information about changes to the application elements and remove all the other records. This script combined with the Audit Extract Utility should give you the visibility of the important audit records while keeping the whole audit table trim and quick to respond.