OneStream – Error messages with debug information

One of the frustrations of coding VB.NET in OneStream is that, while the error messages contain a lot of information, only the first line or so is useful (and that is putting a positive spin on it). Consequently we tend to add debug information which is printed to the error log and then commented out or removed when the script goes live. Sometimes you will see a debug flag that controls whether these messages are printed to the log. I think that is a bit messy: first you get a load of messages in the log, and secondly if the script is already in production there might be audit problems, you might not have access, etc.

With all that in mind, I decided to see if I could find a better way of presenting more information when an error occurs. I wanted a method that gave me more information about what was happening when the error occurred, did not fill up the error log with messages, was easy to manage and did not have a large impact on performance.

The way I now code a script is:
1. Create a variable that will hold debug information, e.g. Dim CurrentPosition As String.
This variable must be created BEFORE the Try block.
2. Update this variable throughout the script, e.g. CurrentPosition = "Getting the XFFolderEx object..."
3. Write that to the error log, e.g. BRApi.ErrorLog.LogMessage(si, CurrentPosition)

The next step is to add this ‘CurrentPosition’ information to any errors that are thrown. I do that by creating a new instance of the exception object and adding the CurrentPosition variable to the end of the exception message:

Catch ex As Exception
    Dim MHEx As Exception = Activator.CreateInstance(ex.GetType(), ex.Message & Environment.NewLine & "ProcName ERROR - Position: " & CurrentPosition & Environment.NewLine, ex)
    Throw ErrorHandler.LogWrite(si, New XFException(si, MHEx))

When the script is working I then comment out all the lines that print to the error log, leaving the variable assignments in place.
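For example, once the script is live the pattern ends up looking like this (the message text here is just illustrative):

CurrentPosition = "Getting the XFFolderEx object..."
'BRApi.ErrorLog.LogMessage(si, CurrentPosition)   'Commented out now the script is live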

Here is a simple example. This piece of code will throw an error because the script is trying to store a value too large for the Int16 variable:

Dim varNoDaysSmall As Int16 = 10
Dim varNoDaysLarge As Integer = 1000

For tempNumber As Int16 = 1 To 10
  varNoDaysSmall *= tempNumber
  varNoDaysLarge *= tempNumber
Next

This is the error you get (running from an Extender type Rule):

Now add the CurrentPosition debug information and the updated exception message:

For tempNumber As Int16 = 1 To 10
  CurrentPosition = String.Format("varNoDaysSmall value [{0}] tempNumber [{1}]", varNoDaysSmall.ToString, tempNumber.ToString)
  varNoDaysSmall *= tempNumber
  CurrentPosition = String.Format("varNoDaysLarge value [{0}] tempNumber [{1}]", varNoDaysLarge.ToString, tempNumber.ToString)
  varNoDaysLarge *= tempNumber
Next

And this is the error message you get:

This is just a simple example of getting debug information into an error message: no clutter in the error log, very little overhead, and it is simple to implement. I hope you find it useful – over the past few months it has saved me a huge amount of time debugging errors.

Here is the full demo script:

Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
  Dim CurrentPosition As String = "Start"
  Try
    Dim varNoDaysSmall As Int16 = 10
    Dim varNoDaysLarge As Integer = 1000

    For tempNumber As Int16 = 1 To 10
      CurrentPosition = String.Format("varNoDaysSmall value [{0}]", varNoDaysSmall.ToString)
      varNoDaysSmall *= tempNumber
      CurrentPosition = String.Format("varNoDaysLarge value [{0}]", varNoDaysLarge.ToString)
      varNoDaysLarge *= tempNumber
    Next

  Catch ex As Exception
    Dim MHEx As Exception = Activator.CreateInstance(ex.GetType(), ex.Message & Environment.NewLine & "DemoError ERROR - Position: " & CurrentPosition & Environment.NewLine, ex)
    Throw ErrorHandler.LogWrite(si, New XFException(si, MHEx))

  End Try
End Function

Task Audit Extract alternative

In some of my earlier articles I have talked about managing the Task Audit information in HFM. The main point of those articles is that the Audit Extract Utility (AEU) no longer meets the requirements for a robust and complete task audit. The AEU is still fine for extracting the data audit information, just not the task audit. To recap, the two main problems with the AEU are that the detailed logs of application changes (i.e. metadata, member list and rules loads) are not extracted and that the last logon information is deleted. In this article I will go through an alternative to the AEU for managing the Task Audit data. It is customisable, so that application changes and logon information are preserved, and you can ignore some events such as Logoff.

Process Overview

The key to this new Task Audit extract process is a SQL statement which selects and formats the data. This SQL statement is then automated by calling it from a batch file or from a trigger/job in the relational database. The automation process collects the output from the SQL statement and writes it to a file.

Note: I have only tested this on SQL Server. For Oracle databases, the REPLACE function should work. You will need to use something like the to_date function instead of MONTH in the WHERE clause.

SQL Statement

This is a basic SQL statement to extract the data. If you look in the log file that the AEU creates, you will see something similar to this:

SELECT U.sUserName, a.ActivityCode, a.ServerName, a.AppName,
a.StartTime, a.EndTime, a.strModuleName
FROM @APPL@_TASK_AUDIT AS a INNER JOIN
HSV_ACTIVITY_USERS AS U WITH (NOLOCK) ON a.ActivityUserID = U.lUserID

Note: you need to change @APPL@ to the name of your HFM application.

This gives you data something like this:

[Screenshot: the raw extracted task audit data]

While this has extracted the data, we need to make a few changes to make it more readable:

  1. change the ActivityCode to the activity description (e.g. Logon, Logoff)
  2. format the StartTime and/or EndTime to an actual date
  3. change the strModuleName to the module description.

There is a link at the bottom of the page to download the SQL statement.

StartTime/EndTime

StartTime and EndTime are stored as numbers, so converting them to dates depends on your database. For SQL Server you can use the DATEADD function:

DATEADD(day, -2, a.StartTime)

The dates need adjusting by 2 days because the times are stored as OLE Automation date serials (which count from 30 December 1899 and include the fictitious 29 February 1900), while SQL Server counts from 1 January 1900. Check a known event on your own server first. For Oracle you use TO_DATE. This will convert the numbers 42907.877962963 etc. to something like 2017-07-10 15:10:43.000.

You can now play around with the display of the StartTime and EndTime to get the format you want. I prefer to split out the date and time. This will display the start time as YYYYMMDD (zero-padded):

CONVERT(VARCHAR(4), DATEPART(yyyy, DATEADD(dd, -2, a.StartTime)))
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(month, DATEADD(dd, -2, a.StartTime))), 2)
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(dd, DATEADD(dd, -2, a.StartTime))), 2)

And this displays the time element of the start time as HHMM:

RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(hour, DATEADD(dd, -2, a.StartTime))), 2)
+ RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(mi, DATEADD(dd, -2, a.StartTime))), 2)
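If you only need the date part, a more compact equivalent is to cast the serial number to a date and format it directly (a sketch, assuming SQL Server; CONVERT style 112 produces YYYYMMDD):

-- Adjust the serial by 2 days, cast it to DATETIME and format as YYYYMMDD
CONVERT(VARCHAR(8), CAST(a.StartTime - 2 AS DATETIME), 112)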

ActivityCode

There is no table in the HFM database that contains the descriptions for the ActivityCode column, so we will have to use a CASE expression. You could create a table in your database to hold the activity codes and their descriptions if you prefer. The CASE syntax is like this:

CASE a.ActivityCode
WHEN  0 THEN 'Idle'
WHEN  1 THEN 'Rules Load'
WHEN  2 THEN 'Rules Scan'
WHEN  3 THEN 'Rules Extract'
……
END AS Activity,

ModuleName

The module name is similar to ActivityCode in that there is no table containing the descriptions, but it has an extra complication: all of the records have #@RID@# at the beginning. We could keep that piece of text in the CASE statement but I prefer to remove it:

CASE REPLACE(a.strModuleName, '#@RID@#', '')
WHEN  0 THEN 'ADMINISTRATION'
WHEN  1 THEN 'CELL_HISTORY'
WHEN  2 THEN 'CLOSE_APPLICATION'
WHEN  3 THEN 'CONSOLIDATION'
……
ELSE a.strModuleName
END AS Module

WHERE clause

We don’t want to extract all the Task Audit data every month so we need a WHERE clause to restrict what data is extracted. This piece of code will only include data that has a start time within the month four calendar months ago. So if we are in November now, it will only include records for July.

WHERE DATEDIFF(MONTH, DATEADD(day, -2, a.StartTime), GETDATE()) = 4

You could also include exclusions on the ActivityCode (e.g. Logoff is 31) if you wanted.
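For example, adding this to the WHERE clause drops the Logoff events:

AND a.ActivityCode <> 31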

Automation

Save your SQL statement to a file and call it using the sqlcmd statement e.g.:

sqlcmd -S %Server% -E -d %DB% -i %Script% -o "%Output%" -s "|" -W

The items between the % symbols are batch variables e.g.

set Script=getActivity.sql

That will extract the Task Audit data to a file. You will then probably add a statement to delete the records from Task Audit for the same month. And for the delete statement you will add some exceptions such as

WHERE ActivityCode NOT IN (1, 21, 23, 26, 29, 75)

This will keep all the application changes and logon information.
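As a sketch, the delete statement might look like this (test it carefully before automating it):

DELETE FROM @APPL@_TASK_AUDIT
WHERE DATEDIFF(MONTH, DATEADD(day, -2, StartTime), GETDATE()) = 4
AND ActivityCode NOT IN (1, 21, 23, 26, 29, 75)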

You can download the full SQL statement here. It has the extension .key but is a text file: Download GetActivity SQL statement

Conclusion

With this SQL statement, you can have an automated process to extract and manage the Task Audit data. The data that is extracted and deleted is customisable. This means that you can keep your Task Audit data to a minimum while keeping a full year’s data for the important audit records. Information on all application changes is kept, as well as the logon information for all users.

Last Logon in HFM

Have you ever wondered if any users have stopped using HFM and forgotten to tell you? You could trawl through the audit logs of who has logged on and match that against a checklist of your users. It might take a while. Or you could write a script that queries the database.

This code will only work for SQL Server. You will need to change @APPLNAME@ to the name of your HFM application.

SELECT U.sUserName AS UserName,
MIN(DATEDIFF(DAY, A.StartTime, GETDATE()) + 2) AS NumberDaysSinceLastLogon
FROM @APPLNAME@_TASK_AUDIT AS A INNER JOIN
HSV_ACTIVITY_USERS AS U ON A.ActivityUserID = U.lUserID
WHERE (A.ActivityCode = 29)
GROUP BY U.sUserName
ORDER BY U.sUserName

It will produce two columns: UserName and the number of days since the last logon. This includes when a user has logged on via Smart View.

Save this piece of code to a file (e.g. GetLastLogon.sql) and reference that in the variable %Script% like this:

REM Name of the script to execute
set Script=GetLastLogon.sql
REM Name of the file for the output of the records
set OutputFile=LastLogon.csv

then call it from a batch file like this:

REM Parameters:
REM   -s "," : comma is the delimiter used in the output file
REM   -W     : remove trailing white space
REM This is the syntax if you are using a SQL User and password
REM sqlcmd -S %SQLServer% -U %SQLUser% -P %SQLPW% -d %SQLDB% -i %Script% -o "%OutputFile%" -s "," -W
REM This is the syntax if you are using a trusted user
sqlcmd -S %SQLServer% -E -d %SQLDB% -i %Script% -o "%OutputFile%" -s "," -W

Either set up variables for the other %...% values (SQLServer, SQLDB, etc.) or replace them directly with your own values. You can now schedule this to run once a month as part of your admin tasks.
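For example (the server and database names here are hypothetical):

set SQLServer=MyHFMSQLServer
set SQLDB=HFMDatabase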

Remember that this will only work for the periods in the task audit table. If you have purged the task audit table, then you have lost the information on who logged on in the purged periods. I normally recommend keeping at least 3 months of audit data. If you run this script before purging the audit tables, you will be checking the previous 4 months, which is enough to identify who is no longer using HFM.

Task Audit in HFM part 2

The auditing of application changes in HFM has improved recently with the introduction of the metadata load changes feature. This is an excellent enhancement but it does have its problems. In my previous blog I talked about how the Audit Extract Utility has not been updated to manage the metadata load changes audit log and how to mitigate the problems. In this blog I will look at the metadata load changes audit records in more detail. Note that the SQL used in the scripts works on SQL Server only; it will not work on Oracle databases.

Metadata Load Changes

The Metadata Load Changes feature is accessed from the Navigate menu: Administer | Consolidation Administration | Audit | Tasks. There is a log for each metadata load and a matching log for the Metadata Load Differences. Against both logs there is a link icon. When you click on the icon, it prompts you to save the log as a text file with the default name Attachment.txt.

What should be in the audit

Firstly I want to talk about what I think an audit log should contain and how it should be presented. In short, an audit log must tell you who made what changes at what time and why they made them. And the audit must be complete (i.e. no significant gaps) with no means of changing it (i.e. not a text or Excel file etc.).

The Task Audit view through Consolidation Administration gives us some of this information (who and when). And if you then go into each of the links you will get the ‘what has changed’. Compare that list with your change request logs and you will get the ‘why’.

Ideally you should also be able to answer questions such as ‘show me the changes in metadata that have been made to the account ABC’. You could do it, but it would take time to go through the Task Audit logs and save each one with a meaningful file name.

How does HFM audit compare?

On the face of it the auditing in HFM shapes up really well. It’s only when you look at it in more detail that you realise there are some problems. A significant problem is the output for the audit logs. You have to select each log one at a time, save it and change the default filename, Attachment.txt. It’s a text file so there’s nothing to stop you from changing the contents. The log information doesn’t contain who made the change or the date, so you have to either edit the file or change the filename to hold that information. This is very poor from an audit perspective.

But there’s more. If you look in the HFM Admin guide and check which characters are invalid for the member label, you will notice that the ampersand (&) is not in the list. So, for example, an account called R&D is valid. The ampersand is not valid in descriptions or in Cell Text labels. But then in the Loading Metadata section, the manual says:

Note: Do not use the ampersand character ( & ) in a metadata file. If you do, an error occurs.

I can tell you that the error (whatever it is) does not show itself when you load metadata. And I have seen applications happily using the ampersand in member labels. I can also tell you that it does cause an error when you view the task audit logs. If there is an ampersand in a member label, when you click the link in Consolidation Administration the audit log will display everything up to, but not including, the ampersand. Any changes after the ampersand are not in the log. If the ampersand is in something like a user-defined field, it does not cause a problem. I don’t know if any other characters cause the same problem. The safest route, then, is not to use the ampersand anywhere and to treat any other non-alphanumeric characters with great care.

In summary then, the HFM Task Audit captures all the changes (probably), in text file format and you will have to edit the files if you want to save them offline. ‘Room for improvement’ as my better school reports used to say.

An alternative

The Task Audit detail information in the database is complete regardless of whether there is an ampersand. The ampersand only causes a problem when you try to download it. So you can potentially write a script to extract the Task Audit detail. And while you are at it, add the missing user and date/time information and save it as a PDF. And it just so happens, I have done exactly that.

Script overview

The output is going to be a PDF file for each metadata type load (i.e. metadata, rules, member lists and security) with the username, date and time in the file and filename. The first step is to get a list of the logs which have an attachment; the second step is to read that list and create the attachment for the log entry.

The Task Audit logs are held in two tables: APPNAME_TASK_AUDIT and APPNAME_TASK_AUDIT_ATT.

This is the first table (TASK_AUDIT):

And the second (attachments) table (TASK_AUDIT_ATT):

The first table contains the information you see in Consolidation Administration; there is a single database record for each log entry. The second table holds the information that is used to create the attachments; there are one or more records for each log entry. We want a combination of data from these two tables (and we also want data from the user table to convert the user name). The script reads the first table and picks off the username, date, time and attachment id. It then reads the attachment table, exports the records for a single log entry, and saves the output with the username, date, time and audit type in the file name.

This is the SQL statement that is used as the basis for creating the exported audit logs:

SELECT CONVERT(VARCHAR(4), DATEPART(yyyy, DATEADD(dd, -2, A.StartTime)))
  + RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(month, DATEADD(dd, -2, A.StartTime))), 2)
  + RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(dd, DATEADD(dd, -2, A.StartTime))), 2) AS ThisDate,
RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(hour, DATEADD(dd, -2, A.StartTime))), 2)
  + RIGHT('00' + CONVERT(VARCHAR(2), DATEPART(mi, DATEADD(dd, -2, A.StartTime))), 2) AS ThisTime,
A.strGUID,
CASE A.ActivityCode
  WHEN 1 THEN 'Rules'
  WHEN 21 THEN 'Metadata'
  WHEN 23 THEN 'Member List'
  WHEN 26 THEN 'Security'
  WHEN 75 THEN 'Metadata Load Diffs'
END AS Activity,
-- Strip the domain (before '\') and/or realm (after '@') from the user name
CASE
  WHEN CHARINDEX('\', U.sUserName) > 0 AND CHARINDEX('@', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName,
                   CHARINDEX('\', U.sUserName) + 1,
                   CHARINDEX('@', U.sUserName) - CHARINDEX('\', U.sUserName) - 1)
  WHEN CHARINDEX('\', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName,
                   CHARINDEX('\', U.sUserName) + 1,
                   LEN(U.sUserName) - CHARINDEX('\', U.sUserName))
  WHEN CHARINDEX('@', U.sUserName) > 0
    THEN SUBSTRING(U.sUserName, 1, CHARINDEX('@', U.sUserName) - 1)
  ELSE U.sUserName
END AS UserName,
CASE A.ActivityCode
  WHEN 75 THEN '.XML'
  ELSE '.TXT'
END AS FileExtension
FROM %APPNAME%_TASK_AUDIT AS A INNER JOIN
HSV_ACTIVITY_USERS AS U ON A.ActivityUserID = U.lUserID
WHERE (A.ActivityCode IN (1, 21, 23, 26, 75))
ORDER BY A.StartTime

Click here to download.

The attachment data in the database is held in XML format for the Metadata Load Differences log and in text file format for all the other logs. The script removes any domain names from the username as the \ character causes a problem when creating the output file.

Change %APPNAME% to the name of your HFM application and save the script to a file (e.g. SQLScript.sql). You then use sqlcmd to execute the script and put the output in a temporary file something like this:

sqlcmd -S SQLServer -E -d SQLDatabase -i SQLScript.sql -h-1 -o "Dir\OutFile.txt" -s "|" -W

This will create a file with an audit record on each line:

[Screenshot: the output file with one audit record per line]

The first field contains the audit record date, then the time, the audit record id, the type of audit record, the person who executed it and finally the file type. Six columns in all, separated by |. Now you read each record and query the attachments table based on the third column, the id. You can do this through VBScript or batch files. Here it is as a batch file:

REM Each line has 6 columns of data. Put them into %%A to %%F
for /F "tokens=1,2,3,4,5,6 delims=|" %%A in (Dir\OutFile.txt) do (
  echo %%A %%B %%C %%D %%E %%F
  if "%%A"=="" (
    ECHO Found a blank line...
    goto :eof
  )
  if "%%A" NEQ "" (
    REM Create a script file to export a single log record.
    REM Note the bcp command must be written out as a single line.
    echo Creating script for guid %%C... >> log.txt
    echo bcp "SELECT CONVERT(char(2000), attachment) FROM %APPNAME%_TASK_AUDIT_ATT WHERE guid = '%%C' ORDER BY idx" queryout "%DIR%\%%A_%%B_%%D_%%E%%F" -T -c -C RAW -d SQLDatabase > execme.bat
    call execme.bat
  )
)
:eof

The attachment itself is stored in the attachment column as binary data or a BLOB (binary large object). Each record holds up to 2,000 characters, and the column idx is incremented so that attachments longer than 2,000 characters can be stitched back together.

This will produce a separate text file for each log that has an attachment. The file will be in UTF-16 format (Unicode) and there will be padding characters at the end of the file because the text file is being created in blocks of 2,000 characters. The file name will contain the date, time, username and log file type. For some people this is enough. Other people want to have these files printed to PDF.

Clean up the text files

The padding characters at the end of the file are confusing and clumsy so they need to be cleaned up. Here is some VBScript code that will do that (it might need tweaking if the server language is not English):

Const ForReading = 1
Set fso = CreateObject("Scripting.FileSystemObject")
sText = ""

' Check the byte-order mark to see whether the file is Unicode (FF FE)
Set Stream = fso.OpenTextFile(sSourceFile, ForReading, False)
intAsc1Chr = Asc(Stream.read(1))
intAsc2Chr = Asc(Stream.read(1))
Stream.Close
If intAsc1Chr = 255 And intAsc2Chr = 254 Then
    OpenAsUnicode = True
Else
    OpenAsUnicode = False
End If

Set Stream = fso.OpenTextFile(sSourceFile, ForReading, False, OpenAsUnicode)
' Read file one char at a time and check it is within the ASCII range.
Do While Not Stream.AtEndOfStream
    sChar = Stream.read(1)
    If OpenAsUnicode = True Then
        nCharASCII = AscW(sChar)
    Else
        nCharASCII = Asc(sChar)
    End If
    If nCharASCII >= 32 And nCharASCII <= 126 Then
        ' Printable ASCII character so add it to the output
        sText = sText & sChar
    ElseIf nCharASCII = 9 Or nCharASCII = 10 Then
        ' Tab or new line so add to the output
        sText = sText & sChar
    End If
Loop

Click here to download.

And then print the text to a new file using the username, time, date and filetype to identify what the file contains.

Process the XML file

The Metadata Changes file is in XML format, which is not easy to read, so you will need to process it into human-readable form. You might also want to put a header in the new file using the username, time and date from the other fields in the row.

This is the code I use for converting the XML. You are reading the OutFile.txt to get the name of the XML file as well as the user and date information. These are passed through to this script to be added to the header. It creates a file which is very similar to the output from HFM with a few improvements:

Set NewFile = fso.CreateTextFile(sOutputFile, True, False)
NewFile.WriteLine("Loaded By: " & sUser)
NewFile.WriteLine("Date: " & sFileDate)
NewFile.WriteLine("Time: " & sFileTime)
NewFile.WriteLine("Type: Metadata Load Differences")
NewFile.WriteLine("")

Set xmlDoc = CreateObject("Microsoft.XMLDOM")
xmlDoc.async = "false"
xmlDoc.Load sCleanedXMLFile
Set objRoot = xmlDoc.DocumentElement
For Each childNode In objRoot.ChildNodes
    If childNode.BaseName = "DIM" Then
        Set attrib = childNode.Attributes
        For Each i In attrib
            NewFile.WriteLine("Dimension: **  " & i.Value & "  **")
        Next
    Else
        NewFile.WriteLine("Node: " & childNode.BaseName)
    End If
    ProcessNode childNode, 0, ""
    NewFile.WriteLine("")
Next
NewFile.Close

Function ProcessNode(childNode, Counter, prevTagName)
    Dim attrib
    Dim i
    Dim Nodes
    Dim tagName
    Dim member
    Dim attribName
    Dim prevSetting
    Dim newSetting

    Counter = Counter + 1
    If Not childNode.Attributes Is Nothing Then
        If childNode.Attributes.Length > 0 And Counter > 1 Then
            Set attrib = childNode.Attributes
            For Each i In attrib
                Select Case LCase(i.BaseName)
                    Case "att"
                        attribName = i.Value
                    Case "old"
                        prevSetting = i.Value
                    Case "new"
                        newSetting = i.Value
                End Select
            Next
            If prevTagName = "NCHG" Then
                NewFile.WriteLine("    Changed aggregation weight from " & prevSetting & " to " & newSetting)
            Else
                NewFile.WriteLine("  Changed " & attribName & " for member " & member & " from " & prevSetting & " to " & newSetting)
            End If
        End If
    End If

    If childNode.ChildNodes.Length > 0 Then
        Set Nodes = childNode.ChildNodes
        For Each i In Nodes
            On Error Resume Next
            tagName = vbNullString
            tagName = i.tagName
            member = vbNullString
            member = i.Text
            On Error GoTo 0
            If Len(member) > 0 Then
                Select Case tagName
                    Case "MADD"
                        NewFile.WriteLine("  Added member: " & i.Text)
                    Case "MCHG"
                        NewFile.WriteLine("  Changed member: " & i.Text)
                    Case "MDEL"
                        NewFile.WriteLine("  Deleted member: " & i.Text)
                    Case "NADD"
                        NewFile.WriteLine("  Added parent/child: " & i.Text)
                    Case "NCHG"
                        NewFile.WriteLine("  Changed parent/child: " & i.Text)
                    Case "NDEL"
                        NewFile.WriteLine("  Deleted parent/child: " & i.Text)
                    Case Else
                        'NewFile.WriteLine("  Unknown: " & tagName)
                End Select
            End If
            ProcessNode i, Counter, tagName
        Next
    End If
End Function

Click here to download.

Print to PDF

The final step is printing to PDF and normally that causes a lot of problems. However you can simply open the new text file in Word and save it as a PDF. For this to work you need to run your script from a computer which has Microsoft Office installed. I found a script that does exactly that here:

http://mydailyjava.blogspot.co.uk/2013/05/converting-microsoft-doc-or-docx-files.html
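If you prefer to keep everything in one script, here is a minimal VBScript sketch of the same idea (assuming Word 2010 or later is installed; 17 is the wdFormatPDF constant and the paths are illustrative):

' Open a text file in Word and save it as PDF
Dim oWord, oDoc
Set oWord = CreateObject("Word.Application")
oWord.Visible = False
Set oDoc = oWord.Documents.Open("C:\Temp\AuditLog.txt")
oDoc.SaveAs2 "C:\Temp\AuditLog.pdf", 17
oDoc.Close False
oWord.Quit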

That’s it. Now there is a process which can be automated that will extract all audit logs and print them to PDF. The PDF file names contain who did what and when. In the logs themselves the usernames and times have also been added.

HFM Task Audit management

The Task Audit tables in HFM store information on who has done what and when in the application. There are separate tables for each application in the database. These tables get very big if they are not maintained properly and there are a number of ways of managing them. In essence the management process needs to archive (or delete) old records. If the audit table contains too many records, checking the audit logs in the workspace is very slow and sometimes it can even stop HFM from working.

There are two ways of archiving the audit tables: to text file or to a database table.

Text file

Archiving to text file is the most common method. The utility HFMAuditExtractUtility is used to extract and truncate the database tables, but there are problems with this. The first is the sheer amount of data that is stored in the Task Audit table (it is even worse in the Data Audit table). The utility splits the data in the tables into blocks of 64,000 rows and it is not uncommon for there to be 5+ such files for a month. Over the year you can accumulate a lot of these extract files, which means that they lose relevance and usefulness because they are difficult to query. The reason for archiving this information is in case there are any audit questions, and these questions usually arise the year after. Typical questions are how many times metadata has been loaded during the year and who did it. When there are millions of records spread across various text files, it is not easy to answer some of these audit questions.

Database tables

Another method of archiving the audit table is to store the old information in a database. This has the significant advantage of being much easier to find the information you want. It has the disadvantage that this is a custom development and it takes longer than you think to format the data in a manner that is easily understandable and useable.

The database copy can be done either as a trigger or a scheduled event (e.g. monthly). You can set up a trigger so that whenever a record is added to the audit table, it is automatically copied to another (custom) table. Or you can set up an agent of some sort to run once a month to copy the records for the previous month. I prefer the agent method because the trigger method means that you have to amend the HFM database itself. The agent method can be run from a separate database so you are not touching HFM (and thereby removing any possibility of breaking your support agreement).

HFMAuditExtractUtility

As I mentioned before, this utility extracts and truncates the Task Audit and Data Audit tables. This utility is not available for HFM version 11.1.2.4 but the utility from version 11.1.2.3 does work. There is however a fundamental problem with the utility in 11.1.2.4. The Task Audit process in 11.1.2.4 saves the log files for application element loads (Rules, Metadata, Member Lists and Security) against the Task Audit record. From Consolidation Administration you can click the chain icon to download the log file.

[Screenshot: a Task Audit entry with the log download (chain) icon]

These log files are held in a separate database table which the Audit Extract Utility does not touch, so if you use the utility the log files are not extracted. Once the audit table has been cleared, the ability to download the logs is lost.

Scripts

So what can you do? You want to clean out the task audit table but keep some of the records or extract the log details somehow. A simple way of accomplishing this is to use the Audit Extract Utility to extract only and then delete the records you don’t want from the task audit table. The idea is that the task audit table will contain all records for the previous three months (for example) and then for older periods it will only contain those records which have an associated log. That means you will have an audit of all the loads of application elements and for all other events only the most recent. There should not be too many application element loads because you are unlikely to update your production application more than once a month.

The SQL script that will do this is as follows. This will only work for SQL Server because Oracle does not have the DATEDIFF function and you will need to use SYSDATE instead of GETDATE (my client uses a SQL Server database):

DELETE FROM %HFMAPPNAME%_TASK_AUDIT
WHERE ActivityCode NOT IN (1, 21, 23, 26, 75)
AND (DATEDIFF(MONTH, EndTime, GETDATE()) > %NOMONTHS%)

%HFMAPPNAME% needs to be replaced with the name of your HFM application. %NOMONTHS% is the number of months of task audit data that you want to keep, e.g. 3.

‘ActivityCode NOT IN (1, 21, 23, 26, 75)’ means that the load of Rules (1), Metadata (21), Member Lists (23) and Security (26) and the Metadata Load Differences (75) will be retained. The list of these activity codes is contained in the file HFMAuditExtractSchemaDefs.xml:

[Screenshot: the activity codes listed in HFMAuditExtractSchemaDefs.xml]

This XML file has not been updated for 11.1.2.4, however, and the new activity code 75 is not in it. I worked that one out by looking at the database table because I can’t find any reference to these activity codes anywhere.

You can either automate this delete command in the database or via a batch file using sqlcmd (SQL Server) or sqlplus (Oracle). This will retain in the Task Audit table all the information about changes to the application elements and remove all the other records. This script combined with the Audit Extract Utility should give you the visibility of the important audit records while keeping the whole audit table trim and quick to respond.
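As a sketch, the batch file version is just a variation on the earlier extract call (the names here are hypothetical):

REM Change these to match your environment
set SQLServer=MyHFMSQLServer
set SQLDB=HFMDatabase
set Script=DeleteOldTaskAudit.sql
sqlcmd -S %SQLServer% -E -d %SQLDB% -i %Script%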


FDMEE – Mapping based on the data amount

In my previous blog I talked about Lookup Dimensions and how they can be used to simplify mapping by storing information that is relevant to a small subset of accounts. In this blog I want to talk about how to use Lookup Dimensions to store calculated data. By ‘calculated’ I mean data that is not in the source, such as using the Target member as a Source. See Francisco’s excellent blog here for an example. I will be going through an example of how to use a Lookup Dimension so that you can easily map based on the data amount.

Mapping requirement

It is quite a common FDMEE requirement to want to map based on the data amount. For example, if the value is positive it should go to the debtor account and if it’s negative to the creditor account. I have blogged about this before and all of the methods had a drawback of some sort. I think I have now come up with a really simple and effective method using Lookup Dimensions.

Before I go into the detail I want to lay down the ground rules:

  • Easy for the users to manage
  • The map must be contained in the Export All Dimensions to text file extract
  • No manipulating of the source labels
  • Editable in the workspace
  • Preferably not through Logic Accounts
  • Minimal scripting

Multi-Dim maps

Up until recently my preferred method was to use a Multi-Dim map and specify that the ‘dimension’ to check was AMOUNT so something like this:

#MULTIDIM ACCOUNT=[228*] AND AMOUNT >= 0

This works really well. The main drawback is that you can’t edit this through the workspace because it does not recognise AMOUNT as a dimension – the dimension selection box is blank. This means you have to manage the maps outside FDMEE. I have always strongly advised my clients to maintain their maps outside FDMEE anyway but it would be better if the maps could be edited either way.

Method overview

This new method involves creating a Lookup Dimension and updating it with text that identifies if the amount is positive or negative. First of all I will go through a high-level overview of the steps; the detail of how to do it follows:

  • Add a Lookup Dimension to the Target Application.
  • Add a Field against the Lookup Dimension on the Import Format. You can specify any field – it doesn’t matter which. If you don’t do this step you cannot use the Lookup Dimension in Multi-Dim maps.
  • Write an Event script to populate the Lookup Dimension with some appropriate text to identify whether the amount is positive or negative.

Add the Lookup Dimension

From the Setup tab select Register | Target Application. Then select the Target Application from the Target Application Summary area. Click Add and fill in the boxes. I have called my Lookup Dimension AmountSign and assigned it to UD9 (remember this – we will need it later):

[Screenshot: the Target Application dimension details with AmountSign assigned to UD9]

Update the Import Format

The Import Format will contain the new Lookup Dimension but it won’t have a Field assigned to it. I have assigned the Field Number 1 but it could be anything:

[Screenshot: the Import Format with Field Number 1 assigned to AmountSign]

Event Script

Now for the bit that does the actual work. Once the data has been imported, the Event script will execute a SQL command to update the value of the AmountSign Lookup Dimension dependent upon whether the data amount is positive or negative. Notice the script needs to reference the Data Table Column Name from earlier (UD9):

# Lookup Dimension
LOOKUP_DIM = 'UD9'
# Text for a Positive number
POSITIVE_NUMBER = '+'
# Text for a Negative number
NEGATIVE_NUMBER = '-'

# SQL to update the LookUp Dimension value
SQL_TEMPLATE = """UPDATE TDATASEG_T SET %s = '%s' WHERE LOADID = ? """
# AND clause for the positive amount.
SQL_AND = """AND AMOUNT > 0"""

# Put the load id into the parameter list
lstParams = [fdmContext["LOADID"]]

# First the Negative - set everything to negative.
SQL = SQL_TEMPLATE % (LOOKUP_DIM, NEGATIVE_NUMBER)
fdmAPI.executeDML(SQL, lstParams, True)

# Now the positive so add the AND clause.
SQL = SQL_TEMPLATE % (LOOKUP_DIM, POSITIVE_NUMBER) + SQL_AND
fdmAPI.executeDML(SQL, lstParams, True)

The Event Script I selected is AftImport but there are others you can choose. Look in the Admin Guide or see here for more information on what Event Scripts are available. You will obviously want a script that is executed before the maps are processed. You will also want to add error handling and more comments.

Notice that the script first sets all records to be the negative value and then updates the records with a positive value – this is slightly quicker in terms of execution. I have deliberately put all the processing into this one script. I could have added the negative value to the Expression column on the Import Format but this would mean that if you wanted to change the positive and negative text (e.g. to P and N) you would have to go to two places to make the changes. This way everything that is needed for the AmountSign dimension is in this one script.

Import the data

This is our source data (Account, ICP, Custom1, Custom2, Custom3, Amount):

[Screenshot: the source data file]

And this is how it looks after the import with the AmountSign dimension. You can see that you now have a piece of text that you can use in maps:

[Screenshot: the imported data with the AmountSign column populated]

Multi-Dim maps

Now I can create a Multi-Dim map that references the AmountSign dimension:

[Screenshot: the Multi-Dim map referencing AmountSign]

This means that the users can now map by the data amount without any scripting or editing of the labels or logic accounts. And the map is contained in the Export All Dimensions.

Taking it one step further…

There are certain types of account that need to be accumulated before they can be mapped, for example VAT. Logic Accounts are usually used to aggregate source data but they have their own set of problems (difficult to trace back to the ledger, need to ignore the source records, auditors don’t like them…). SQL commands can be used to aggregate data. That means that if there is a requirement to aggregate data before applying the map, we might be able to do it in an Event script.

Add a Lookup Dimension (I will call it AmountAccumSign and assign it to UD8) and add it to an Import Format as before. Change the Event Script to this (the new lines have _ACCUM):

# LookUp Dimension
LOOKUP_DIM = 'UD9'
# Lookup dimension for the accumulated amount
LOOKUP_DIM_ACCUM = 'UD8'

# Text for a Positive number
POSITIVE_NUMBER = '+'
# Text for a Negative number
NEGATIVE_NUMBER = '-'

# SQL to update the LookUp Dimension value
SQL_TEMPLATE = """UPDATE TDATASEG_T SET %s = '%s' WHERE LOADID = ? """
# AND clause for the positive amount.
SQL_AND = """AND AMOUNT > 0"""
# SQL for the accumulated amount - positive.
SQL_TEMPLATE_ACCUM = """UPDATE TDATASEG_T SET %s = '%s'
     WHERE LOADID = ? AND (ACCOUNT IN
     (SELECT ACCOUNT
      FROM TDATASEG_T
      GROUP BY LOADID, ACCOUNT
      HAVING LOADID = ? AND SUM(AMOUNT) > 0))"""

# Put the load id into the parameter list
lstParams = [fdmContext["LOADID"]]

# First the Negative - set everything to negative.
SQL = SQL_TEMPLATE % (LOOKUP_DIM, NEGATIVE_NUMBER)
fdmAPI.executeDML(SQL, lstParams, True)

# Now the positive so add the AND clause.
SQL = SQL_TEMPLATE % (LOOKUP_DIM, POSITIVE_NUMBER) + SQL_AND
fdmAPI.executeDML(SQL, lstParams, True)

# Set all the accumulated values to negative
SQL = SQL_TEMPLATE % (LOOKUP_DIM_ACCUM, NEGATIVE_NUMBER)
fdmAPI.executeDML(SQL, lstParams, True)

# Rebuild the parameter list - the accumulated SQL needs the load id twice
lstParams = [fdmContext["LOADID"], fdmContext["LOADID"]]

# Update the records to positive where the accumulated amount > zero.
SQL = SQL_TEMPLATE_ACCUM % (LOOKUP_DIM_ACCUM, POSITIVE_NUMBER)
fdmAPI.executeDML(SQL, lstParams, True)

Import the same data and include some test data and this is what you get. I am showing it in Excel format so that it’s easy to show the subtotals by Account:

[Screenshot: the imported data in Excel with subtotals by Account]

This works well if you want to accumulate the source data by the same dimensions for all the source rows (e.g. Account and Custom1). If you want to accumulate some data by Account and Custom1 and other data by Account and Custom2, then this probably won’t work for you and it’s better to stick to Logic Accounts.

NOTE: I have tested this on SQL Server and I can’t see anything that would stop it working on Oracle but I might be wrong.

Conclusion

Using Lookup Dimensions to hold the result of a SQL command works really well and for the most part is transparent to the users. They just see the result as added information. The flexibility of being able to map by data amount (and perhaps accumulated data amount as well) with little impact on the system while keeping it all in standard map format is a big win in my opinion. You can use the same principle for other requirements such as wanting to use <BLANK> in a Multi-Dim map – create a Lookup Dimension and update it to BLANK where the relevant field is empty.
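As a sketch of that last idea (assuming UD9 is the Lookup Dimension and ICP is the relevant field):

# Flag records with an empty ICP so a standard map can pick up 'BLANK'
SQL = """UPDATE TDATASEG_T SET UD9 = 'BLANK'
     WHERE LOADID = ? AND (ICP IS NULL OR ICP = '')"""
fdmAPI.executeDML(SQL, [fdmContext["LOADID"]], True)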

Creating an Excel spreadsheet from FDMEE using Jython

A client recently asked me to write a report in FDMEE to show the HFM Data Load error file. They didn’t want to use the Intersection Validation Report because it stopped the process. They wanted to get into HFM whatever data they could and then review the error file to identify problems. I wrote the script and the report but, while it worked perfectly on my environment, it didn’t work on the client’s. My heart sank because trying to work out why a report is not displaying the data can be very time-consuming so I thought I would investigate a different way of doing it.

The most direct way of allowing the user to review the load errors is to show the load error file to the user. The actual load error file format is not great so I thought the best way was to read the HFM data load error file into an Excel spreadsheet and then use the FDMEE function showCustomFile. I wrote a test script for the AftLoad event to show the actual error file using the fdmAPI function showCustomFile. That worked so the only problem now was how to write the contents of the error file into Excel.

There was nothing in the documentation or API on how to write to Excel although FDMEE can clearly do it because of the export to Excel in maps. An internet search for “Jython create excel” pointed me to this page:

https://community.oracle.com/thread/3770307?start=0&tstart=0

And from that page I deduced that the Java module that FDMEE uses to create Excel files is called POI. A second internet search for “python excel poi” pointed me to this page:

https://wiki.python.org/jython/PoiExample

And this gave me the code for creating a spreadsheet using POI. I put the two pieces of code together and with a little bit of tinkering, it works. It creates an Excel spreadsheet which I can then show to the user using showCustomFile. It is then relatively simple to read the error file and put it into table format in Excel. It is a great alternative if the Intersection Validation Report is not appropriate.

I have attached my test script for creating an Excel spreadsheet here: ExportToExcel. You will need to review the import section to make sure that it is pointing to the correct folder in your environment (lines 2 and 3). And you will also need to check the folder for the output (line 12). The script is also below but if you copy it the indentation is lost. If you can’t download the script let me know and I will send it to you:


from sys import path
path.append("C:\\Oracle\\Middleware\\EPMSystem11R1\\products\\FinancialDataQuality\\lib\\poi-3.8-20120326.jar")
path.append("C:\\Oracle\\Middleware\\EPMSystem11R1\\products\\FinancialDataQuality\\lib\\poi-ooxml-3.8-20120326.jar")

from java.io import FileOutputStream
from java.util import Date
from java.lang import System, Math
from org.apache.poi.hssf.usermodel import *
from org.apache.poi.hssf.util import HSSFColor

wb = HSSFWorkbook()
fileOut = FileOutputStream("C:\\TEMP\\POIOut2.xls")

# Create 4 sheets
sheet1 = wb.createSheet("Sheet1")
sheet2 = wb.createSheet("Sheet2")
sheet3 = wb.createSheet("Sheet3")
sheet4 = wb.createSheet("Sheet4")

# Create a header style
styleHeader = wb.createCellStyle()
fontHeader = wb.createFont()
fontHeader.setBoldweight(2)
fontHeader.setFontHeightInPoints(14)
fontHeader.setFontName("Arial")
styleHeader.setFont(fontHeader)

# Create a style used for the first column
style0 = wb.createCellStyle()
font0 = wb.createFont()
font0.setColor(HSSFColor.RED.index)
style0.setFont(font0)

# Create the style used for dates.
styleDates = wb.createCellStyle()
styleDates.setDataFormat(HSSFDataFormat.getBuiltinFormat("m/d/yy h:mm"))

# create the headers
rowHeader = sheet1.createRow(1)
# String value
cell0 = rowHeader.createCell(0)
cell0.setCellStyle(styleHeader)
cell0.setCellValue("Name")

# numbers
for i in range(0, 8, 1):
    cell = rowHeader.createCell((i + 1))
    cell.setCellStyle(styleHeader)
    cell.setCellValue("Data " + str((i + 1)))

# Date
cell10 = rowHeader.createCell(9)
cell10.setCellValue("Date")
cell10.setCellStyle(styleHeader)

for i in range(0, 100, 1):
    # create a new row
    row = sheet1.createRow(i + 2)
    for j in range(0, 10, 1):
        # create each cell
        cell = row.createCell(j)
        # Fill the first column with strings
        if j == 0:
            cell.setCellValue("Product " + str(i))
            cell.setCellStyle(style0)

        # Fill the next 8 columns with numbers.
        elif j < 9:
            cell.setCellValue((Math.random() * 100))

        # Fill the last column with dates.
        else:
            cell.setCellValue(Date())
            cell.setCellStyle(styleDates)

# Summary row
rowSummary = sheet1.createRow(102)
sumStyle = wb.createCellStyle()
sumFont = wb.createFont()
sumFont.setBoldweight(5)
sumFont.setFontHeightInPoints(12)
sumStyle.setFont(sumFont)
sumStyle.setFillPattern(HSSFCellStyle.FINE_DOTS)
sumStyle.setFillForegroundColor(HSSFColor.GREEN.index)

cellSum0 = rowSummary.createCell(0)
cellSum0.setCellValue("TOTALS:")
cellSum0.setCellStyle(sumStyle)

# numbers
# B
cellB = rowSummary.createCell(1)
cellB.setCellStyle(sumStyle)
cellB.setCellFormula("SUM(B3:B102)")

wb.write(fileOut)
fileOut.close()

Mapping based on the data amount

Mapping based on the data amount is a common requirement for FDM. As an example, if the amount is positive it should be mapped to the debtor account, if it’s negative it should be mapped to the creditor account. I have done this in a number of different ways depending on the requirements and this is listing the different methods I have used. I am not suggesting this is an exhaustive list; I am sure there are other ways of doing it.

This is just an overview that is meant to give you some ideas on how to approach mapping by amount. The scripts illustrate the point I am trying to make; you will need to edit and add to them to get them to work for your own application.

I will be covering four different ways of doing this. For each method I give a description, its advantages and disadvantages, and whether it applies to individual or accumulated records:

  1. Conditional mapping – write a script against the map.
     • Advantages: standard method; keeps the audit trail
     • Disadvantages: not easily understood by users; slow; ‘Show conversion rules’ can be misleading; the target map is not validated
     • Applies to: individual records

  2. Logic account – create a logic account based on the record value.
     • Advantages: users do not have to write the VBScript for conditional mapping
     • Disadvantages: slow; audit trail lost; must set source records to ‘Ignore’; can be difficult to maintain
     • Applies to: accumulated records

  3. Import script – edit the account label as the record is imported.
     • Advantages: quick; no need for scripting in maps or logic groups
     • Disadvantages: scripting knowledge needed; audit trail lost; might get complicated if it is not easy to identify the relevant accounts
     • Applies to: individual records

  4. Event script – apply the maps for the negative records after the map process has completed.
     • Advantages: keeps the audit trail; quick
     • Disadvantages: specialist scripting; might need to maintain a separate set of maps; ‘Show conversion rules’ can be misleading
     • Applies to: individual and accumulated records

‘Audit trail lost’

This means that the metadata has been changed in some way and is now not the same as the source records in the ledgers. While this might be acceptable for Management applications, it will probably be flagged as an issue for Statutory. It also means that, if you are using ERPi in conjunction with FDM, you cannot drill back to the source ledger.

Data

In the following examples there are two data records each with two fields. The first field is the Account (1100) which is the same for both records. The second field is the data amount. One record has an amount of 1,000 and the other -5,000.

[Screenshot: the two input records]

Positive values should be mapped to account P1000; negative values to account N1000.

Conditional mapping

Conditional mapping is where you add a script to a mapping record. This means that you can control when a map is or is not applied. FDM processes the data one record at a time and holds the information for each record in an array called varValues. (Search the FDM Admin manual for varValues for more information). The data amount is held in varValues(9) so you can write a script that will map based on the value in this field:

Dim DataAmount
DataAmount = varValues(9)
If CDbl(DataAmount) > CDbl(0) Then
    Result = "P1000"
Else
    Result = "N1000"
End If

Notice both amounts are converted to a double (CDbl). In the later versions of FDM you get a type conversion error if you don’t use the CDbl function.

[Screenshot: the conditional map script entered against the map record]

If you use the ‘Show Conversion Rules’ feature (or Drill Up), FDM will always show in the Target column the first return value in the script regardless of what is actually applied (i.e. it is always P1000 even when you drill into the N1000 record):

[Screenshot: Drill Up always showing P1000 as the target]

You need to go to the Validate step to see the two records mapped to separate targets:

[Screenshot: the Validate step with the records mapped to P1000 and N1000]

  • Simplest method.
  • There is no concept of ‘skipping’ a record, but you do not need to have an Else clause.
  • This method looks at individual data records.
  • Scripts cannot be applied to Explicit maps.

Logic Account

You can create a logic account that will only create a record if certain conditions are met. In this case the condition will be that the data amount is positive or negative. This is an example using a Simple Logic Group but the principle is the same for Complex:
[Screenshot: the Simple Logic Group definition]
The important difference between Logic Accounts and conditional mapping is that conditional mapping is applied to an individual source data record as it is being imported; Logic Accounts are applied after the data has been imported and records with the same metadata attributes are accumulated.

If CurVal > 0 Then
    Result = CurVal
Else
    Result = "Skip"
End If

CurVal is the result of the query. This logic account is only created if the value is greater than zero.

FDM looks at the accumulated values for the specified criteria (CurVal). In this case we have just specified that the source account is 1100. FDM will therefore look at the sum of the two source records (1,000 and -5,000) to get the amount of -4,000 and this is the value that is used for the creation of the logic account.
[Screenshot: the imported data after the logic account has been created]

  • More advanced method but still using FDM’s built-in functionality.
  • If the metadata attributes of the source records are the same and you still want to map to different targets based on the data amount, you cannot use Logic Accounts, because all the records with the same metadata attributes are combined. The values are accumulated and then the logic accounts are applied, so in our example we only have one record with an amount of -4,000.
  • You might need to create ‘Ignore’ mappings for the source records.
  • The record that is being loaded into the target system has been generated from the source records. It will not exist in the source ledger. It is not easy to track which records have been used to create the logic account.

Import script

You can create an import script that is applied to the Account dimension. The script looks at the data amount for the record and then updates the Account label so that records with positive data amounts can be distinguished from those with negative amounts. For example, the script might add ‘P_’ to the beginning of the Account if the data value is positive and ‘N_’ it is negative.

  1. Create an Import script (DataPump). In this example the import script is called CheckAccount (and there are other ways to achieve this):
Function CheckAccount(strField, strRecord)
    Dim dblDataAmount

    'Parse out the data amount from the input record
    dblDataAmount = DW.Utilities.fParseString(strRecord, 2, 2, ",")

    'If the data value is positive add P_ to the beginning of the
    'account label otherwise add N_
    If CDbl(dblDataAmount) > 0 Then
        CheckAccount = "P_" & strField
    Else
        CheckAccount = "N_" & strField
    End If

End Function

The script parses out the data amount from the source record and stores it in the variable dblDataAmount.


The script must set a value to the function CheckAccount. The script is applied to the account field so the parameter strField contains the account label. The script then adds either P_ or N_ to the account label depending on the data value.

  2. Apply the script to the import format:

     [Screenshot: the import format with the CheckAccount script assigned]

  3. The script updates the source account label:

     [Screenshot: the imported records showing the P_ and N_ prefixes]

  4. Create standard maps to convert P_1100 to P1000 and N_1100 to N1000.
  • Simple scripting solution.
  • The script works on individual records.
  • You might want to restrict the script to execute only for certain accounts. You will need to have a method for identifying these accounts.
  • You might need specialist scripting help.
  • The account label has been modified by the import script and will not exist in the source ledger.

Event script

An Event script is a script that is executed in response to an event in FDM. By default Event scripts are empty and when you create an Event script it will be executed automatically by FDM when appropriate.

After the data has been imported into FDM, the logic group is executed and then the maps are applied. The script for updating the maps based on data value will execute after the maps have been applied and, in essence, will apply a second set of maps. This is done by executing a SQL statement directly on the FDM database. Before we look at the script we need to understand how FDM stores its mapping data.

Mapping table

The maps for all locations are held in the table tDataMap. If you use the ‘Export to Excel’ function for the maps this should look familiar. Take note of the DataKey field in the screenshot below. Also notice there is a second map for the source account 1100 which has a suffix of ‘NEG’:
[Screenshot: tDataMap showing the DataKey column and the 1100NEG map record]

Data table

There are 50 data tables in an FDM application by default. When a location is created it is assigned one of these 50 tables to store its data; this is called the ‘segment’. If you have 300 locations in an application, you will have roughly six locations sharing a segment. In order to retrieve the data from the database for a location, you need to know the location’s segment number. This will then give you the name of the table for the data segment, e.g. tDataSeg4 or tDataSeg23.
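If you want to look up a location’s segment directly, the assignment is held in the partition table; a sketch, assuming the standard FDM schema (the location name is hypothetical):

SELECT PartName, PartSegKey
FROM tPOVPartition
WHERE PartName = 'MYLOCATION'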

Here is an example of the data segment table. Two records have been imported. Looking at the Account dimension, the source account label is held in the column ‘Account’ (1), the target account label is held in the column ‘AccountX’ (2) and the DataKey value from the mapping table is held in the column ‘AccountR’ (3):
[Screenshot: the data segment table showing the Account (1), AccountX (2) and AccountR (3) columns]
The Event script will update the data segment table so that the AccountX column holds the updated target account label and the AccountR column will hold the datakey value for this new map record. The information for the AccountX and AccountR columns is held in the 1100NEG map record.

The script – When to execute

The key to Event scripts is understanding when to execute them. Event scripts are documented in the FDM API Object Guide. The script can be executed at a number of different places; the main requirement is that the default mapping process must have finished when the event script is executed. In this example we will use the Event ImportProcess. This event has one parameter: strEventName. You will want to execute the script only for the event name ‘EndProcess’.

The script – What to execute

In order to map the negative data records to a different target, you need to update the AccountX and the AccountR values when the Amount value is less than zero. You only want to do this for the current location, category and period.
[Screenshot: the data segment table records to be updated]

The first step is to write and test the SQL statement that will update the record without worrying about data segments, locations, categories etc.

UPDATE tDataSeg4
SET AccountX = 'N1000', AccountR =
  (SELECT DataKey FROM tDataMap
   WHERE SrcKey = '1100NEG'
   AND DimName = 'Account'
   AND PartitionKey = 751)
WHERE AccountX = 'P1000'
AND CatKey = 13
AND PartitionKey = 751
AND PeriodKey = '1/31/2012'
AND Amount < 0

This will update the record so that the account is mapped to N1000 and the correct DataKey value is stored for records where the Amount is negative.

Once you have tested that the SQL syntax is correct, you need to look at updating the script with correct segment id, category id etc.

Update the script with the correct ids

  • Segment id: RES.PlngSegKey
  • Partition key: RES.PlngLocKey (if the location uses a parent location for its maps, the partition key will need changing)
  • Category key: RES.PlngCatKey
  • Period key: RES.PdtePerKey (the period key might need some manipulation depending on the date format)
  • The example is written for SQL Server. Oracle is case sensitive so DimName = ‘Account’ will need reviewing (LIKE ‘A%’).
  • The example updates individual records. You can use the SUM function on the Amount field to apply the update to the accumulated value; you will also need a GROUP BY clause, as sketched below.
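As a sketch, the accumulated variant of the statement above updates every record whose account sums to a negative amount (same hypothetical keys as before):

UPDATE tDataSeg4
SET AccountX = 'N1000', AccountR =
  (SELECT DataKey FROM tDataMap
   WHERE SrcKey = '1100NEG'
   AND DimName = 'Account'
   AND PartitionKey = 751)
WHERE AccountX = 'P1000'
AND CatKey = 13
AND PartitionKey = 751
AND PeriodKey = '1/31/2012'
AND Account IN
  (SELECT Account FROM tDataSeg4
   WHERE CatKey = 13 AND PartitionKey = 751 AND PeriodKey = '1/31/2012'
   GROUP BY Account
   HAVING SUM(Amount) < 0)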

Completed script

The ImportAction script will look similar to this:

Sub ImportAction(strEventName)
If strEventName = "EndProcess" Then

    'Get the name of the DataSeg table. This holds the data for the location.
    strSegTable = "tDataSeg" & RES.PlngSegKey

    'Set up the Update SQL query. This will update the tDataSeg table with
    'the new target member and add the mapping record number for this new member
    strSQL = "UPDATE " & strSegTable & " SET AccountX = 'N1000', AccountR = "
    strSQL = strSQL & "(SELECT DataKey FROM tDataMap "
    strSQL = strSQL & "WHERE SrcKey = '1100NEG' "
    strSQL = strSQL & "AND DimName = 'Account' "
    strSQL = strSQL & "AND PartitionKey = " & RES.PlngLocKey & ") "

    strWhere = "WHERE AccountX = 'P1000' "
    strWhere = strWhere & "AND CatKey = " & RES.PlngCatKey & " "
    strWhere = strWhere & "AND PartitionKey = " & RES.PlngLocKey & " "
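    'PeriodKey is stored as a date, so the string built on the next line must
    'match the date format the database expects (M/D/YYYY for SQL Server here)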
    strWhere = strWhere & "AND PeriodKey = '" & Month(RES.PdtePerKey) & "/" & Day(RES.PdtePerKey) & "/" & Year(RES.PdtePerKey) & "' "

    'Put the SQL statements together and add the check for the Amount being negative.
    'This will process individual records.
    strUpdate = strSQL & strWhere & "AND Amount < 0 "

    blnReturn = DW.DataManipulation.fExecuteDML(strUpdate, RES.FDMFalse, RES.FDMTrue)

End If

End Sub

  • Complex scripting solution. You might need specialist scripting help.
  • The script works on individual or accumulated records.
  • You will want to restrict the script to execute only for certain accounts, so you will need a method for identifying those accounts (one approach is sketched after this list).
  • The account label has not been modified by the script, so drill-back to the source ledger will still work.
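For the account restriction, one simple approach is to extend the WHERE clause the script builds with a hard-coded list (the account labels here are purely illustrative):

    'Hypothetical example: only re-map records whose current target account is in this list
    strWhere = strWhere & "AND AccountX IN ('P1000', 'P2000') "

In practice you might prefer to hold the list in a lookup table or an FDM custom setting rather than hard-coding it in the script.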

Script to delete old batch files from FDM

This is just a very small script for deleting folders that have been created by the FDM Batch Processor. When the Batch Processor has finished its processing it creates a new folder whose name encodes today’s date and time (the number of days since 1900, an underscore, then a number of seconds). After a while the number of these folders can run into the hundreds, if not thousands. This simple script will automatically delete any folder in the Batch Processor format that is older than a set number of days (the default is 30).

This is a script file with a .KEY extension. Rename it to webDeleteOldBatchFolders.uss and copy it to the Scripts\Custom folder.

Sub webDeleteOldBatchFolders()
'------------------------------------------------------------------
'Oracle Hyperion FDM CUSTOM Script:
'Created By:      Marcus Hemy.
'Date Created:    2014-01-07 18:14:58
'Purpose:         Deletes the folders created by the Batch process.
'                 Do not delete folders that were created less than
'                 nDeleteFoldersOlderThan days ago.
'------------------------------------------------------------------
Dim TodayAsNumberDays
Dim oFSO, oFolder, oAllSubFolders, oSubFolder
Dim sFolderName
Dim sStartFolder
Dim arrSplit
Dim nDeleteFoldersOlderThan
Dim sMsg
Dim nDeletedCounter
'=========================================
'     ADMIN SECTION - START
'=========================================
    'Number of days.
    'Any folder created more than this number of days ago will be deleted.
    nDeleteFoldersOlderThan = 30
'=========================================
'     ADMIN SECTION - END
'=========================================
    'Get today in the form of the number of days since 1900.
    'Could use DateDiff but you need to add 2 to get the correct number of days
    'TodayAsNumberDays = DateDiff("d", "1/1/1900", Now) + 2
    'Alternatively get today's date as a number.
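    '(A VBScript date is stored internally as a day-number serial counted from
    '30-Dec-1899, so CDbl(Date) returns the day number directly. The serial for
    '1/1/1900 is 2, which is why the DateDiff version above needs the + 2.)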
    TodayAsNumberDays = CDbl(Date())
    'Store the name of the Batches folder
    sStartFolder = API.DataWindow.Connection.PstrDirInbox & "\Batches\"
    'Use the FileSystemObject for handling the folders
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    'Create an object with all the information about the folder
    Set oFolder = oFSO.GetFolder(sStartFolder)
    'Get a list of all the sub folders in Batches
    Set oAllSubFolders = oFolder.SubFolders
    'Go through each folder and examine its name
    For Each oSubFolder In oAllSubFolders
        'Store the folder name
        sFolderName = oSubFolder.Name
        'Format of the folder name created by the Batch Processor is NumberOfDays_NumberOfSeconds
        'If there is an underscore we might be interested
        If InStr(sFolderName, "_") Then
            'Put the folder name into an array by splitting it at the underscore.
            arrSplit = Split(sFolderName, "_")
            'arrSplit(0) will contain the number of days if it's in the correct format
            'Check if this is a number
            If IsNumeric(arrSplit(0)) Then
                'Compare the folder name number of days against today's date less the number of days parameter
                If CDbl(arrSplit(0)) < CDbl(TodayAsNumberDays - nDeleteFoldersOlderThan) Then
                    'We want to delete this one.
                    'Update the counter
                    nDeletedCounter = nDeletedCounter + 1
                    'Delete the folder
                    oFSO.DeleteFolder(sStartFolder & sFolderName)
                End If
            End If
        End If
    Next
    'Store the output message in a variable and display appropriately.
    sMsg = "Deleted [" & nDeletedCounter & "] folders."
    If LCase(API.DataWindow.Connection.PstrClientType) = "workbench" Then
        MsgBox sMsg
    Else
        'Let the user know we are done
        RES.PlngActionType = 2
        RES.PstrActionValue = sMsg
    End If
End Sub

Creating an Excel spreadsheet from FDM without loading Excel

I have written many scripts for FDM that generate files which are ultimately destined for Excel. I know how to create Excel spreadsheets through a script, but you don’t want to run scripts that call Excel on a server: for one thing Excel would need to be installed there, and for another you run the risk of slowing everything down because of the number of Excel instances running. So I looked at different ways of creating an Excel spreadsheet and decided that ADO (ActiveX Data Objects) was probably the simplest method.

The idea behind this is to allow you to create a backup of your maps for all dimensions in a single file. You will have a separate file per location and period. Each file can then be loaded back into FDM.

There are two parts to this: the script and the template spreadsheet. The script takes a copy of the template spreadsheet and uses it as the source for an ADO recordset. The template spreadsheet has the header information for the maps, which the script uses to populate the correct columns. The script then queries FDM to get the maps for the current location and adds them to the spreadsheet. ADO does not give you any control over the spreadsheet itself, and I wanted the named range to be updated automatically. So there is a macro in the spreadsheet that runs whenever you open it; the macro reads through all the sheets and creates a named range for any maps it finds.

The script follows. Copy this to a file called webExportMapsToExcel.uss and save it in the folder Data\Scripts\Custom.

Option Explicit

Sub webExportMapsToExcel()
'------------------------------------------------------------------
'Oracle Hyperion FDM CUSTOM Script:
'
'Created By: Marcus Hemy. Marcus@HandL.me.uk
'Date Created: November 2013
'
'Purpose: Create an Excel spreadsheet with the maps for the current location without loading Excel.
'
'Dependencies: The script needs a starting Excel spreadsheet.
'
'------------------------------------------------------------------
Dim objFSO
Dim strFileTemplate
Dim strFileExtension
Dim strFileCreated
Dim strConnection, conn, strSQL
Dim field
Dim strDoubleQuote
Dim rs, rsMap
Dim sDimensionList, sDimension
Dim arrDimensions
Dim strLoc
Dim strPer
Dim strMessage
Dim strScriptName
Dim strSourceFolder
Dim strTargetFolder

strScriptName = "webExportMapsToExcel"

 'Check Excel Template file exists in the Templates folder.
 'Get the Outbox folder
 strSourceFolder = DW.Connection.PstrDirOutbox
 'Add a backslash if it needs one
 If Right(strSourceFolder, 1) <> "\" Then strSourceFolder = strSourceFolder & "\"
 'Add the Templates folder
 strSourceFolder = strSourceFolder & "Templates\"

 'Set the folder where the file will be created to the Inbox.
 strTargetFolder = DW.Connection.PstrDirInbox
 If Right(strTargetFolder, 1) <> "\" Then strTargetFolder = strTargetFolder & "\"
 'Name of the template file.
 strFileTemplate = "MapTemplate"
 'File extension. XLS works but XLSM does not.
 strFileExtension = ".XLS"

'Dimension list. Just add another dimension label here.
 sDimensionList = "Account;ICP;Entity;UD1;UD2;UD3;UD4"
 arrDimensions = Split(sDimensionList, ";")

'The output file will have the current location and period added to the file name.
 strLoc = RES.PstrLoc
 'Remove any spaces from the period
 strPer = Replace(RES.PstrPer, " ", "")

 'Create the file system object
 Set objFSO = CreateObject("Scripting.FileSystemObject")

 'Make sure the template file exists
 If objFSO.FileExists(strSourceFolder & strFileTemplate & strFileExtension) Then

'Make a copy of the Template
     strFileCreated = strTargetFolder & strFileTemplate & "-" & strLoc & "-" & strPer & strFileExtension
     objFSO.CopyFile strSourceFolder & strFileTemplate & strFileExtension, strFileCreated, True

 Else
     strMessage = "Cannot find the source template file: " & strSourceFolder & strFileTemplate & strFileExtension

     If LCase(API.DataWindow.Connection.PstrClientType) = "workbench" Then
         'Win32 Client (Use Msgbox)
         MsgBox strMessage, vbOKOnly, strScriptName
     Else
         'Web Client (Set API action properties)
         RES.PlngActionType = 2 '(See Enumeration: ScriptActionTypes)
         RES.PstrActionValue = strScriptName & vbCrLf & strMessage
     End If
     Exit Sub
 End If

 'Destroy the file system object
 Set objFSO = Nothing

'Need to pass a double quote to a string so save it to a variable
 strDoubleQuote = Chr(34)

 'Set up the connection string based on the file extension
 Select Case UCase(strFileExtension)
 Case ".XLS"
     strConnection = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & strDoubleQuote & strFileCreated & strDoubleQuote & ";Extended Properties=" & strDoubleQuote & "Excel 8.0;HDR=Yes;" & strDoubleQuote & ";"
 Case ".XLSX", ".XLSM"
     strConnection = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & strDoubleQuote & strFileCreated & strDoubleQuote & ";Extended Properties=" & strDoubleQuote & "Excel 12.0 Xml;HDR=Yes;" & strDoubleQuote & ";"
 End Select

 'Create an ADO connection
 Set conn = CreateObject("ADODB.Connection")

 'Connect ie open the file
 conn.Open strConnection

 'Create a recordset
 Set rs = CreateObject("ADODB.recordset")

 'Specify the cell range where the current header info is.
 strSQL = "SELECT * FROM A1:J2"
 'Now open those cells as a recordset
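 '3, 3 = adOpenStatic, adLockOptimistic; the numeric values are needed because
 'the ADO constants are not defined in an FDM script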
 rs.Open strSQL, conn, 3, 3

 'Read the maps for each dimension
 For Each sDimension In arrDimensions

     'Get the map records for the Location and Dimension
     Set rsMap = API.SQLMgr.fMapItems(CStr(strLoc), sDimension)

     'Make sure the function returned records
     If Not rsMap.BOF And Not rsMap.EOF Then
         'Now loop through the map records
         Do While Not rsMap.EOF
             'Write a new row to Excel
             rs.AddNew
             'Go through each column. OriginalValue holds the column name.
             For Each field In rs.Fields
                 field.Value = rsMap.Fields(field.OriginalValue).Value
             Next
             rs.Update

             'Get the next map record
             rsMap.MoveNext
         Loop
     End If

     Set rsMap = Nothing
 Next

'Tidy up
 rs.Close
 Set rs = Nothing

conn.Close
 Set conn = Nothing

strMessage = "Maps exported to: " & strFileCreated

If LCase(API.DataWindow.Connection.PstrClientType) = "workbench" Then
     'Win32 Client (Use Msgbox)
     MsgBox strMessage, vbOKOnly, strScriptName
 Else
     'Web Client (Set API action properties)
     RES.PlngActionType = 2 '(See Enumeration: ScriptActionTypes)
     RES.PstrActionValue = strScriptName & vbCrLf & strMessage
 End If

End Sub

For the template, open Excel and in cell A1 add tDataMap. In cells A2 to J2 add: PartitionKey, DimName, SrcKey, SrcDesc, TargKey, ChangeSign, WhereClauseType, WhereClauseValue, Sequence, VBScript.
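If you would rather not type the headers by hand, a one-off helper macro (hypothetical; run it once in the template workbook) can write them for you:

Sub BuildMapTemplateHeader()
    'One-off helper: writes the table name and the map column headers
    Range("A1").Value = "tDataMap"
    Range("A2:J2").Value = Array("PartitionKey", "DimName", "SrcKey", "SrcDesc", _
        "TargKey", "ChangeSign", "WhereClauseType", "WhereClauseValue", "Sequence", "VBScript")
End Sub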

Go to the VBE in Excel (Alt+F11) and add the following code to ThisWorkbook:


Option Explicit

Private Sub Workbook_Open()
Dim sSheet As String
Dim nCounter As Integer
Dim sCurrentLocation As String
Dim sStartCellAddress As String
Dim sEndCellAddress As String
Dim MaxRows As Long
Dim sNamedRange As String
Dim sSourceFileName As String

'Usage reminder for the offset constants below, eg ActiveCell.Offset(cMOVEDOWN, cSAMECOLUMN).Select
Const cMOVERIGHT As Integer = 1
Const cMOVELEFT As Integer = -1
Const cMOVEUP As Integer = -1
Const cMOVEDOWN As Integer = 1
Const cSAMEROW As Integer = 0
Const cSAMECOLUMN As Integer = 0

On Error GoTo PROC_ERR

 sSourceFileName = "MAPTEMPLATE.XLS"
 If Right$(UCase$(ActiveWorkbook.Name), Len(sSourceFileName)) = sSourceFileName Then
     'Don't do this for the source file
 Else
     'Go through each sheet
     SaveLocation False
     With ActiveWorkbook.Sheets
         For nCounter = 1 To .Count
             If .Item(nCounter).Type = xlWorksheet Then
                 sSheet = ActiveWorkbook.Sheets(nCounter).Name
                 Application.Sheets(sSheet).Select
                 'Search for tDataMap - produces an error if it doesn't find it
                 On Error Resume Next
                 Cells.Find(What:="tdatamap", After:=ActiveCell, LookIn:=xlFormulas, _
                                 LookAt:=xlPart, SearchOrder:=xlByRows, SearchDirection:=xlNext, _
                                 MatchCase:=False, SearchFormat:=False).Activate
                 On Error GoTo PROC_ERR
                 If UCase$(ActiveCell.Text) = "TDATAMAP" Then
                     'Create the name range
                     sNamedRange = "ups" & sSheet
                     sStartCellAddress = ActiveCell.Address
                     sEndCellAddress = ColLetter(ActiveCell.Offset(cSAMEROW, cMOVERIGHT * 9).Column)
                     MaxRows = Range(sStartCellAddress).CurrentRegion.Rows.Count
                     sEndCellAddress = "$" & sEndCellAddress & "$" & MaxRows
                     ActiveWorkbook.Names.Add Name:=sNamedRange, RefersTo:="='" & sSheet & "'!" & sStartCellAddress & ":" & sEndCellAddress, Visible:=True
                 End If
             End If
         Next
     End With
     SaveLocation True
 End If
PROC_EXIT:
     Exit Sub

PROC_ERR:
     MsgBox "Error: " & Err.Number & " " & Err.Description & vbCrLf & "In procedure Workbook.Open", vbOKOnly + vbCritical, "CreateNameRange"
     Resume PROC_EXIT

End Sub
Private Function ColLetter(ByVal ColNumber As Integer) As String
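     '(ColNumber > 26) evaluates to -1 when True, so the Left$ length argument
     'becomes 2 for double-letter columns and stays 1 otherwise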
     ColLetter = Left$(Cells(1, ColNumber).Address(False, False), 1 - (ColNumber > 26))
End Function
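'SaveLocation remembers the active workbook, sheet and selection when called with
'False, and returns to them when called with True, so the Find in Workbook_Open
'does not leave the user on a different sheet.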
Public Sub SaveLocation(ReturnToLoc As Boolean)
    Static Wb As Workbook
    Static WS As Worksheet
    Static R As Range

    On Error Resume Next
    If ReturnToLoc = False Then
        Set Wb = ActiveWorkbook
        Set WS = ActiveSheet
        Set R = Selection
    Else
        Wb.Activate
        WS.Activate
        R.Select
    End If

End Sub

Call this file MapTemplate and save it as an XLS. Put it in the Templates folder under the FDM application Outbox.
