Generating Identity Manager CSV Reports With a Null Driver Rather Than Using Reporting

This post shows how to generate a CSV report from a Micro Focus/NetIQ Identity Manager system that is tailored to a specific business practice. A pretty PDF from Identity Reporting is great for management, but sometimes you need to generate quick reports for your security specialists and other technical staff.

In this example, we will identify which accounts don’t have a manager. A missing manager may cause serious issues for access governance, or may be an indicator of rogue accounts, depending on an organization’s business policies and procedures. This isn’t a comprehensive example, as the possibilities are endless, but it illustrates how to design a report without Identity Reporting.

The Null driver is a subscriber-only driver that detects events in the Identity Vault and performs an action. Writing to a file from it isn’t a common practice, but it is one worth considering for your toolbelt.

What Is Needed

An understanding of how to manipulate driver policies, including creating new rules, jobs, ECMA scripts, and dynamic groups.

ECMA Script

The policy will use an ECMA Script wrapper to write out to a file. The policy supplies the file path to the ECMA Script, and the script passes it to the underlying Java class, which handles the actual file I/O.


function appendToFile(filepath, data) {
    // Open the file in append mode (the "true" flag) so each row
    // is added to the end of the existing file
    var fileWriter = new java.io.FileWriter(filepath, true);
    try {
        fileWriter.write(data);
        fileWriter.write("\n");
    } finally {
        // Always release the file handle, even if the write fails
        fileWriter.close();
    }
}

function renameFile(filepath, newfilepath) {
    // Used to move a finished report out of the way, e.g. to the archive path
    var rename = new java.io.File(filepath);
    var renameResult = rename.renameTo(new java.io.File(newfilepath));
    return renameResult;
}

Example Policy on the Subscriber Channel


This policy is run by a job that fires a trigger event. It gathers the Distinguished Names of all users associated with the dynamic group, which is referenced in the action that uses “[pseudo].Member”. The specific dynamic group comes from a global config value on the driver. The dynamic group itself has an LDAP filter that matches any account without the manager attribute present. For each member, the policy gathers specific data about the user, stores it in local variables, composes a comma-separated local variable with the columns in the configured order, and writes that data out as one row of the CSV file. The for-each statement then continues on to the next user and writes them out on the next row, and so on.
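As a rough illustration of the shape such a rule could take, here is a hedged DirXML Script sketch. The GCV names match the example global config values later in this post, but the ECMA library namespace, the `es:` prefix, and the attribute names are assumptions and would need to match your own environment:

```xml
<!-- Hypothetical sketch only: the library URL, prefix, and attribute
     names are assumptions, not taken from a working driver. -->
<policy xmlns:es="http://www.novell.com/nxsl/ecmascript/ReportLibrary">
  <rule>
    <description>Write no-manager CSV rows when the scheduled job fires</description>
    <conditions>
      <and>
        <!-- Only react to the job's trigger document, never live events -->
        <if-operation op="equal">trigger</if-operation>
      </and>
    </conditions>
    <actions>
      <!-- Resolve the output path GCV into a local variable usable in XPath -->
      <do-set-local-variable name="filepath" scope="policy">
        <arg-string>
          <token-global-variable name="OutputPath"/>
        </arg-string>
      </do-set-local-variable>
      <!-- Loop over the dynamic group's members; the group DN comes from a GCV -->
      <do-for-each>
        <arg-node-set>
          <token-src-attr class-name="Group" name="Member">
            <arg-dn>
              <token-global-variable name="DynamicGroup"/>
            </arg-dn>
          </token-src-attr>
        </arg-node-set>
        <arg-actions>
          <!-- Compose one delimited row for the current member -->
          <do-set-local-variable name="row" scope="policy">
            <arg-string>
              <token-src-attr class-name="User" name="CN">
                <arg-dn><token-local-variable name="current-node"/></arg-dn>
              </token-src-attr>
              <token-global-variable name="CSVDelimiter"/>
              <token-src-attr class-name="User" name="Given Name">
                <arg-dn><token-local-variable name="current-node"/></arg-dn>
              </token-src-attr>
              <!-- ...remaining columns (surname, mail, jobcode) follow the same pattern... -->
            </arg-string>
          </do-set-local-variable>
          <!-- Hand the row to the ECMA wrapper, which appends it to the file -->
          <do-set-local-variable name="ignore" scope="policy">
            <arg-string>
              <token-xpath expression="es:appendToFile($filepath, $row)"/>
            </arg-string>
          </do-set-local-variable>
        </arg-actions>
      </do-for-each>
      <!-- The trigger document has nothing to synchronize, so discard it -->
      <do-veto/>
    </actions>
  </rule>
</policy>
```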


Example of Global Config Values

OutputPath: /opt/iam/fileOutput/noManager.csv

CSVDelimiter: ,

ArchivePath: /opt/iam/fileOutput/Archive  (used for previously run reports)

DelimitedHeader: uid,givenname,sn,mail,jobcode (CSV header written at the start of a new file). This can be tricky depending on your policy: if you want the header, you only want to write it once, so you may have to archive the existing file first. Otherwise, you may want to use a bash script for cleanup of the files.

DynamicGroup: CN=NoManagerDynamicGroup,ou=sa,o=data


Why use this instead of the open-source Generic File driver or the Delimited Text driver?

The other drivers are great to use and very efficient. This approach, however, gives you the flexibility of producing multiple files with different data formats from many similar rules. Also, the Delimited Text driver isn’t very friendly with DirXML Script on the Subscriber channel.

This is not an auditing solution or a replacement for Reporting. However, if you choose for whatever reason to implement this type of policy in place of auditing or reporting, use caution about who has rights to the data and who can manipulate it, and consider how easy it will be to maintain, upgrade, and support.

Scheduled jobs may take a little while if you have a lot of users, since the policy processes one user and writes one row at a time. For generating reports this isn’t bad: it doesn’t grab a bunch of memory or hose your server during the process. If you need it super speedy, you may want to use LDAP searches through Apache Directory Studio to help identify gaps or anomalies in your data.

I have not tried this for live events. The gist would be to veto at the end of the rules for scheduled jobs, since they are based on a trigger; this way they wouldn’t interfere with live-event rules. Be careful what you send through your filter: scope it down as much as you can so that you are not constantly analyzing every modification.

Remember not to write out too much data. It may be better to write out a system-generated unique ID that correlates with the user, rather than their actual data. For example, state that the account with UID 000010 doesn’t have a manager, is active, and last logged in yesterday. That way, if the report gets into the wrong hands, it doesn’t expose extra personal data.
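That pseudonymized-row idea can be made concrete with a tiny helper. This is an illustration only; the function name and field choices are assumptions, not part of the original policy:

```javascript
// Hypothetical helper: build a report row from a system-generated ID
// and non-identifying status fields instead of personal attributes.
function buildMinimalRow(workforceId, status, lastLogin, delimiter) {
  return [workforceId, status, lastLogin].join(delimiter);
}

// e.g. buildMinimalRow("000010", "active", "2024-05-01", ",")
```

A security specialist can still correlate the workforce ID back to the real account inside the Identity Vault, where access is controlled.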