Entitlement Reports

Every security incident analysis must answer the question of who is responsible for it. The question seems simple, but the answer is not.
Whose credentials were used in the attack, who granted them, and most importantly, was the account really used by its owner?

In the case of databases, the problem becomes even more complex due to the multi-dimensional matrix of privileges.
This problem is addressed by a separate kind of security solution – Privileged Identity Management (PIM) – which provides access accountability and session recording, but even with it in place we are still exposed to account takeover (ATO) and service exploitation.
In these cases we should be able to answer a few important questions:
  1. What privileges did the user have at the particular (incident) point in time?
  2. Were the authorizations consistent with change management, or did they bypass it?
  3. Were they sufficient for the attack?
  4. Was the use of the account related to the activity of the account owner?

Answering the first question requires implementation of a full identity management process, which is not simple at all and usually covers database access management at the role level only.

The Guardium Entitlement Reports (ER) functionality is a simple but very useful feature for quickly determining account authorizations at a defined point in time.

New: Guardium 10 ER contains a new set of reports for DB2 on iSeries.

ER Prerequisites

ER works outside the standard activity monitoring and is based on scheduled data uploads to customized audit data domains. Similar to Data Classification and Vulnerability Assessment, it uses a direct connection to the monitored database to collect the required information.

We need to create an appropriate technical account for each database where ER data will be gathered. On each Guardium appliance there are SQL scripts with role definitions containing all the privileges required to get the ER content.

You can download them from the appliance fileserver; they are located in /log/debug-logs/entitlemnts_monitor_role/

Entitlement scripts
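The shipped scripts are database-specific and authoritative; purely as an illustration of the pattern they follow (all object, role and account names below are hypothetical – use the script shipped for your database version), an Oracle-style role definition looks roughly like this:

```sql
-- Illustrative sketch only: a read-only role for ER collection (Oracle style).
-- The real script on the appliance defines the exact list of grants.
CREATE ROLE er_monitor_role;
GRANT SELECT ON dba_users      TO er_monitor_role;
GRANT SELECT ON dba_role_privs TO er_monitor_role;
GRANT SELECT ON dba_sys_privs  TO er_monitor_role;
GRANT SELECT ON dba_tab_privs  TO er_monitor_role;

-- Dedicated technical account used only by Guardium for ER uploads
CREATE USER er_monitor IDENTIFIED BY "ChangeMe#2016";
GRANT CREATE SESSION TO er_monitor;
GRANT er_monitor_role TO er_monitor;
```

The point of the role is least privilege: the technical account can read the dictionary views needed for entitlement snapshots and nothing else.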

When the role has been created and attached to a technical account, we can create a data source (Setup->Tools and Views->Datasource Definitions) for “Custom Domain”.

Data source definition

Use the plus icon to add a new data source; the example below defines MSSQL access using SSL without authentication.

MSSQL data source

The Test Connection button becomes active after the data source configuration has been saved (Apply).

Tip: The data source creation process can be invoked directly from the ER process, but for clarity it was presented here as a separate task.

Data Upload

Now we can define the data upload process. For each database type there is a set of ER reports, all backed by custom tables. For example, for Oracle we can find 14 prepared tables (all names start with ORA) – Reports->Report Configuration Tools->Custom Table Builder.

Custom table builder

We need to configure the data upload for each report we are interested in.
Select a report and push the Upload Data button.

Data upload

The Add Datasource button allows adding the data source for which we will create entitlement snapshots. We can point to multiple previously defined data sources or create a new one.

The Overwrite flags (per upload, per datasource) define how the data will be stored:

  • if both flags are unselected, old data will not be removed when a new snapshot arrives (each ER data record contains a timestamp, so we are able to identify them in time)
  • per upload means that old data will be erased every time an upload is executed – it makes sense only when the particular report contains a single data source or when we want to remove old data intentionally
  • the per datasource flag ensures that only the old data of the currently updated data source will be erased – it protects the old data of data sources which are not available during the current upload

The default purge for custom domains is executed every day and removes data older than 60 days. This behavior can be changed (described later).

Now we can upload data manually (Run Once Now) and/or define how often the snapshot of authorizations will be created (Modify Schedule).

Configured data upload

It is the user's decision how often snapshots will be created. However, some recommendations:

  • if you overwrite data, you need to archive it beforehand (using an audit process)
  • the data upload gets data directly from the database; it is not a heavy task, but for large databases with thousands of roles and tables the quantity of data can be huge
  • snapshots provide the authorization state at a particular time; to cover forensics requirements we also need to audit the DCL (grant, revoke) transactions
  • a 6-24 hour snapshot schedule is usually sufficient

Data upload scheduler
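The DCL point above matters because a privilege granted and revoked between two uploads leaves no trace in the snapshots at all; for example (object and account names are illustrative):

```sql
-- Both statements fall between two snapshots, so the two snapshots
-- look identical even though the privilege existed in the meantime
GRANT SELECT ON payroll.salaries TO badguy;
-- ... data access happens here, invisible to ER snapshots ...
REVOKE SELECT ON payroll.salaries FROM badguy;
```

Only DCL auditing in the standard monitoring policy closes this gap.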

The data upload configuration steps described here should be repeated for all the ER custom tables you are interested in.
Now we can review the uploaded data (add the ER reports to your dashboard).

Predefined ER list for Informix

ER report example – MSSQL objects visible for everyone

Predefined ER reports have a raw format and cannot be modified, so I suggest redefining them to achieve the expected appearance.

ER report customization

This standard report presents all privileges and roles assigned to users on the MSSQL server. You can notice that 2 snapshots have been created in the last 3 hours, and we cannot filter on them the way we can on the other parameters.

2 snapshots in standard report

Below are some report variations:

#1 – Last snapshot with quick data filtering

Query

Report

We see the last snapshot from the defined time frame, and we can filter data by user, authorization, authorization type and database.
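Conceptually, the “last snapshot within a time frame” condition built in the query builder corresponds to SQL like this (the table and column names here are illustrative, not the real internal schema):

```sql
-- Keep only the rows belonging to the newest snapshot inside the period
SELECT grantee, granted_role, db_name
FROM er_user_role_privs p
WHERE p.sqlguard_timestamp = (
    SELECT MAX(sqlguard_timestamp)
    FROM er_user_role_privs
    WHERE sqlguard_timestamp BETWEEN :period_start AND :period_end);
```

The subquery pins the report to a single snapshot, so the remaining runtime parameters filter rows instead of mixing records from several uploads.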

New: Guardium 10 allows hiding particular columns from a query. No more query reconstruction for this purpose 🙂

Column configuration

#2 – List of snapshots

Query and Report

New: The “Runtime Parameter Configuration” window separates the user-defined parameters from the others. No more searching the parameter list for our own 🙂

Report runtime parameters

#3 – Number of authorizations per user

Query

Graphical report
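The aggregation behind this graphical report is a simple count grouped by user; in plain SQL terms (illustrative names again):

```sql
-- Number of authorizations held by each user in the selected data
SELECT grantee, COUNT(*) AS authorizations
FROM er_user_role_privs
GROUP BY grantee
ORDER BY authorizations DESC;
```

Sorting by the count makes over-privileged accounts stand out at the top of the chart.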

#4 – Authorizations from particular snapshot

Unfortunately, a report parameter based on a timestamp can be defined with one-day granularity only. It does not allow us to point to a specific snapshot. Really?

We can use a computed attribute to create a snapshot id based on the snapshot timestamp:

grdapi create_computed_attribute attributeLabel="Snapshot ID" entityLabel="MSSQL2005/2008 Role/Sys Privs Granted To User" expression="MD5(SQLGUARD_TIMESTAMP)"

This command creates a new dynamically computed attribute: an MD5 hash string derived from the timestamp value.
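For illustration, MD5() (the Guardium internal repository is MySQL-based, where this function is available) maps each distinct timestamp string to a distinct, stable 32-character hex string, which is what makes the hash usable as a snapshot selector:

```sql
-- Two snapshots taken four minutes apart get two different, stable ids;
-- re-running the query always returns the same hash for the same timestamp
SELECT MD5('2016-01-01 00:45:00');
SELECT MD5('2016-01-01 00:49:00');
```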

Now I can modify the snapshot list report to show this unique id

Query and Report

and add the snapshot id to the parameter list of any report to filter data by timestamp. Easy!

Report with computed attribute

Below is an example of a dashboard for incident analysis based on ER reports.

Forensics

We can notice in this example that the badguy user's authorizations changed between 00:45 and 00:49. Using the snapshot id parameter we can display these two snapshots side by side and identify the change quickly.
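The same comparison can be expressed as a set difference between the two snapshots; a conceptual sketch (table and column names are illustrative, with the two computed snapshot ids passed as parameters):

```sql
-- Privileges present in snapshot B (00:49) but absent from snapshot A (00:45)
SELECT b.grantee, b.granted_role
FROM er_user_role_privs b
WHERE b.snapshot_id = :snapshot_b
  AND NOT EXISTS (
      SELECT 1
      FROM er_user_role_privs a
      WHERE a.snapshot_id = :snapshot_a
        AND a.grantee = b.grantee
        AND a.granted_role = b.granted_role);
```

Running it in both directions (B minus A, then A minus B) yields the grants and the revocations that happened between the two uploads.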

How to create your own ER report?

Guardium delivers many different ER reports for DB2, Informix, MSSQL, MySQL, Netezza, Oracle, PostgreSQL, SAP ASE, SAP IQ and Teradata. The custom domain mechanism allows creating your own reports for other databases, or adding reports to cover information unavailable in the predefined ones.

A good example is MSSQL, where the user login status is not visible in the predefined tables. From an incident management perspective this information is crucial and should be gathered.

I have prepared the SQL to get this information:

select loginname AS 'user',
       CASE denylogin WHEN 0 THEN 'ACTIVE' WHEN 1 THEN 'INACTIVE' END AS status,
       CASE isntuser WHEN 0 THEN 'LOCAL' WHEN 1 THEN 'ACTIVE DIRECTORY' END AS 'user type',
       dbname AS 'default database'
from master..syslogins
order by loginname

Next, we need to create a new custom table (Reports->Report Configuration Tools->Custom Table Builder). We have two possibilities: define the table from scratch or import the structure from SQL. I prefer the second method:

Custom table creation

In the “SQL Statement” field we need to insert SQL which returns a sample of reference data. Add Datasource lets us specify the database where the sample exists. Finally we are ready to Retrieve the table definition.

Table structure import

If the import was successful, we return to “Custom Tables”. To review the structure, push the Modify button.

Custom table selection

We can modify fields, define keys and set up references to Guardium groups.

Custom table modification

Now we can Apply the changes and set the Upload data configuration.

Data upload

Note: A custom table definition can be modified only as long as it does not contain data.

We have data, but it is not available for reporting until we create a new report domain (Report->Report Configuration Tool->Custom Domain Builder). The plus (+) icon allows creating a new domain. Insert the “Domain name” and find the custom table created earlier. Move it from “Available entities” to “Domain entities“. Then select the default timestamp from the “Timestamp Attribute” list and Apply.

Custom domain creation

Our new domain is now visible in the custom query builder (Report->Report Configuration Tools->Custom Query Builder). Select the domain and create all the required queries and reports. Below is a report with all MSSQL logins and their status.

MSSQL logins

ER data management

If we want to be able to use the collected data in forensic analysis, we need to set its proper retention (archive and restore). These settings are available in the “Custom Table Builder” via the Purge/Archive button. The Archive check box ensures that data from a custom table is attached to the data archived in the standard archive process. We can define how long data will be available locally (Purge data older than) and schedule the purge process (60 days is the default value).

Custom table archive

Tip: Do not forget to archive the data stored in custom tables.

Summary: ER is a useful tool in forensic analysis and significantly shortens the time needed to identify the permissions held by the subject of an incident. The ability to customize the data presentation, schedule data loads and expand the scope of collected information makes this tool an indispensable element of the security officer's duty. This data can also be used to identify privileged accounts for the proper definition of the audit policy.
