How to create a PowerShell connector for synchronization

Introduction

If an application is not listed among the standard Netwrix Identity Manager connectors, this is not a barrier to integration. As long as the application exposes an interface such as LDAP, SQL, or an API, it can be seamlessly integrated into Netwrix Identity Manager through its generic connectors.

Among these, the PowerShell connector stands out as a particularly powerful and flexible option. It enables direct interaction with virtually any system that can be accessed or automated via scripts, APIs, or command-line tools. This makes it possible to manage identities, accounts, entitlements, and lifecycle events even for highly customized, legacy, or cloud-native applications.

By leveraging PowerShell, organizations can reuse existing automation scripts, adapt quickly to application-specific logic, and implement advanced provisioning, reconciliation, and governance scenarios without waiting for a dedicated connector. This approach significantly reduces integration time, increases agility, and ensures that all critical applications remain within the scope of identity governance and compliance.

The goal of this show & tell is to help partners and customers understand and configure a PowerShell connector.

How NIM connectors work in general

Architecture

Data Extraction

  1. Extract the data from the application using the appropriate connection. For standard and generic connectors, this action is already included in the connection. For the PowerShell connector, this action needs to be written based on the method provided by the application (API, CLI, etc.).
  2. The result of the extract is a set of CSV files. The output files are stored in a dedicated folder called "ExportOutput", located in the "Temp" repository of NIM. For the PowerShell connector, this action needs to be written after the data export.
  3. When encryption is enabled, all CSV files are/need to be encrypted using the same encryption certificate configured in the appsettings. For the PowerShell connector, this action needs to be written before saving the CSV files to the "ExportOutput" folder.

Data Processing

This step is common to all connectors, whether standard, generic, or custom. During this phase, Netwrix Identity Manager processes the collected data to ensure consistency and reliability, including the detection of empty lines, duplicate keys, and missing or invalid values.

Additional data normalization and transformation operations are also performed to prepare the information for secure and accurate loading into the Identity Manager database, ensuring data quality and integrity across the identity lifecycle.

Data Synchronization

This step is common to all connectors, whether standard, generic, or custom. During this phase, Netwrix Identity Manager persists the processed data in its database and applies the required changes by creating new records, updating existing entries, and deleting obsolete entries, ensuring data alignment with the connected systems.

Step by step configuration

0 - Model Your Data

Before configuring the connector, it is essential to model the data that will be synchronized from the target system. This includes clearly identifying the account objects and the attributes that need to be collected and governed within Netwrix Identity Manager.

In addition, the permissions catalog must be defined, along with the relationships between accounts and the permissions, roles, or entitlements they hold. This modeling phase ensures that access rights are represented accurately and consistently within the identity governance framework.

This step is typically performed outside of Identity Manager and often requires close collaboration with application owners or functional teams who have direct knowledge of the application data model and access to the underlying system. Their involvement is critical to ensure data accuracy, completeness, and alignment with business reality.

For an HR connector, no permissions are synchronized, but other data can be extracted. You need to identify the entities (e.g., Person, Department, Site, UserType, etc.) and their relationships (e.g., Person-Department, Person-Manager) that will be represented in Identity Manager.

1 - Create the Script

1.1 - Requirements

  • Output folder: all exported files must be stored in the "ExportOutput" folder located in the NIM "Temp" repository. Files won't be processed if they are not in this folder.
  • Output name: all exported files must have a specific name prefix. The prefix is the identifier used when configuring the connection in NIM, followed by an underscore ("<ConnectionIdentifier>_").

Note: One file per extracted object type is required (users, groups, roles, etc.)

Example: given an application called DemoApp with users and roles, three files need to be generated:

  • /Temp/ExportOutput/DemoAppSync_Users
  • /Temp/ExportOutput/DemoAppSync_Roles
  • /Temp/ExportOutput/DemoAppSync_UserRoles

Example (HR): given an application called DemoHR with persons, departments, and user types, three files need to be generated:

  • /Temp/ExportOutput/DemoHRSync_Persons
  • /Temp/ExportOutput/DemoHRSync_Departments
  • /Temp/ExportOutput/DemoHRSync_UserTypes

The Person/Department and Person/UserType links can be extracted in the Users file.
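As an illustration, a hypothetical DemoHRSync_Persons file could carry the Person/Department and Person/UserType links as plain columns. All column names and values below are made up for the example and must be adapted to your own data model:

```csv
PersonId,FirstName,LastName,DepartmentId,UserTypeId,ManagerId
1001,Ada,Lovelace,D01,UT-EMP,1000
1002,Alan,Turing,D02,UT-CTR,1001
```

Keeping the links as foreign-key columns in the Persons file avoids generating separate link files for one-to-many relationships.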

For more details, please refer to the NIM documentation.

1.2 - Write the script

  1. Extract the data: In this part of the script you just need to extract all the data from the system (application, HR, etc.).
    Example: call a REST API to extract the list of users and permissions.
  2. Format the data: Once extracted, the data can be formatted by filtering unnecessary attributes or entries, flattening complex objects, etc.
  3. Export the data: For each entity, create a CSV file following these criteria:
    Encoding: "UTF-8"
    Delimiter: ","
    Path: "/Temp/ExportOutput/<ConnectionIdentifier>_<EntityName>"
  4. Encrypt the data: This step needs to be performed only if the "EncryptFile" option is enabled in the appsettings.json file.
    To encrypt a file using the configured encryption certificate, run this command at the end of the script:
    .\Usercube-Encrypt-File.exe -f $UsersFilePath $PermissionsFilePath $UserPermissionsFilePath -o $ExportOutputFolderPath
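Putting the four steps together, here is a minimal sketch of an export script in PowerShell. The API endpoint, token handling, attribute names, and local folder path are hypothetical and must be adapted to your application and NIM installation; only the output-folder/naming convention and the Usercube-Encrypt-File.exe call come from the requirements above:

```powershell
# 1. Extract: call a (hypothetical) REST API to get users and roles
$baseUrl = "https://demoapp.example.com/api"              # hypothetical endpoint
$headers = @{ Authorization = "Bearer $env:DEMOAPP_TOKEN" }
$users = Invoke-RestMethod -Uri "$baseUrl/users" -Headers $headers
$roles = Invoke-RestMethod -Uri "$baseUrl/roles" -Headers $headers

# 2. Format: keep only the attributes to be governed, flatten nested objects
$userRows = $users | ForEach-Object {
    [pscustomobject]@{
        Id          = $_.id
        Login       = $_.login
        DisplayName = $_.displayName
        Department  = $_.department.name   # flatten a nested object into a column
    }
}

# 3. Export: one UTF-8, comma-delimited CSV per entity, stored in Temp/ExportOutput
#    and prefixed with the connection identifier ("DemoAppSync" in this example)
$exportFolder  = "C:\NIM\Temp\ExportOutput"               # adapt to your NIM Temp repository
$usersFilePath = Join-Path $exportFolder "DemoAppSync_Users.csv"
$rolesFilePath = Join-Path $exportFolder "DemoAppSync_Roles.csv"
$userRows | Export-Csv -Path $usersFilePath -Delimiter ',' -Encoding UTF8 -NoTypeInformation
$roles    | Export-Csv -Path $rolesFilePath -Delimiter ',' -Encoding UTF8 -NoTypeInformation

# 4. Encrypt: only when "EncryptFile" is enabled in appsettings.json
# .\Usercube-Encrypt-File.exe -f $usersFilePath $rolesFilePath -o $exportFolder
```

Error handling (HTTP failures, empty result sets) should be added before the export step, since an empty or partial CSV would otherwise be synchronized as-is.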

You can find some examples below:

Export-TalentHR.ps1 (2.4 KB)
ExportDynamicsBusinessCenter.ps1 (6.4 KB)

2 - Configure the PowerShell connector

2.1 - Create the connector

  • Go to the Configuration section in the Identity Manager UI and select Connectors.
  • In the top right corner, click on the addition icon (+) to create a new connector.
  • Fill in the required fields:
    • Identifier: Unique, starts with a letter, contains only letters, numbers, or "-".
    • Name: Display name for the connector.
    • Agent: Select the agent that will connect to the target system.
    • Complete job: All checked
    • Incremental job: All checked
  • Click on +Create button

2.2 - Configure the connection

  • Create connections for each script (if more than one).
  • In the Connections section, click on the addition icon "+" to create a new Connection
  • Fill in the required fields:
  • Identifier: Unique, starts with a letter, contains only letters, numbers or `-`.
  • Name: Display name for the connection.
  • Package: click on "Select a Package", then search for the "Custom/PowerShellSync" package and click on "Select"
  • Fill in the connection settings
    • PowerShell Script Path: full or relative path to the PowerShell script (for relative paths the reference folder is Runtime)
  • Click on the Check Connection button to verify the script path
  • Click on the Test Script button to run the script and generate the CSV files for the first time. This allows NIM to discover the schema (entities & attributes)
  • Click on "Create & Close" in the top right
  • Repeat the steps for each script.

Note: This step marks the end of the customization phase. All subsequent configuration steps follow the standard Netwrix Identity Manager configuration process and apply uniformly to all connector types, whether standard, generic, or custom.

2.3 - Configure the Entity Types

  • Create entity types that correspond to your application data model (e.g., DemoApp_User, DemoApp_Role, etc.).
  • In the Entity Types section, click on the addition icon to create a new Entity Type
  • Fill in the required fields:
    • Identifier: Unique.
    • Name: Display name for the Entity Type.
    • Plural Display Name
  • Click on "Properties"
  • Select the Entity Type source from the "Source" list (on the left side)
    Note: If the list is empty or the source does not exist, make sure that the connector schema has been refreshed.
  • On the right side, click on "Map scalar properties". Select the list of attributes you want to map, then click on "Map selected columns"
  • Select the "Mapping key": this represents the primary unique key
  • Select the "Key properties": these represent secondary unique keys that can be used for query rules
  • Click on "Create & Close"
  • Click on "Reload" in the top left
  • Repeat the steps for each Entity Type

2.4 - Synchronize the Data

  • In the Entity Type section, click on "Jobs" then "Complete Mode > All Tasks"
  • On the left side, click on "Job Results" to check the connector's logs and ensure synchronization completes successfully (a new tab opens).
  • Go to Home. In the navigation panel (left side), click on "Connectors/<Connector>/<Entity Type>" (e.g., "Connectors/DemoApp/DemoApp Users") and verify that the data has been correctly synchronized

Hello Hazem,
Thank you so much for this, it is exactly what I needed.
Currently, we have some configuration at the customer that is not using the PowerShellSync connector but the CSV one.
In our daily synchronization, we have a task set to run a script to export the data to a CSV, and the CSV connector then reads it. The PowerShellSync connector is the better way of doing this.
To be honest, I couldn't make it work solely with the product documentation because of the output names. Thank you for the detailed explanation.
