Performance Problems in Lithnet

An associate asked, “How could there be performance problems for something like Lithnet?”

Here’s my response:

MIM has an unsupported .NET library which exposes low-level calls to retrieve separate bits of data from the MIM internal data structures.  Lithnet uses these calls and compiles together complete results for a call like “Get-PendingImports”.  Lithnet has to make many calls to get the information it needs: probably one for the list of connector space object IDs, and then, for each connector space object, one call each for:

  • the object metadata (DN, object type etc.)
  • the attribute hologram
  • the pending import metadata, attribute delta and hologram
  • the unapplied (pending) export metadata, deltas and hologram
  • the escrowed (unconfirmed) export metadata, delta and hologram
  • and so on

Unfortunately calls like “Get-PendingImports” return objects with all of that data (yes, even the pending export data), and that takes time to retrieve, so if there are a lot of pending operations it can be quite time-consuming.
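
To see what that costs in practice you can simply time the call.  A hedged sketch (the cmdlet name is the one mentioned above; the MA name and the way it is passed are assumptions, so check the cmdlet’s help in your environment):

# Hypothetical timing check - measures how long one "get everything" style call takes.
# "HR MA" and the -MA parameter are placeholders; check Get-Help for the real syntax.
Measure-Command { $pending = Get-PendingImports -MA "HR MA" }
$pending.Count   # connector space objects with pending imports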

CSExport.exe isn’t much better, but it is a little bit more controllable than the Lithnet library, so it can be used to improve the performance slightly.  The best approach would be to use the underlying .NET library directly and only make the absolute minimum number of calls to get just the data I actually need.  That’s a road I haven’t gone down… yet.

Audit drop files don’t require any .NET calls to get the update information, but there are three big problems with them.  Firstly, they’re expensive to generate and write out to disk.  Secondly, for a Full Import they include all records, so you can’t tell which ones are adds, updates or deletes unless you compare the entire data set to the existing connector space, which is incredibly slow.  Thirdly, even for deltas some MAs just send a drop file with the whole object and “replace” operations (i.e. update if exists, add if not), so again you can’t tell whether it’s an update or an add operation.

Here endeth today’s lesson; thanks be to Microsoft.  AMIM.

Unpopulated sync rule “External System Resource Type” drop-down for FIM/MIM Generic LDAP MA

In order to populate the External System Resource Type drop-down on the Create/Edit Synchronization Rule form of the MIM Portal, the client retrieves the ma-data object from the MIM Service using a WCF call.  In certain cases (e.g. for Generic LDAP MAs connected to large back-end data sources such as Oracle Internet Directory (OID)) the object returned can be very large – and if it exceeds the maximum allowed size of a WCF call (14MB by default) then the call will fail and the External System Resource Type drop-down will be left blank/empty/not populated.

The solution to this is to edit the Portal’s web.config file and increase the limit to something larger (e.g. 50MB).

To find the web.config file (usually something like C:\inetpub\wwwroot\wss\VirtualDirectories\80\web.config), run IIS Manager, right-click the MIM Portal site and choose “Explore”:

Near the end of the file (usually), add maxReceivedMessageSizeInBytes="52428800" to the resourceManagementClient element:
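
After the edit, the element should end up looking something like this (the “…” stands for whatever attributes are already on it, which stay as they are):

<resourceManagementClient ... maxReceivedMessageSizeInBytes="52428800" />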

A web site or IIS restart will probably be required to activate this change!

More info?
https://docs.microsoft.com/en-us/previous-versions/mim/ff800821(v=ws.10)
https://identityunderground.wordpress.com/2014/11/26/new-hotfix-rollup-released-for-fim2010-r2-now-build-4-1-3613-0/

Various MIM, Broker & UNIFYNow conventions explained

This post is a consolidated collection point for notes about MIM, UNIFYBroker and UNIFYNow conventions that should be followed, with explanations why.

UNIFYNow

MIM Run Profile Operation Names

Convention

In UNIFYNow, operations that run MIM profiles should always have the name of the MA and the Run Profile being invoked clearly evident.

Rationale

When MIM run profiles are recreated, or ‘Refresh MA’ is clicked in UNIFYNow (either intentionally or accidentally) the run profile GUIDs on operations can end up being incorrect.  When this happens it may be necessary or expeditious to simply re-select the appropriate MAs and run profiles from the GUI, in which case having the name of the MA and run profile in the operation name will be extremely beneficial.

In Event Broker 4 (an older version of UNIFYNow) there is a known issue where this can happen unexpectedly when a server is rebooted in certain, rare conditions.  The exact trigger for this behaviour is unknown.  It may be related to starting Event Broker on a system where the WMI Service is not running, but this is by no means certain.

Sharepoint and MIM

Installing MIM and all its bits (especially Sharepoint) is SOOO easy… if you get *every* *single* *thing* *right*.  But make even one tiny mistake and you’ll be debugging it for hours.

Here are my self-reminder notes, covering the strange things I’ve found, and the things I seem to get wrong every time:

  • The prerequisites can be installed by a domain admin if that works out easier.
  • Pre-requisite problems?
    • Add-WindowsFeature NET-HTTP-Activation,NET-Non-HTTP-Activ,NET-WCF-Pipe-Activation45,NET-WCF-HTTP-Activation45,Web-Server,Web-WebServer,Web-Common-Http,Web-Static-Content,Web-Default-Doc,Web-Dir-Browsing,Web-Http-Errors,Web-App-Dev,Web-Asp-Net,Web-Asp-Net45,Web-Net-Ext,Web-Net-Ext45,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Health,Web-Http-Logging,Web-Log-Libraries,Web-Request-Monitor,Web-Http-Tracing,Web-Security,Web-Basic-Auth,Web-Windows-Auth,Web-Filtering,Web-Digest-Auth,Web-Performance,Web-Stat-Compression,Web-Dyn-Compression,Web-Mgmt-Tools,Web-Mgmt-Console,Web-Mgmt-Compat,Web-Metabase,WAS,WAS-Process-Model,WAS-NET-Environment,WAS-Config-APIs,Web-Lgcy-Scripting,Windows-Identity-Foundation,Xps-Viewer -verbose
    • Turn on .NET 3.5 feature
  • Install Sharepoint (and create the farm and CA) as the MIMAdmin user, even if it has to be temporarily made a local admin in order to do so.
  • Get the installers from https://my.visualstudio.com/Downloads
  • Get the license key for testbed installs from https://my.visualstudio.com/ProductKeys
  • MIMAdmin needs sysadmin rights in SQL.
  • When building a single node testbed with ADDS on the same server, remember that “local” group memberships are configured in the Builtin container in AD.
  • Make sure that the MIMSharepoint user has the “Log on as a service” right.  Then make sure it doesn’t have the “Deny log on as a service” right.
  • Always do the post-installation configuration (Farm and Central Admin creation) in a PowerShell ISE window that has been started with Run As Administrator.
  • Once the permissions and rights issues have all been solved, usually the farm still can’t be created due to persistent rubbish.  Remove Sharepoint altogether, delete the databases, reboot and reinstall it from scratch.
  • After a failed attempt to install the MIM Service + Portal, it’s often necessary to delete and recreate the whole Sharepoint WebApp before trying again.  I’ve even had to resort to uninstalling and reinstalling Sharepoint entirely!
  • This is certainly a weird one – when installing MIM on a domain controller it looks like the MIM Service account needs to have local admin privileges, otherwise MIM Sync cannot impersonate it when connecting to the FIMService database during setup of the MIM Service MA.  Without that, it just reports “The credentials provided for accessing Forefront Identity Manager are invalid” in the Forefront Identity Manager Management Agent event log.

And here are my helpful scripts for the Sharepoint configuration steps:

Create-SPFarm-example.ps1

Create-MIMPortalSPWebApp-example.ps1
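
Those scripts aren’t reproduced here, but this is the rough shape of the two steps.  Every name, account, URL and passphrase below is a placeholder (not the contents of the attached scripts), so adjust to suit your environment:

# --- Farm and Central Admin creation (rough sketch, placeholders throughout) ---
Add-PSSnapin Microsoft.SharePoint.PowerShell

# On SharePoint 2016/2019 this call also needs -LocalServerRole (e.g. SingleServerFarm).
New-SPConfigurationDatabase -DatabaseName "SP_Config" -DatabaseServer "SQLSERVER01" `
    -AdministrationContentDatabaseName "SP_AdminContent" `
    -Passphrase (ConvertTo-SecureString "FarmPassphrase1!" -AsPlainText -Force) `
    -FarmCredentials (Get-Credential "DOMAIN\MIMSharepoint")
Install-SPHelpCollection -All
Initialize-SPResourceSecurity
Install-SPService
Install-SPFeature -AllExistingFeatures
New-SPCentralAdministration -Port 2013 -WindowsAuthProvider "NTLM"
Install-SPApplicationContent

# --- MIM Portal web application and site collection (rough sketch) ---
New-SPManagedAccount -Credential (Get-Credential "DOMAIN\MIMSharepoint")
$appPool = Get-SPManagedAccount -Identity "DOMAIN\MIMSharepoint"
New-SPWebApplication -Name "MIM Portal" -ApplicationPool "MIMAppPool" `
    -ApplicationPoolAccount $appPool -AuthenticationMethod "Kerberos" `
    -Port 80 -Url "http://mim.customerx.local"
New-SPSite -Url "http://mim.customerx.local" -Template "STS#1" `
    -OwnerAlias "DOMAIN\MIMAdmin" -Name "MIM Portal"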

*** This Is A Work In Progress ***

UNIFYBroker-based AD Provisioning Model

Background

The UNIFYBroker-based Active Directory provisioning model uses a Powershell connector in Broker to action “once-off” operations for an AD user account, such as:

  • Account creation
  • Account deletion
  • Mailbox enable/disable

The Broker Powershell connector has the following configuration components:

  • “Import All” – e.g. pre-load all existing users from AD
  • “Import Changes” – e.g. identify and load newly created users from AD, and remove any users that have been deleted from AD
  • “Add Entities” – called when a new user record is created in MIM, to trigger and perform creation of the required AD account
  • “Update Entities” – called when user attributes are updated in MIM, to trigger any required actions (such as enabling/disabling an Exchange mailbox)
  • “Delete Entities” – called when a user record is deleted in MIM, to trigger and perform deletion of the corresponding AD account

The Powershell connector can also be used for AD objects other than users.

[CUSTOMERX] Solution

A key element of the user lifecycle at [CUSTOMERX] is the “Identity State” of each user, which is derived from various user attributes (e.g. HR contract start date, manual disable flag, user last login timestamp, etc) and drives the values of various other attributes (e.g. AD Distinguished Name, account disable/enable state, exchange mailbox enable/disable, etc).  There are similar attributes (“Group State” and “Contact State”) for AD groups and AD mail contacts.

The following documents the specific actions performed by Broker connectors:

AD User connector

  • Import All: load all user objects from AD (Get-ADUser command)
  • Import Changes: identify new and deleted user objects in AD (Get-ADObject command)
  • Add Entities: create AD user account (New-ADUser command)
  • Update Entities: depending on the Identity State attribute (from MIM), enable or disable the user’s Exchange mailbox (Enable-Mailbox and Disable-Mailbox commands via a remote PSSession to the Exchange server; see the sketch after these connector descriptions)
  • Delete Entities: delete AD user account (Remove-ADObject command)

AD Group connector

  • Import All: load all group objects from AD (Get-ADGroup command)
  • Import Changes: identify new and deleted group objects in AD (Get-ADObject command)
  • Add Entities: create AD group object (New-ADGroup command)
  • Update Entities: depending on the Group State attribute (from MIM), hide or show the group’s address in the GAL (Set-DistributionGroup command via a remote PSSession to the Exchange server)
  • Delete Entities: delete AD group object (Remove-ADObject command)

AD Contact connector

  • Import All: load all contact objects from AD (Get-ADObject command)
  • Import Changes: identify new and deleted contact objects in AD (Get-ADObject command)
  • Add Entities: create AD contact object (New-ADObject command)
  • Update Entities: depending on the Contact State attribute (from MIM), mail enable the contact object (Set-MailContact command via a remote PSSession to the Exchange server)
  • Delete Entities: delete AD contact object (Remove-ADObject command)

These three Broker connectors each have a corresponding Broker adapter, but share one “Provisioning” MA within MIM.  Please refer to the Provisioning MA configuration and the Powershell connector implementation for important aspects of the solution (e.g. required minimum attributes that must be exported from MIM for use when creating the various AD objects).
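
All three connectors’ “Update Entities” steps run their Exchange commands over a remote PSSession.  A minimal sketch of that shared pattern, assuming a hypothetical Exchange connection URI and using placeholder variables for the state and account name:

# Hedged sketch of the remote Exchange PSSession pattern (user connector shown).
# $identityState and $samAccountName stand in for values supplied by MIM via Broker.
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "http://exchange01.customerx.local/PowerShell/" -Authentication Kerberos
try {
    # Import only the cmdlets this connector needs.
    Import-PSSession $session -CommandName Enable-Mailbox, Disable-Mailbox -AllowClobber | Out-Null

    if ($identityState -eq "Active") {
        Enable-Mailbox -Identity $samAccountName
    } else {
        Disable-Mailbox -Identity $samAccountName -Confirm:$false
    }
}
finally {
    # Always release the session (see "Issues Encountered and Key Learnings" below).
    Remove-PSSession $session
}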

Issues Encountered and Key Learnings

The following important lessons were learned during implementation.

“Import All” cannot be treated as a periodic baseline as per our normal MIM best practices, because when run it resets (nulls) all connector attribute values other than those it sets directly.  This means that the next MIM export operation (after a MIM full import/sync to identify that the attributes have been cleared; a delta import/sync does not notify MIM of the changes) will have to restore all export attribute values (i.e. Identity/Group/Contact State, in the [CUSTOMERX] case), and this will re-trigger any “Update Entities” code for all objects (i.e. attempt to re-enable or re-disable every user’s Exchange mailbox, in the [CUSTOMERX] case).

“Import Changes” functionality must be implemented and cannot be set to “None”.  Following on from the previous paragraph: because regular baselining of the connector is problematic and best avoided, an incremental synchronisation method is needed to keep the connector in sync with the source AD data.  If this is omitted then objects that are created directly in AD by other systems will not be visible in the connector (and consequently won’t be created in the MIM “Provisioning” MA, leading to failed provisioning requests from MIM when it attempts to create the already-existing AD object).
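
A minimal sketch of what an “Import Changes” implementation for the user connector might look like.  The watermark handling is simplified (Broker would persist it between runs) and the filters are assumptions, not the production code:

# Hedged sketch of "Import Changes" for the AD user connector.
# $since would normally be a watermark persisted by the connector between runs.
Import-Module ActiveDirectory
$since = (Get-Date).AddHours(-1)

# Users created or changed since the watermark (filter simplified).
$changed = Get-ADObject -Filter 'objectClass -eq "user" -and whenChanged -ge $since' -Properties whenChanged

# Users deleted since the watermark (requires access to deleted objects / the AD Recycle Bin).
$deleted = Get-ADObject -Filter 'isDeleted -eq $true -and whenChanged -ge $since' `
    -IncludeDeletedObjects -Properties whenChanged, isDeleted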

The default timeout for MIM export profile runs to ECMA2 connectors such as Broker is 60 seconds, and the default batch size is 5,000 objects.  At [CUSTOMERX], Broker was unable to process requests from MIM fast enough, leading to timeouts which manifest in MIM as “cd-error” errors on each exported object with no other useful information logged anywhere.  Therefore it is highly recommended that the timeout be significantly increased.  It may also be effective to decrease the batch size, but this approach was not tested and is therefore unconfirmed.

Errors during export (e.g. problems during development of the Broker Powershell connector code, or transient errors when executing commands from it) cause MIM to report errors against the exported objects, even though Broker may actually have saved the updated attribute value correctly.  If a subsequent export attempt is made then Broker returns an “Other” error on that second attempt, with the error detail “Internal error #9 (Cannot add value)”.  Debugging this situation and identifying its cause can be difficult, since it only occurs on a second export, after a previous error has been encountered for a different reason.  Consequently it is recommended that all Broker Powershell “Update Entities” code include error checking, be wrapped in exception handling, and use the Failed Operations mechanism (“$components.Failures.Push($entity)”) to correctly record operation failures.  Refer to the Broker Powershell connector documentation for details of this mechanism.  It is arguably a bug in Broker that it fails to correctly handle the second export’s attribute update.
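
A hedged sketch of the shape that error handling might take (the loop variable and the real work are placeholders; the Failures.Push call is the mechanism named above):

# Wrap each entity's update in try/catch and push failures back to Broker.
foreach ($entity in $entities) {    # $entities: placeholder for the collection Broker hands to "Update Entities"
    try {
        # ... perform the real AD/Exchange update for this entity here ...
    }
    catch {
        # Record the failure so Broker reports it to MIM rather than half-applying the change.
        $components.Failures.Push($entity)
    }
}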

When using PSSessions in the Powershell connector to perform operations remotely, be sure to call Remove-PSSession to free the session as soon as it is no longer required.  Failure to do so may contribute to Broker misbehaviour, although this is unconfirmed.

Quick Guide to Common MIM Export Errors

“cd-error” with no other details – a timeout from the MIM side.  Increase the export profile’s timeout (“Operation Timeout (s)” on the third screen of the Run Profile wizard – NOT the Custom Data “Timeout (seconds)” on the second)

“Other” Internal error #9 (Cannot add value) – a previous export failed and Broker updated the attribute but reported the error back to MIM, which then retries the export; this confuses Broker because it has already saved a value for this attribute.

If you are having other errors during export, remove your powershell code altogether and leave just an empty stub in place that does nothing (not even load a blank external script).  Try running your entire export (full data set) and make sure it works correctly first.  Then try some delta exports (just a few records) and make sure they work.  Once you’re sure that mechanism is working correctly, put your code back in gradually: add functionality bit by bit, making sure you run both the entire export (full data set) and the delta export (just a few records) each time.

Changing case/capitalisation on Identity Broker attribute names

UNIFY’s Identity Broker keeps old attribute names (both connector and adapter) in the ‘CollectionKey’ SQL table.  The attribute’s name (the ‘Caption’ column) is a unique key in that table.

Even if you delete an attribute from every connector and adapter in Broker, it still keeps that attribute name record.  Even if you delete all data for the attribute!  It’s there for good, with whatever capitalisation you first used.

If you try to use that same attribute name with a different capitalisation (e.g. in a Rename adapter transformation), Broker does a case-sensitive search for that attribute name’s CollectionKey record and doesn’t find it (because the case doesn’t match).  So it tries to create one… but this fails because the SQL unique key on the Caption column is case-insensitive!  The error is:

Violation of UNIQUE KEY constraint 'DF_CollectionKey_Caption'.
Cannot insert duplicate key in object 'dbo.CollectionKey'.
The duplicate key value is (xxxx).

Solutions to this predicament are:

  1. Don’t ever change the capitalisation of the attribute name (avoidance)
  2. If you have to change it (e.g. because you got it wrong in the first place) then go into the SQL table and change the Caption to whatever capitalisation you want 🙂 (see the sketch below)
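
For option 2, something along these lines works (server and database names are placeholders and the attribute names are just an example; take a backup first):

# Hedged sketch: fix the stored capitalisation directly in the CollectionKey table.
Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "IdentityBroker" -Query @"
UPDATE dbo.CollectionKey
SET Caption = 'employeeID'      -- the capitalisation you want from now on
WHERE Caption = 'EmployeeId';   -- the capitalisation Broker originally stored
"@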

Failure creating new ECMA2 MA: Exception from HRESULT: 0x8023072A

While creating a new ECMA2 MA, I was faced by this error: “Exception from HRESULT: 0x8023072A”

The only helpful info I found was this, buried deep in https://winprotocoldoc.blob.core.windows.net/productionwindowsarchives/WinArchive/[MS-UPSDBDAP].pdf

0x8023072A extension-dll-noimplementation
The management agent extension does not contain a class that implements the extension interface(s).

Turns out the DLL I was using (UNIFY Identity Broker) was an old ECMA one, not ECMA2.

MIMWAL code for DateTime midnight in local timezone

I needed to calculate a DateTime value for midnight in the local time zone, for comparison against other date fields in my policy.  It’s painfully difficult to do, but just barely possible.  Here’s my solution:

DateTimeNow() -> $now_utc
Mid(DateTimeFormat($now_utc, "zz"), 1, 2) -> $timezone_offset
ConvertToNumber(DateTimeFormat($now_utc, "HH")) -> $utc_hours
DateTimeFromString(DateTimeFormat($now_utc, "dd/MM/yyyy"), "en-AU") -> $midnight_utc
Subtract(24, ConvertToNumber($timezone_offset)) -> $rollover_hour
IIF(LessThan($utc_hours, $rollover_hour), "0.00:00:00.0", "1.00:00:00.0") -> $timezone_day_fix
Concatenate("-0.", $timezone_offset, ":00:00.0") -> $timezone_hour_fix
DateTimeAdd(DateTimeAdd($midnight_utc, $timezone_hour_fix), $timezone_day_fix) -> $midnight_local

Please note that this only works for positive integer timezones (such as Australia/East) but at least it handles daylight savings correctly.  For non-integer timezones, the $rollover_hour calculation would need to be extended, and for negative timezones something else would probably need to be done but I haven’t thought it through properly.

To be honest it would be a lot easier and more efficient to just write a custom activity!

Criteria-based sets and “prior to today” evaluation in the Portal vs the MIM Service

So… maybe you have a set like this:

You might wonder exactly what “prior to today” actually means.  Especially when it comes to timezone handling.  And if you’re not wondering that, well you probably should be 🙂

First of all, remember that all your DateTimes are (or at least should be) stored in UTC inside the MIM Service database.  That’s the sensible approach, and the Portal will format your DateTimes correctly based on the configured timezone if you do.

Anyway, back in the Portal, you’re clicking the “View Members” button.  It’s not hard to work out that “prior to today” seems to actually mean “prior to now” there.  Specifically, the UTC time in your attribute needs to be before the current UTC time.  I’m not entirely sure whether that’s the web browser’s UTC time or the server’s UTC time, but in any sensible environment they’ll be the same anyway.

Here’s a quick look at the underlying Filter attribute to get a better understanding of that:
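
If you’d rather pull it out with PowerShell than read it off a screenshot, LithnetRMA (used again at the end of this post) can fetch it for you; the set name here is a placeholder:

(Search-Resources -XPath "/Set[DisplayName='My Lovely Set']" -AttributesToGet @("Filter")).Filter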

So it’s all about fn:current-dateTime().  Cool.

So far so sensible.  Now we know that temporal sets like this only get re-evaluated when the FIM_MaintainSetsJob runs on the SQL server.  And here’s where the pain sets in – a user with accessExtensionExpiryDate prior to the current time appeared as a member just fine in the Portal, but when I ran the SQL agent job to recalculate my set membership the user object didn’t get added to the set.

After quite a bit of trial and error it appears that the SQL Agent job’s definition of “prior to now” is “before 1AM today in the local server time”.  I can sort-of understand a decision to do the evaluation in local time – at some time someone decided that the job should just do the sensible thing, since “prior to today” inherently needs to know what “today” means, and using local server time is probably what MIM developers want, rather than “today” in the UTC sense.  But the 1am threshold?  That makes basically no sense to me.  I can only guess it’s because FIM_MaintainSetsJob is configured to run at 1am by default.  If that’s the case then hard-coding the definition of “today” to the time when a SQL job runs by default isn’t my idea of sensible programming 🙁

No wonder temporal sets cause headaches!

Addendum: if you really want to see what’s in your Set, LithnetRMA is your friend:

(Search-Resources -XPath "/Set[DisplayName='My Lovely Set']" -AttributesToGet @("ComputedMember")).ComputedMember

Powershell titlecasing an email address, with awesome

Sometimes, awesome:

$split = $address -split "@"
$address = ([regex]"\w+").Replace($split[0], { $args[0].ToString().Substring(0, 1).ToUpper() + $args[0].ToString().Substring(1).ToLower() }) + "@" + $split[1]

Handles james.o’tootle@mydomain.com -> James.O’Tootle@mydomain.com.  Does not handle james.mcnally@hardwork.com -> James.McNally@hardwork.com, though… sorry but semantic parsing is beyond the scope of this blog 🙂