Automatically Creating Alerts for Users

A question came up over on the forum about the possibility of having a notification sent to users when they've been selected in a custom property.  For instance, maybe when a request for information is registered I want to alert someone.  Whoever is selected should get a notice.

2017-12-07_18-17-08.png

This behavior is really what the "assignee" feature is meant to handle.  Users can use their in-trays to see these newly created items.  However, the email notification options related to that feature are fairly limited.  If I, as the user, create an alert, then I can receive a notification that a record has been created and assigned to me.

2017-12-07_18-17-37.png

Here are the properties for the alert...

2017-12-07_18-20-57.png
2017-12-07_18-21-06.png
2017-12-07_18-21-13.png

Now the challenge here is that administrators cannot access alerts for other users.  We could tell each user to do this themselves, but then I wouldn't have much to write about.  So instead I create another user and craft a script.

2017-12-07_18-25-23.png

The script needs to find all users that can log in and create an alert for each user who doesn't have one.  To do that we'll have to impersonate those users, which requires that you either run the script as the service account or add the impersonating account in the Enterprise Studio options.  You can see the option I'm referencing below.

2017-12-07_18-03-31.png

This is how it appears after you've added the account...

Now I need a PowerShell script that finds all of the users and then connects as each one to search for an existing alert.  If it can't find an alert, it creates a new one under the impersonated connection.  This script can then be scheduled to run once every evening.  The audit logs will show that this impersonation happened, so there should be no concern regarding security.

Clear-Host
$LocationCustomPropertyName = "Alert Location"
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
$LocationCustomProperty = [HP.HPTRIM.SDK.FieldDefinition]$Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::FieldDefinition, $LocationCustomPropertyName)
#Find all locations that can login
$Users = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Location
$Users.SearchString = "login:* not uri:$($Database.CurrentUser.Uri)"
Write-Host "Found $($Users.Count) users"
foreach ( $User in $Users ) 
{
    try {
        #Impersonate this user so we can find his/her alerts
        $TrustedDatabase = New-Object HP.HPTRIM.SDK.Database
        $TrustedDatabase.TrustedUser = $User.LogsInAs
        $TrustedDatabase.Connect()
        Write-Host "Connected as $($TrustedDatabase.CurrentUser.FullFormattedName)"
        #formulate criteria string for this user 
        $CriteriaString = "$($LocationCustomProperty.SearchClauseName):[default:me]"
        #search using impersonated connection
        $Alerts = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch $TrustedDatabase, Alert
        $Alerts.SearchString = "user:$($User.Uri) eventType:added"
        #can't search on criteria so have to inspect each to see if already exists
        Write-Host "User $(([HP.HPTRIM.SDK.Location]$User).FullFormattedName) has $($Alerts.Count) alerts"
        $AlertExists = $false
        foreach ( $Alert in $Alerts ) 
        {
            if ( ([HP.HPTRIM.SDK.Alert]$Alert).Criteria -eq $CriteriaString ) 
            {
                $AlertExists = $true
            }
        }
        #when not existing we create it
        if ( $AlertExists -eq $false ) {
            $UserAlert = New-Object HP.HPTRIM.SDK.Alert -ArgumentList $TrustedDatabase
            $UserAlert.Criteria = $CriteriaString
            #$UserAlert.ChildSubscribers.NewSubscriber($TrustedDatabase.CurrentUser)
            $UserAlert.Save()
            Write-Host "Created an alert for $($User.FullFormattedName)"
        } else {
            Write-Host "Alert found for $($User.FullFormattedName)"
        }
    } catch [HP.HPTRIM.SDK.TrimException] {
        Write-Host "$($_)"
    } finally {
        #release the impersonated connection before moving to the next user
        if ( $TrustedDatabase ) { $TrustedDatabase.Dispose() }
    }
}

When I run the script I get these results...

2017-12-07_18-33-09.png

Ah!  When I created Elmer I didn't give him an email address.  I could update this script to automatically create email addresses for users, simply exclude them from the initial search for users (by adding "email:*" to the search string), or send a log from this script to an email address for manual rectification.
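
The exclusion approach, for example, is just a one-line change to the user search near the top of the script:

#only target users that can log in and have an email address
$Users.SearchString = "login:* email:* not uri:$($Database.CurrentUser.Uri)"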

For now I manually fix it and run the script again.

2017-12-07_18-40-09.png

If I run it a second time I can see it doesn't re-create the alert for him.

2017-12-07_18-40-51.png

Success!  Now just schedule this to run on a server once every day and you have a viable solution to the posted question.  Before implementing it you might adjust the script to target only those users who would ever be selected for the custom property (if that's possible).
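
For instance, the scheduling could be done with the built-in ScheduledTasks cmdlets.  Here's a minimal sketch, assuming the script is saved at a path of your choosing and runs under an account allowed to impersonate (the task name, path, and account below are placeholders):

$Action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File D:\Scripts\Create-UserAlerts.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 10pm
#register the task to run under the service account (replace the placeholder credentials)
Register-ScheduledTask -TaskName "Create User Alerts" -Action $Action -Trigger $Trigger -User "DOMAIN\ServiceAccount" -Password "ThePassword"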

Blob Storage in 9.2

With the introduction of 9.2 there are two new types of document stores: 

2017-12-06_21-04-14.png

I'm going to focus on the Microsoft Azure Blob Store for now, as that's what I have access to.  Before continuing within Content Manager, I first flip over to Azure and create a new storage container.  Once that has been created I'll flip back over to CM.
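
If you prefer the command line to the Azure portal, the Azure CLI can create the container as well; a quick sketch, where the container name, account name, and key are placeholders:

#create a blob container in an existing storage account (replace the placeholder values)
& az storage container create --name cmdocuments --account-name mystorageaccount --account-key "<access-key>"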

After creating the container I then go to the storage account and find the access keys.  From there I can copy the connection string to my clipboard and paste it into the new document store dialog.

2017-12-06_20-59-48.png
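
If you haven't seen one before, an Azure storage connection string takes this general shape (the account name and key below are placeholders):

DefaultEndpointsProtocol=https;AccountName=<storage-account-name>;AccountKey=<access-key>;EndpointSuffix=core.windows.net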

Now back over in CM I can complete the details of the document store.

2017-12-06_21-08-49.png

Then I click Test to verify the settings...

2017-12-06_21-08-55.png

Then I create a new document record type that will use this document store...

2017-12-06_21-10-30.png

Now I can import a document and see what happens...

2017-12-06_21-12-14.png

Over in Azure I can see the file has been uploaded...

2017-12-06_21-14-55.png

I can manually transfer a few records to the document store from the electronic sub-menu of the context menu...

2017-12-07_2-15-59.png

When prompted I just need to pick the new document store...

2017-12-07_2-16-39.png

Alternatively, I could transfer the entire contents of any store into this one by selecting Transfer from the context menu of a document store.  The default options suit most situations; you shouldn't need to change the others.

2017-12-07_2-19-07.png

Lastly, I might want to use the blob storage as a cheaper storage layer for older content.  The tiered storage options are the way to do that, if available to me.  I'll make this store storage tier 3.

2017-12-07_2-21-52.png

I'll also update the main document store so that the dates are tracked (it was already at level 1)...

2017-12-07_2-23-46.png

Now I can create a saved search that I'll run routinely to move records to cheaper cloud storage.  Note the search below finds all items in the main store that were last accessed more than a year ago.

2017-12-07_2-26-38.png

I would tag all the resulting records and transfer them to storage tier 3.  This could be automated via a PowerShell script as well.  Lastly, workgroup server level document caching may be warranted when using blob storage.
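
A rough sketch of that automation might look like the following.  The store name, the saved search name, and the saved-search clause are placeholders, and the SetStore call is an assumption; check the SDK documentation for your version for the actual member that moves content between stores.

Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
#find the target store by name ("Azure Blob Store" is a placeholder)
$TargetStore = [HP.HPTRIM.SDK.ElectronicStore]$Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::ElectronicStore, "Azure Blob Store")
#run the saved search created earlier ("StaleMainStore" is a placeholder name)
$Records = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch $Database, Record
$Records.SearchString = "saved:StaleMainStore"
foreach ( $Record in $Records )
{
    #hypothetical call; the exact member for re-homing a document may differ in your SDK version
    ([HP.HPTRIM.SDK.Record]$Record).SetStore($TargetStore)
    $Record.Save()
}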

Creating a PostgreSQL Dataset

Creating a dataset that uses PostgreSQL is just as easy as one that uses SQL Server.  Here I'll step through the process and show screenshots of the various steps.  Although I'm using a locally installed PostgreSQL database, the process for a remote one is the same.

2017-12-04_21-40-10.png

Identifying the dataset requires entering a name and ID.  Then I select the PostgreSQL dataset type and click Next.

2017-12-04_21-40-50.png

Next I click the KwikSelect icon...

2017-12-05_8-41-11.png

Then I pick the appropriate driver.  Note that if you select the ANSI driver you will still be forced to enable Unicode characters (you cannot uncheck the option later in the wizard), so you should pick the Unicode driver.

2017-12-05_8-40-44.png

If you don't see any drivers then you need to run the Application Stack Builder and install the appropriate ones.  As you can see below, I've already installed the drivers I'll need.

2017-12-05_8-42-09.png

The connection string is not much different from SQL Server's.  Check with your database administrator to verify that your driver is correct.

2017-12-04_21-46-01.png

Clicking OK will then enter the connection string into the dialog.  You can change the string manually if you need to for some reason, though you cannot change the password, as it's encrypted here.

Encrypted password
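
For reference, a psqlODBC connection string generally looks something like this (the server, database, and credentials below are placeholders):

Driver={PostgreSQL Unicode};Server=localhost;Port=5432;Database=cmdataset;Uid=postgres;Pwd=<password>;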

At the last step in the configuration process I made sure to uncheck the GIS columns option, since I don't currently have PostGIS set up and configured.

2017-12-04_22-15-28.png

The creation of the database finished without error.

2017-12-04_22-17-22.png

Now I can inspect the schema that's been created.  The exact same number of tables exists in PostgreSQL as within SQL Server.

2017-12-05_8-36-53.png
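
If you'd rather confirm the table count from the command line, a quick query through psql will do it.  This assumes psql is on your PATH and that the dataset tables landed in the public schema; adjust the user and database names for your environment.

& psql -U postgres -d cmdataset -c "SELECT count(*) FROM information_schema.tables WHERE table_schema = 'public';"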

From within the client you can see the configuration in the setup information dialog.

2017-12-05_8-49-58.png

The schema manager feature in the Enterprise Studio behaves the same, with all the same options.  To verify that, I manually removed some indexes; running a check should highlight this.

2017-12-05_8-54-56.png

As I hoped, the feature works and tells me I'm missing some indexes.

2017-12-05_8-56-44.png

You can also migrate from SQL Server to PostgreSQL by using the export feature in the Enterprise Studio.

2017-12-05_9-11-27.png

However, take care to plan ahead.  Any dataset created with the GIS options will require that PostGIS be installed locally.  The header of the export will include a message indicating as much.

2017-12-05_9-13-18.png