Automatically Creating Alerts for Users

A question came up over on the forum about the possibility of having a notification sent to users when they've been selected in a custom property. For instance, maybe when a request for information is registered I want to alert someone. Whoever is selected should get a notice.

2017-12-07_18-17-08.png

This behavior is really what the "assignee" feature is meant to handle. Users rely on their in-trays to see these newly created items. However, the email notification options related to that feature are fairly limited. If I, as the user, create an alert, then I can receive a notification whenever a record is created and assigned to me.

2017-12-07_18-17-37.png

Here are the properties for the alert...

2017-12-07_18-20-57.png
2017-12-07_18-21-06.png
2017-12-07_18-21-13.png

Now the challenge here is that administrators cannot access alerts for other users.  We could tell each user to do this themselves, but then I wouldn't have much to write about.  So instead I create another user and craft a script.

2017-12-07_18-25-23.png

The script needs to find all users who can log in and create an alert for each user who doesn't already have one. To do that we'll have to impersonate those users, which requires that you either run the script as the service account or add the impersonating account in the Enterprise Studio options. You can see below the option I'm referencing.

2017-12-07_18-03-31.png

This is how it appears after you've added the account...

Now I need a PowerShell script that finds all of the users and then connects as each one to search for an existing alert. If it can't find an alert, it creates a new one in the impersonated dataset. This script can then be scheduled to run once every evening. The audit logs will show that this impersonation happened, so there should be no security concerns.

Clear-Host
$LocationCustomPropertyName = "Alert Location"
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
$LocationCustomProperty = [HP.HPTRIM.SDK.FieldDefinition]$Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::FieldDefinition, $LocationCustomPropertyName)
#Find all locations that can login
$Users = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Location
$Users.SearchString = "login:* not uri:$($Database.CurrentUser.Uri)"
Write-Host "Found $($Users.Count) users"
foreach ( $User in $Users ) 
{
    try {
        #Impersonate this user so we can find their alerts
        $TrustedDatabase = New-Object HP.HPTRIM.SDK.Database
        $TrustedDatabase.TrustedUser = ([HP.HPTRIM.SDK.Location]$User).LogsInAs
        $TrustedDatabase.Connect()
        Write-Host "Connected as $($TrustedDatabase.CurrentUser.FullFormattedName)"
        #formulate criteria string for this user 
        $CriteriaString = "$($LocationCustomProperty.SearchClauseName):[default:me]"
        #search using impersonated connection
        $Alerts = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch $TrustedDatabase, Alert
        $Alerts.SearchString = "user:$($User.Uri) eventType:added"
        #can't search on criteria so have to inspect each to see if already exists
        Write-Host "User $(([HP.HPTRIM.SDK.Location]$User).FullFormattedName) has $($Alerts.Count) alerts"
        $AlertExists = $false
        foreach ( $Alert in $Alerts ) 
        {
            if ( ([HP.HPTRIM.SDK.Alert]$Alert).Criteria -eq $CriteriaString ) 
            {
                $AlertExists = $true
            }
        }
        #when not existing we create it
        if ( $AlertExists -eq $false ) {
            $UserAlert = New-Object HP.HPTRIM.SDK.Alert -ArgumentList $TrustedDatabase
            $UserAlert.Criteria = $CriteriaString
            #$UserAlert.ChildSubscribers.NewSubscriber($TrustedDatabase.CurrentUser)
            $UserAlert.Save()
            Write-Host "Created an alert for $($User.FullFormattedName)"
        } else {
            Write-Host "Alert found for $($User.FullFormattedName)"
        }
    } catch [HP.HPTRIM.SDK.TrimException] {
        Write-Host "$($_)"
    }
}

When I run the script I get these results...

2017-12-07_18-33-09.png

Ah! When I created Elmer I didn't give him an email address. I could update this script to automatically create email addresses for users, simply exclude such users from the initial search (by adding "email:*" to the search string), or send a log of the script's output to an email address for manual rectification.
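
If I went the exclusion route, the only change needed is the user search at the top of the script; the "email:*" clause simply requires that an email address be present:

$Users.SearchString = "login:* email:* not uri:$($Database.CurrentUser.Uri)"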

For now I manually fix it and run the script again.

2017-12-07_18-40-09.png

If I run it a second time I can see it doesn't re-create it for him.

2017-12-07_18-40-51.png

Success! Now just schedule this to run on a server once every day and you have a viable solution to the posted question. Before implementing it, you might adjust the script to target only those users who could ever be selected for the custom property (if that's possible).
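
Scheduling it is itself scriptable. Here's a minimal sketch using the built-in ScheduledTasks cmdlets; the script path, run time, and service account are hypothetical placeholders for your environment:

#run the alert script nightly under the service account (path and credentials below are placeholders)
$Action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -File C:\Scripts\Create-UserAlerts.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 8pm
Register-ScheduledTask -TaskName "CM Create User Alerts" -Action $Action -Trigger $Trigger -User "DOMAIN\cmservice" -Password "PlaceholderPassword"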

Blob Storage in 9.2

With the introduction of 9.2 there are two new types of document stores:

2017-12-06_21-04-14.png

I'm going to focus on the Microsoft Azure Blob Store for now, as that's what I have access to. Before continuing within Content Manager, I first flip over to Azure and create a new storage container.

After creating the container I go to the storage account and find the access keys. From there I can copy the connection string to my clipboard and paste it into the new document store dialog.

2017-12-06_20-59-48.png

Now back over in CM I can complete the details of the document store.

2017-12-06_21-08-49.png

Then I click Test to verify the settings...

2017-12-06_21-08-55.png

Then I create a new document record type that will use this document store...

2017-12-06_21-10-30.png

Now I can import a document and see what happens...

2017-12-06_21-12-14.png

Over in Azure I can see the file has been uploaded...

2017-12-06_21-14-55.png

I can manually transfer a few records to the document store from the electronic sub-menu of the context menu...

2017-12-07_2-15-59.png

When prompted I just need to pick the new document store...

2017-12-07_2-16-39.png

Alternatively, I could transfer the entire contents of any store into this one by selecting Transfer from the context menu of a document store. The default options are best; you shouldn't need to change the others.

2017-12-07_2-19-07.png

Lastly, I might want to use Blob storage as a cheaper storage layer for older content. For that I should use the tiered storage options, if available to me, and make this store storage tier 3.

2017-12-07_2-21-52.png

But I'll update the main document store so that access dates are tracked (it was already at level 1)...

2017-12-07_2-23-46.png

Now I can create a saved search that I'll run routinely to move records to cheaper cloud storage. Note that the search below finds all items held in the main store but last accessed more than a year ago.

2017-12-07_2-26-38.png

I would tag all the resulting records and transfer them to storage tier 3. This could be automated via a PowerShell script as well. Lastly, workgroup-server-level document caching may be warranted when using Blob storage.
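
As a starting point for that automation, here's a minimal sketch reusing the SDK search pattern from the alert script above. The clause names in the search string ("store" and "accessed") are assumptions you should verify against your version's search grammar, and the tier transfer itself is left as a comment since I'd mirror whatever the client does:

Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
#find records held in the main store but not accessed within the last year (clause names assumed)
$Records = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Record
$Records.SearchString = "store:Main accessed<Previous Year"
foreach ( $Record in $Records )
{
    #report each candidate; the actual move would repeat the tier 3 transfer performed in the client
    Write-Host "Tier 3 candidate: $(([HP.HPTRIM.SDK.Record]$Record).Title)"
}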

Monitoring the CM Elasticsearch Index

Here are a couple of different ways to monitor your new CM 9.2 Elasticsearch instance.


via Chrome

Visit the Chrome Web Store and add the ElasticSearch Head extension.

2017-12-05_22-25-23.png

Then, when you launch the extension, you can update the server address. With the visuals I can easily see that the unassigned shards are what's pushing my cluster health to yellow.

2017-12-05_17-01-51.png

You can also browse the data in the index...

2017-12-05_17-04-30.png

via Head Stand-alone Server

This offers the same functionality as what's shown above, but the interface is served by a stand-alone server operating on port 9100. You'd access it from any browser at http://localhost:9100. Use this option if you can't use Chrome.
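
Getting the stand-alone server running requires Node.js. A minimal sketch following the elasticsearch-head project's usual setup (verify against the README at https://github.com/mobz/elasticsearch-head):

git clone https://github.com/mobz/elasticsearch-head.git
cd elasticsearch-head
npm install
npm run start
#then browse to http://localhost:9100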


via PowerShell

Install Elastico from an elevated PowerShell prompt.

2017-12-05_9-54-45.png

You have to trust the repository in order to continue. If you're in a secure environment, visit the GitHub page for the source and manually install the module instead.
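
For reference, the install and trust steps boil down to two PowerShellGet commands; trusting the gallery is optional if you'd rather answer the prompt each time:

#optionally mark the gallery as trusted so the prompt doesn't appear
Set-PSRepository -Name PSGallery -InstallationPolicy Trusted
#install the module for all users (hence the elevated prompt)
Install-Module -Name Elastico -Scope AllUsers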

2017-12-05_9-55-34.png

Now I can run a command to check the overall cluster's health...

2017-12-05_10-06-05.png

Another command to check the status of the indexes...

2017-12-05_10-08-45.png

Note that in both instances I'm calling a script intended for v5 even though I'm using v6. I can actually run any version of the command, as they all seem to be forward and backward compatible. It probably still makes the most sense to run the one intended for the most recent version of ES.
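
If you'd rather not depend on a module at all, the same checks are just REST calls. A minimal sketch using Invoke-RestMethod, assuming Elasticsearch is listening on the default port 9200 on localhost:

#overall cluster health
Invoke-RestMethod -Uri "http://localhost:9200/_cluster/health" | Format-List
#status of each index
Invoke-RestMethod -Uri "http://localhost:9200/_cat/indices?format=json" | Format-Table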

I can also perform a search. I went to the client and found the record I worked with in my last post (where I had used Kibana as a front-end to the ES index), then searched for it via PowerShell.

2017-12-05_10-21-31.png

via Kibana

When creating your index pattern, keep in mind that the default naming convention for the CM content index starts with "hpecm_".  You could use that, or just a plain asterisk, to configure your access.  If you wish to use the Timelion feature then you should also pick the date registered field as the time filter.

2017-12-05_22-05-46.png

When you then click Discover you can explore the index.

2017-12-05_22-10-35.png

You can pick a different time period that might show some results...

2017-12-05_22-11-18.png

If I can get some data back, then I know the system is at least partially working. Kibana doesn't really give much detail about the internal workings of the cluster or index, though.

2017-12-05_22-12-40.png