Automating movement to cheaper storage

Now that I have an Azure Blob store configured, I want documents moved there after they haven't been used for a while.  I've previously shown how to do this manually within the client; now I'll show how to automate it.

The script is straightforward and follows these steps:

  1. Find the target store in CM
  2. Find all documents to be moved
  3. Move each document to the target store

Here's an implementation of this in PowerShell:

Clear-Host
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$LocalStoreName = "Main Document Store"
$AzureStoreName = "Azure Storage"
$SearchString = "store:$($LocalStoreName) and accessedOn<Previous Year"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
#fetch the store and exit if missing
$Tier3Store = $Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::ElectronicStore, $AzureStoreName)
if ( $Tier3Store -eq $null ) {
    Write-Error "Unable to find store named '$($AzureStoreName)'"
    exit
}
#search for records eligible for transfer
$Records = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Record
$Records.SearchString = $SearchString
Write-Host "Found $($Records.Count) records"
$x = 0
#transfer each record
foreach ( $Result in $Records )
{
    $Record = [HP.HPTRIM.SDK.Record]$Result
    $Record.TransferStorage($Tier3Store, $true)
    Write-Host "Record $($Record.Number) transferred"
    $x++
}
Write-Host "Transferred $($x) records"

I ran it and got the results below.  I forced it to stop after the first record for demonstration purposes, but you get the idea.

2017-12-07_21-11-57.png
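For reference, one way to force that early stop is to cap the transfer loop with the existing counter.  This is a hypothetical variation on the loop from the script above; the $Limit variable is my own addition, not part of the original script:

```powershell
# Hypothetical variation: stop after $Limit records.
# Useful for a dry run before processing the whole result set.
$Limit = 1
foreach ( $Result in $Records )
{
    $Record = [HP.HPTRIM.SDK.Record]$Result
    $Record.TransferStorage($Tier3Store, $true)
    Write-Host "Record $($Record.Number) transferred"
    $x++
    if ( $x -ge $Limit ) { break }
}
```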

Automatically Creating Alerts for Users

A question came up over on the forum about the possibility of having a notification sent to users when they've been selected in a custom property.  For instance, when a request for information is registered I might want to alert someone, and whoever is selected should get a notice.

2017-12-07_18-17-08.png

This behavior is really what the "assignee" feature is meant to handle: users check their in-trays to see these newly created items.  However, the email notification options related to that feature are fairly limited.  If I, as the user, create an alert, then I can receive a notification whenever a record has been created and assigned to me.

2017-12-07_18-17-37.png

Here are the properties for the alert...

2017-12-07_18-20-57.png
2017-12-07_18-21-06.png
2017-12-07_18-21-13.png

Now the challenge here is that administrators cannot access alerts for other users.  We could tell each user to create one themselves, but then I wouldn't have much to write about.  So instead I create another user and craft a script.

2017-12-07_18-25-23.png

The script needs to find all users that can log in and create an alert for each one that doesn't already have one.  To do that we'll have to impersonate those users, which requires that you either run the script as the service account or add the impersonating account in the Enterprise Studio options.  You can see the option I'm referencing below.

2017-12-07_18-03-31.png

This is how it appears after you've added the account...

Now I need a PowerShell script that finds all of the users and then connects as each one to search for an existing alert.  If it can't find one, it creates a new alert in the impersonated dataset.  This script can then be scheduled to run once every evening.  The audit logs will show that the impersonation happened, so there should be no concern regarding security.

Clear-Host
$LocationCustomPropertyName = "Alert Location"
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
$LocationCustomProperty = [HP.HPTRIM.SDK.FieldDefinition]$Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::FieldDefinition, $LocationCustomPropertyName)
#Find all locations that can login
$Users = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Location
$Users.SearchString = "login:* not uri:$($Database.CurrentUser.Uri)"
Write-Host "Found $($Users.Count) users"
foreach ( $User in $Users ) 
{
    try {
        #Impersonate this user so we can find his/her alerts
        $TrustedDatabase = New-Object HP.HPTRIM.SDK.Database
        $TrustedDatabase.TrustedUser = $User.LogsInAs
        $TrustedDatabase.Connect()
        Write-Host "Connected as $($TrustedDatabase.CurrentUser.FullFormattedName)"
        #formulate criteria string for this user 
        $CriteriaString = "$($LocationCustomProperty.SearchClauseName):[default:me]"
        #search using impersonated connection
        $Alerts = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch $TrustedDatabase, Alert
        $Alerts.SearchString = "user:$($User.Uri) eventType:added"
        #can't search on criteria so have to inspect each to see if already exists
        Write-Host "User $(([HP.HPTRIM.SDK.Location]$User).FullFormattedName) has $($Alerts.Count) alerts"
        $AlertExists = $false
        foreach ( $Alert in $Alerts ) 
        {
            if ( ([HP.HPTRIM.SDK.Alert]$Alert).Criteria -eq $CriteriaString ) 
            {
                $AlertExists = $true
            }
        }
        #when not existing we create it
        if ( $AlertExists -eq $false ) {
            $UserAlert = New-Object HP.HPTRIM.SDK.Alert -ArgumentList $TrustedDatabase
            $UserAlert.Criteria = $CriteriaString
            #$UserAlert.ChildSubscribers.NewSubscriber($TrustedDatabase.CurrentUser)
            $UserAlert.Save()
            Write-Host "Created an alert for $($User.FullFormattedName)"
        } else {
            Write-Host "Alert found for $($User.FullFormattedName)"
        }
    } catch [HP.HPTRIM.SDK.TrimException] {
        Write-Host "$($_)"
    }
}

When I run the script I get these results....

2017-12-07_18-33-09.png

Ah!  When I created Elmer I didn't give him an email address.   I could update this script to automatically create email addresses for users, exclude such users from the initial search (by adding "email:*" to the search string), or send a log from this script to an email address for manual rectification.
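For the second option, the tweak is a one-line change to the location search in the script above (a sketch; I'm assuming the "email:*" clause combines with the others as shown):

```powershell
# Hypothetical adjustment: only select users who have an email address,
# by adding the email:* clause to the existing search string.
$Users.SearchString = "login:* and email:* not uri:$($Database.CurrentUser.Uri)"
```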

For now I manually fix it and run the script again.

2017-12-07_18-40-09.png

If I run it a second time I can see it doesn't re-create it for him.

2017-12-07_18-40-51.png

Success!  Now just schedule this to run on a server once a day and you have a viable solution to the posted question.  Before implementing, you might adjust the script to target only those users who could ever be selected for the custom property (if that's possible).
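If you want to schedule it with the Windows Task Scheduler, a sketch might look like this.  The task name, script path, and account below are examples only, not values from this post:

```powershell
# Sketch: register a nightly run of the alert-creation script.
# All names and paths are examples; adjust for your environment.
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-ExecutionPolicy Bypass -File D:\Scripts\Create-UserAlerts.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 8pm
Register-ScheduledTask -TaskName "CM Create User Alerts" `
    -Action $Action -Trigger $Trigger -User "DOMAIN\cm-service"
```

Remember that the account the task runs under must be the service account, or must be listed as a trusted impersonator as described earlier.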

Blob Storage in 9.2

With the introduction of 9.2 there are two new types of document stores: 

 
2017-12-06_21-04-14.png

I'm going to focus on the Microsoft Azure Blob Store for now, as that's what I have access to.  Before continuing within Content Manager, I first flip over to Azure and create a new storage container.  Once that's been created I'll flip back over to CM.

After creating the container I go to the storage account and find the access keys.  From there I can copy the connection string to my clipboard and paste it into the new document store dialog.

2017-12-06_20-59-48.png
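If you prefer not to click around the portal, the key retrieval can be scripted with the Az PowerShell module.  This is a sketch only, assuming you're already signed in via Connect-AzAccount; the resource group and account names are made up for illustration:

```powershell
# Sketch: build the storage connection string from the first access key.
$ResourceGroup = "cm-storage-rg"   # example name
$AccountName   = "cmblobstore"     # example name
$Key = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -Name $AccountName)[0].Value
$ConnectionString = "DefaultEndpointsProtocol=https;AccountName=$AccountName;AccountKey=$Key;EndpointSuffix=core.windows.net"
$ConnectionString | Set-Clipboard  # ready to paste into the document store dialog
```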

Now back over in CM I can complete the details of the document store.

2017-12-06_21-08-49.png

Then I click Test to verify the settings...

2017-12-06_21-08-55.png

Then I create a new document record type that will use this document store...

2017-12-06_21-10-30.png

Now I can import a document and see what happens...

2017-12-06_21-12-14.png

Over in Azure I can see the file has been uploaded...

2017-12-06_21-14-55.png

I can manually transfer a few records to the document store from the electronic sub-menu of the context menu...

2017-12-07_2-15-59.png

When prompted I just need to pick the new document store...

2017-12-07_2-16-39.png

Alternatively, I could transfer the entire contents of any store into this one by selecting Transfer from the context menu of a document store.  The default options work well in most cases; you shouldn't need to change them.

2017-12-07_2-19-07.png

Lastly, I might want to use the Blob storage as a cheaper layer for older content.  For that I can use the tiered storage options, if available to me, and make this store storage tier 3.

2017-12-07_2-21-52.png

But I'll also update the main document store so that the dates are tracked (it was already at tier 1)...

2017-12-07_2-23-46.png

Now I can create a saved search that I'll run routinely to move records to cheaper cloud storage.  The search below finds all items stored in the main store but last accessed more than a year ago.

2017-12-07_2-26-38.png
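Based on the automation script at the start of this post, the search string behind that saved search would be along these lines:

```
store:Main Document Store and accessedOn<Previous Year
```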

I would tag all the resulting records and transfer them to storage tier 3.  This could be automated via a PowerShell script as well, as shown at the start of this post.  Lastly, workgroup server level document caching may be warranted when using blob storage.