Ensuring Records Managers can access MicroStrategy

I've got a MicroStrategy server environment with a Records Management group. I'd like to ensure that certain new Content Manager (CM) users always have access to MSTR, so that they have appropriate access to dashboards and reports. To keep this post simple I'll focus just on CM administrators.

Within MicroStrategy I'd like to create a new user and include them in a "Records Management" group, like so:

2018-05-05_8-48-44.png

This implementation of MSTR does not have an instance of the REST API available, so I'm limited to using Command Manager. My CM instance is in the cloud and I won't be allowed to install the CM client on the MSTR server. To bridge that divide I'll use a PowerShell script that leverages the CM ServiceAPI and the Invoke-Expression cmdlet.

First I need a function that gets me the list of CM administrators:

function Get-CMAdministrators {
    Param($baseServiceApiUri)
    # Ask the CM ServiceAPI for every administrator location, returning only the
    # properties needed to create a matching MSTR user
    $queryUri = ($baseServiceApiUri + "/Location?q=userType:administrator&pageSize=100000&properties=LocationLogsInAs,LocationLoginExpires,LocationEmailAddress,LocationGivenNames,LocationSurname")
    # Permit older SSL/TLS protocols in case the ServiceAPI endpoint still requires them
    $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
    [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
    $headers = @{
        Accept = "application/json"
    }
    # Add -UseDefaultCredentials here if the ServiceAPI site uses Windows authentication
    $response = Invoke-RestMethod -Uri $queryUri -Method Get -Headers $headers -ContentType "application/json"
    Write-Debug ($response | Out-String)
    if ( $response.TotalResults -gt 0 ) {
        return $response.Results
    }
    return $null
}
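
If I want to sanity-check what comes back before creating anything in MSTR, a quick dump of the logins works; this just reuses the same base URI and the Location properties requested in the query above:

# Quick sanity check: list the login and name for each returned administrator
$admins = Get-CMAdministrators -baseServiceApiUri "http://10.0.0.1/HPECMServiceAPI"
$admins | Select-Object @{ Name = 'Login'; Expression = { $_.LocationLogsInAs.Value } },
                        @{ Name = 'Name';  Expression = { $_.LocationSurname.Value + ', ' + $_.LocationGivenNames.Value } } |
    Format-Table -AutoSize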

Second I need a function that creates a user within MSTR:

function New-MstrUser {
    Param($psn, $psnUser, $psnPwd, $userName, $fullName, $password, $group)
    # Command Manager script that creates the user and drops them into the target group
    $command = "CREATE USER `"$userName`" FULLNAME `"$fullName`" PASSWORD `"$password`" ALLOWCHANGEPWD TRUE CHANGEPWD TRUE IN GROUP `"$group`";"
    $outFile = ""
    $logFile = ""
    try {
        # Write the script to a temporary file, then hand it to cmdmgr via Invoke-Expression
        $outFile = New-TemporaryFile
        Add-Content -Path $outFile -Value $command
        $logFile = New-TemporaryFile
        $cmdmgrCommand = "cmdmgr -n `"$psn`" -u `"$psnUser`" -f `"$outFile`" -o `"$logFile`""
        # Only pass -p when a project source password has been supplied
        if ($psnPwd) { $cmdmgrCommand += " -p `"$psnPwd`"" }
        Invoke-Expression $cmdmgrCommand
    } catch {
        Write-Warning "Failed to create MSTR user $userName - $_"
    }
    # Clean up the temporary script and log files
    if ( $outFile -and (Test-Path $outFile) ) { Remove-Item $outFile -Force }
    if ( $logFile -and (Test-Path $logFile) ) { Remove-Item $logFile -Force }
}
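
Before pointing it at every administrator, it's worth smoke-testing the function with a single throw-away account (the user name, full name, and password below are placeholders, not values from my environment):

# One-off test; remember to delete the test user afterwards
New-MstrUser -psn "MicroStrategy Analytics Modules" -psnUser "Administrator" -psnPwd "" `
             -userName "test.user" -fullName "User, Test" -password "test.user" `
             -group "Records Management"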

The last step is to tie them together:

$psn = "MicroStrategy Analytics Modules"
$psnUser = "Administrator"
$psnPwd = ""
$recordsManagementGroupName = "Records Management"
$baseUri = "http://10.0.0.1/HPECMServiceAPI"
$administrators = Get-CMAdministrators -baseServiceApiUri $baseUri
if ( $administrators -ne $null ) {
    foreach ( $admin in $administrators ) {
        New-MstrUser -psn $psn -psnUser $psnUser -psnPwd $psnPwd -userName $admin.LocationLogsInAs.Value -fullName ($admin.LocationSurname.Value + ', ' + $admin.LocationGivenNames.Value) -password $admin.LocationLogsInAs.Value -group $recordsManagementGroupName
    }
}

After running it, my users are created!

2018-05-05_10-03-31.png

Using GCP to alert CM administrators when things break

It's crazy how simple it is to set this up within GCP. You can even do this with your internally hosted CM servers (assuming you don't mind shipping your logs off to the cloud)! In a previous post I installed Stackdriver on my VM and showed how the logs appear within the UI.

I dare say that most of the time when CM breaks, an error entry is generated in the application event log. Stackdriver is sending me those entries. As shown below, here's an entry from CM stemming from me having (purposefully) moved a document store.

 
2018-04-25_22-09-37.png
 

Similar entries will be generated when the database goes down, when document stores fill up, and when IIS resets on me. More importantly, it's super easy to push TRIMWorkgroup log file entries into the event logs, and many integrations can have their log4net logging redirected to the event logs as well!
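
As a rough sketch of that last point, something like the following would tail a TRIMWorkgroup log and copy anything that looks like an error into the application event log, where Stackdriver then picks it up. The log file path, event source name, and match pattern are assumptions for my environment, so adjust to taste:

# Register an event source for the forwarded entries (one-off, needs elevation)
if (-not [System.Diagnostics.EventLog]::SourceExists("TRIMWorkgroupLog")) {
    New-EventLog -LogName Application -Source "TRIMWorkgroupLog"
}

# Tail the workgroup log and forward error lines to the application event log
Get-Content "C:\HPE Content Manager\Logs\TRIMWorkgroup.log" -Wait -Tail 0 |
    Where-Object { $_ -match "error" } |
    ForEach-Object {
        Write-EventLog -LogName Application -Source "TRIMWorkgroupLog" -EntryType Error -EventId 9000 -Message $_
    }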

From the logging interface I can select Logs-Based metrics on the left and then click Create Metric...

 
 

Next I gave it an easy-to-understand name and a set of filters that drill down to just the errors from the workgroup service. More advanced filters can be defined so that more (or less) is included; this serves my purposes for now.

 
 

After hitting save, I can select Create Alert from Metric....

 
2018-04-25_22-29-04.png
 

Here I just need to provide an interval which would trigger the alert...

 
2018-04-25_22-32-34.png
 

Once that's saved I can configure one or more notifications. Here I'm sending an email, but options exist for SMS, a Slack channel post, or smoke signal. Pretty slick.

 
 

After it was all saved I went and triggered an error by trying to open one of those missing documents.  Look what I got via email about a minute later!

 
2018-04-25_22-40-59.png
 

Time to start cranking out preset filters for customers!

Relocating a Document Store

I've recently migrated my dataset's database into a secure cloud environment. I've also prepared a workgroup server in the cloud and have registered the dataset on that server. When users try to open electronic documents, though, an error is generated.

download.png

The error message indicates the expected location of the electronic file, relative to the server. In this case I should be able to browse the file system on the old server and find this directory: "C:\HPE Content Manager\DocumentStore\47". When I look on the old server, though, I don't see it.

To sort this out I'll need to review the properties of the record and find the document store details.  That information is stored in the Document Details property.  I can view that from the Properties tab of the view pane, as shown below.

2018-04-23_21-32-49.png

To dig in further I'll need to go to the Administration ribbon and click Document Stores.

Then I can locate the document store referenced in the record's Document Details property and open its properties dialog.

Well, that doesn't seem right. The path highlighted does indeed exist. However, when I migrated my dataset into the cloud I also changed the dataset ID! The path indicated in the properties of the document store is just the starting path. Within that path will be a folder named after the ID of the dataset, and within that will be a folder named after the unique ID of the document store. I can find that ID by reviewing the view pane for the document store (you may need to customize the view pane and add that property).

2018-04-25_10-12-40.png

If I go back to the old server I should find a sub-folder within "45" (the old dataset ID) named "2"....

2018-04-25_10-14-40.png

Moving that folder onto my new server into the path "C:\HPE Content Manager\DocumentStore\47" should resolve the issue. After dropping the folder into that path, everything works!
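
I moved the folder by hand, but the copy can just as easily be scripted. A rough sketch, assuming the old server's drive is reachable over an admin share and the IDs are the ones shown above:

# Old layout: <store path>\<dataset ID>\<document store ID>, i.e. ...\45\2 on the old server
$source      = "\\OLD-WGSERVER\C$\HPE Content Manager\DocumentStore\45\2"
# New layout: same store path, but under the new dataset ID (47)
$destination = "C:\HPE Content Manager\DocumentStore\47"

New-Item -ItemType Directory -Path $destination -Force | Out-Null
Copy-Item -Path $source -Destination $destination -Recurse -Force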

The moral of the story is to be mindful when changing dataset IDs!

Also, it would be really nice to be able to use a Google Cloud Storage bucket as a document store!