Destroy this without any delay!

If the retention schedule says "Keep no longer than 3 years", establishing an effective process within Content Manager becomes tricky.  We don't want to keep the record any longer than necessary, but we also need to be able to run it through management processes for approval.  How do we accomplish that review without tacking another year onto the life-cycle of the record?

As shown below, trigger properties now have an option for "without any delay".  Triggers also now support a set number of days (previously only years and months).

2017-10-11_15-39-09.png

This approach is most effective when storing active records within Content Manager.  Once a user creates a record, the schedule is attached and the Make Inactive trigger immediately calculates the 3-year mark.  Later the user, or a power user, can move the record to an inactive container if it's known to no longer be needed (again, think of the "no longer than 3 years" retention schedule).

Otherwise, a routine saved search looking for active-to-inactive trigger changes would surface the record.  We'd send it to the owning unit for review and approval.  They can either move it into an inactive container, change the disposition themselves, or change the schedule.

I now find it useful to have a saved search for these types of schedules:

trigger:[delayType:2]

Another saved search finds all of the records using a schedule that has one of these triggers:

schedule:[trigger:[delayType:2]]

A saved search to give me all of the records to be destroyed using this approach:

schedule:[trigger:[delayType:2]] destroyOn:1/1/1900 - today

A saved search to give me all records to be destroyed next month:

schedule:[trigger:[delayType:2]] destroyOn:next month

A saved search to give me all of the records to be made inactive next month using this approach:

schedule:[trigger:[delayType:2]] inactiveOn:next month

Checking who's searching for what

Did you know the latest version of Content Manager allows you to store user queries in the workgroup server logs?  Imagine extracting from these logs a treasure trove of metadata-searching tips and tricks, or insight into actual product usage.  You can get quick reports of who searched for what, and when.

To get this going you need to modify the properties of your workgroup server(s).

2017-10-11_22-35-29.png

The workgroup server has two properties that we need to enable: 

  1. Enable logging on next deployment
  2. Add user query strings to workgroup server log
2017-10-11_22-34-36.png

Once you click OK you have to save and deploy.

2017-10-11_22-37-10.png

This forces the server(s) to re-initialize using the new configuration.  I'll then see log files accumulating in the local server's application data directory.  Using Windows Explorer, I can navigate to it as shown in the image below.

2017-10-11_22-39-24.png

If I open that file within Notepad++ I can see a lot of information.  I just want to focus on the search queries, though.  The search itself is surrounded by the pipe character (|).  Here's what my log looked like after a few quick searches.

2017-10-11_22-45-25.png

PowerShell is such a handy thing!  I created a script that manages these log files and extracts what I need.  I use a dictionary to track users, their queries, and relational details about the queries themselves.  For instance, if the user searched by title, notes, or any word, I look up other TRIM-indexed words.  When the user doesn't use asterisks, I calculate what would have happened if they did.  I can then use this for targeted one-on-one training, or to guide the creation of updated training materials.
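As a starting point, here's a minimal sketch of the extraction step.  It assumes the only pipe-wrapped values on a log line are the query strings; the log folder path is a placeholder, and extending this to track the user and the word-index look-ups described above follows the same pattern.

# Minimal sketch: pull the pipe-delimited queries out of the workgroup server logs
# and tally how often each one appears.  The folder path below is hypothetical;
# point it at your own workgroup server's log directory.
$logFolder = 'C:\HPE Content Manager\ServerData\Logs'
$queryCounts = @{}

Get-ChildItem -Path $logFolder -Filter '*.log' | ForEach-Object {
    # Assumption: the query is the text wrapped in pipe characters on each log line
    Select-String -Path $_.FullName -Pattern '\|([^|]+)\|' -AllMatches | ForEach-Object {
        foreach ($m in $_.Matches) {
            $query = $m.Groups[1].Value
            if (-not $queryCounts.ContainsKey($query)) { $queryCounts[$query] = 0 }
            $queryCounts[$query]++
        }
    }
}

# Quick report: each distinct query and how many times it was run
$queryCounts.GetEnumerator() | Sort-Object Value -Descending |
    ForEach-Object { '{0,5}  {1}' -f $_.Value, $_.Key }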

2017-10-11_23-01-58.png

Take the first entry, where I searched for "cli".  I was able to see that there were four indexed words containing "cli" (clinic, clinics, clinic-, and clinton).  Those 4 words are used a total of 57 times (not necessarily on unique records).  Yet I, as the user, received 0 results.  Had I used wildcards, I would have received results.

The really nifty part is that PowerShell has a ConvertTo-Json command.  If I output the results to a JSON file within my WebDrawer instance, I can consume all of this new information in other places: within WebDrawer itself, during an audit, or as part of a health check.
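Building on the sketch above, the export itself is just a short pipeline.  The output path under the WebDrawer site is a placeholder, so point it wherever your instance serves static content from:

# Sketch: write the collected results out as JSON so other tools can read them
$queryCounts | ConvertTo-Json -Depth 5 |
    Set-Content -Path 'C:\inetpub\wwwroot\WebDrawer\searchstats.json'   # hypothetical path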

Speaking of WebDrawer and indexed words...

2017-10-11_23-23-55.png

Don't let OLE DB slow you down

Out-of-the-box sounds good to most, but to a trained ear it screams "poor performance".  It could be the root cause of users receiving a "timed out" response when searching, or a sluggish response when scrolling through search results.  A quick way to try to resolve this poor performance is to use a native driver.

When creating a dataset within the Content Manager Enterprise Studio we can pick either SQL Server or Oracle as our database platform.  Making a selection tells the solution which flavor of SQL to use, but not which driver to use.  You get to pick that on the Connection tab, as shown below.

Connection tab of my dataset properties

The very first thing written in that text box is the provider (driver) to be used when connecting to the database.  You can see here I'm using the SQLOLEDB.1 provider.  That's the default.  If I click the blue KwikSelect icon I can see the Data Link Properties dialog.

Initial view of connection string properties

Clicking the Provider tab shows all of the available drivers.  The highlighted one is the one currently selected for the connection string.  So "SQLOLEDB.1" corresponds to the "Microsoft OLE DB Provider for SQL Server".  That's the generic provider that comes with Windows.  It works, but it won't contain the unique features and refinements of your SQL Server version.

Since I have the SQL Server Native Client available, I should be picking that!

List of drivers available on the local machine

Picking the native client forces me to click Next and then re-enter my database details.  That's because I'll have new options and features available for my connection.  The interface changes slightly (and more options are available on the Advanced tab).  I'll just re-enter what I had before and click OK.
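For illustration, the provider swap shows up right at the front of the connection string.  The server and database names below are made up, and your Native Client provider name may differ by version (SQLNCLI11.1 corresponds to SQL Server Native Client 11.0), so treat these as examples rather than exact values:

Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=CMDataset;Data Source=SQLSERVER01

Provider=SQLNCLI11.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=CMDataset;Data Source=SQLSERVER01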

2017-10-11_0-43-06.png

Now if I save and deploy, all is well.  If I have multiple workgroup servers I have to deploy the native client on all of them.  I also have to match the native client version to the SQL Server instance build (you should not mix builds).  The same goes for Oracle environments.

Don't take my word for it though.  Try it in your development environment and see for yourself (use a load test tool to simulate DB activity).