Discovery Patterns for AWS

Patterns are a new and interesting thing in ServiceNow.

Discovery Patterns

This is a work in progress, but I wanted to share it, as otherwise I will never get this down on paper.

First of all, let me preface this with what I had to do to trigger the AWS pattern I wanted to use.

Also, here's a link to the Docs.

Set up

This post and this YouTube video helped me understand how to get started.

  1. Create a Process Classifier.
  2. Create a new Web Service/Region/Logical Data Center.
  3. From the AWS account record, create the discovery schedule.
  4. Update the MID Server capabilities to include ALL.

Understanding patterns

Patterns use Groovy as a backend to parse any scripts in the steps.

That means you have no access to normal script includes to change the data.

That said, you can do most of what you did in probes/sensors in the pattern and its pre/post script handlers.

Patterns are limited to the following actions:

| Step | Comments |
| --- | --- |
| Library Reference | This seems like it allows repeatable steps |
| Match | ? |
| Get Process | ? |
| LDAP Query | ? |
| SNMP Query | ? |
| WMI Method Invocation | ? |
| WMI Query | ? |
| Parse Command Output | ? |
| Parse File | ? |
| Parse Variable | This is what is used to map a payload to fields |
| Create Relation/Reference | ? |
| Filter Table | ? |
| Merge Table | ? |
| Transform Table | This is what is used to set fields if extra processing needs to be done |
| Union Tables | ? |
| Change User | ? |
| Find Matching URL | ? |
| Parse URL | ? |
| Put File | ? |
| Set Parameter Value | This allows you to set reference-able variables in EVAL scripts and other fields |
| Unchange User | ? |
| Cloud REST Call | This allows SN to make REST calls via the MID Server to get data for this pattern |

Pre / Post Processing values

These are records on [sa_pattern_prepost_script] where you can massage the data before and after it is written to the database. This post was helpful in my understanding of it.

If you're working with the payload, it contains all the found responses. If you want to change an "account id" into a reference (sys_id) to an account table, you'll need to iterate over the items array and update each account_id.

This is my understanding of the payloadObj structure (each item carries its target CI class plus a "values" map of fields, which is why the script below reads item.values.u_account):

{
    items: [
        {
            className: "cmdb_ci_thing",
            values: {
                name: "the name",
                u_account: "account id you set here"
            }
        }
    ]
}
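As a sanity check, the structure can be exercised in plain JavaScript outside of ServiceNow; the class name and field values here are illustrative placeholders:

```javascript
// Minimal sketch of the payload shape handed to pre/post scripts.
// Field names and values are placeholders, not real discovery output.
var payload = JSON.stringify({
  items: [
    {
      className: "cmdb_ci_thing",
      values: {
        name: "the name",
        u_account: "account id you set here"
      }
    }
  ]
});

var payloadObj = JSON.parse(payload);

// Each item carries its target CI class plus a "values" map of fields.
payloadObj.items.forEach(function (item) {
  console.log(item.className, item.values.u_account);
});
```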

I handled this with a pre-sensor script:

/*
 * 1. Pre sensor: you can change the payload before it is processed by the Identification Engine.
 *    Use IEJsonUtility to add relevant information to the payload.
 *    Input parameters in pre sensor mode: payload, patternId
 * 2. Post sensor: you can update/add missing info to the DB based on the result (JSON)
 *    from the Identification Engine.
 *    Output parameters in post sensor mode: payload
 */

var rtrn = {};

// Parse the JSON string into an object.
var payloadObj = JSON.parse(payload);

// Business logic: swap the raw AWS account id for the sys_id of the
// matching aws_account_admin record.
var handleAccountData = function() {
    gs.log(JSON.stringify(payloadObj, '', ' '), 'AWS Service Account ID to GR');
    var returnStr = "Did not replace account value";
    for (var i = 0; i < payloadObj.items.length; i++) {
        var item = payloadObj.items[i];
        if (item.className === "cmdb_ci_cloud_database") {
            var account_id = item.values.u_account;
            var saGR = new GlideRecord('aws_account_admin');
            if (saGR.get('account_id', account_id)) {
                item.values.u_account = saGR.getValue('sys_id');
                returnStr = "Did replace account value";
            }
        }
    }
    return returnStr;
};

rtrn = {
    'status': {
        'message': handleAccountData(),
        'isSuccess': true
    },
    'patternId': patternId,
    'payload': JSON.stringify(payloadObj)
};

// You can return a message and a status on top of the input variables that you MUST return.
// Returning the payload as a JSON string is mandatory in a pre sensor script, and optional in a post sensor script.
// If you want to stop the payload processing due to your business logic, set isSuccess to false.
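GlideRecord only exists inside ServiceNow, but the replacement logic itself is plain JavaScript, so it can be sanity-checked locally with a mocked lookup. This is a sketch under that assumption: the in-memory map below stands in for the aws_account_admin query, and the account id and sys_id values are made up.

```javascript
// Plain-JS sketch of the account-id swap, with the GlideRecord query
// replaced by an in-memory map (assumption: account_id -> sys_id,
// standing in for a lookup against aws_account_admin).
var accountSysIds = {
  "123456789012": "abcdef0123456789abcdef0123456789"
};

function handleAccountData(payloadObj) {
  var returnStr = "Did not replace account value";
  for (var i = 0; i < payloadObj.items.length; i++) {
    var item = payloadObj.items[i];
    if (item.className === "cmdb_ci_cloud_database") {
      var sysId = accountSysIds[item.values.u_account];
      if (sysId) {
        // Swap the raw account id for the (mocked) record sys_id.
        item.values.u_account = sysId;
        returnStr = "Did replace account value";
      }
    }
  }
  return returnStr;
}

var payloadObj = {
  items: [
    { className: "cmdb_ci_cloud_database", values: { u_account: "123456789012" } }
  ]
};

console.log(handleAccountData(payloadObj)); // "Did replace account value"
console.log(payloadObj.items[0].values.u_account);
```

Running the function mutates the payload in place, the same way the pre-sensor script mutates payloadObj before handing it back to the Identification Engine.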