Transform maps get plenty of use, but there are times when we need to trigger a transform on command, and quite frequently the trigger is an email. Suppose a client wants to email a data source into the instance with the expectation that it will be transformed automatically. There is a way to accomplish this, and a few pieces are needed:
- An inbound action that copies the email attachment and content to a staging table
- A business rule on the staging table that fires on insert and queues an event
- A script action, triggered by that event, that copies the attachment onto the data source and runs the scheduled import
In the past I have used this process for barcode scanning, CMDB upkeep, etc. The idea is to take the file from the email, copy it to the staging table, and replace the data source’s attached CSV file with the one received in the email. The process is:
Inbound Action -> Staging table record -> Insert Business Rule -> Script Action -> scheduled import
Step 1
First, create a staging table. I typically keep this table hidden, but it’s up to you whether you’d like to create a module for it. These are the fields I capture on the staging table, but you can gather only what’s relevant to you:
- Subject – String, 100 characters
- Body – String, 500 characters
- Sender – Reference to sys_user
- Date Received – Date/time field
- Email Reference – reference to sys_email
After creating the staging table, we can move on.
Step 2
Next, set up the inbound action. Create an inbound action that matches on the email you expect to hit the instance (by subject, sender, etc.). Have this inbound action create a record on your staging table, copying over any fields you feel are applicable. Ensure that attachments on the email are carried over. See the code snippet below for an example:
current.u_subject = email.subject;
current.u_body = email.body_text;
current.u_date_received = new GlideDateTime(); // current date/time
current.u_sender = email.from_sys_id; // sys_id of the matched sender
current.u_originating_email = sys_email.sys_id; // the sys_email record being processed
current.insert();
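If you choose to match on the subject line, the inbound action’s condition might look something like this (the subject string here is only a placeholder for whatever you and the sender agree on):

email.subject.toLowerCase().indexOf('weekly data feed') > -1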
Step 3
Create the transform map. This should perform the desired transform on the Excel spreadsheet that will be arriving via email. Ensure the transform map has a corresponding data source. After creating your transform map and data source, create a scheduled data import. Set the data source field on this scheduled data import to the one you just created. For “Run,” choose Periodically, and set the starting time to some point impossibly far in the future (e.g., the year 2600) so it never fires on its own.
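Transform maps are mostly field mappings, but if your spreadsheet needs any row-level logic, an onBefore transform script is the usual place for it. Here is a minimal sketch, assuming a hypothetical u_name column on the import set table:

// onBefore transform script: skip any row that arrived without a name
if (source.u_name.nil()) {
    ignore = true;
}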
You will need to add a few more lines to the inbound action you created in the previous step (add them at the END). These delete any previous attachment from the data source so stale files don’t accumulate:
// Remove the old file from the data source before the new one arrives
var dataload = new GlideRecord('sys_data_source');
dataload.get('SYS_ID_OF_DATA_SOURCE_HERE');
var attach = new GlideSysAttachment();
attach.deleteAll(dataload);
Step 4
At this point we’ve linked our inbound action to our staging table, but we have not linked the staging table to our scheduled import yet. Go to your staging table and create an insert business rule. This rule needs to invoke a script action rather than run the import directly, because of timing: if we attempted to invoke the scheduled data import right in this business rule, the transform would fail because the attachment has not yet been copied onto the data source!
Create a new event in the event registry, then queue that event from your business rule (a fuller sketch of the rule follows below):

gs.eventQueue('YOUR.EVENT.NAME', current);

Pass current as the second argument: the script action in the next step reads the staging record from it to find the attachment.
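A minimal sketch of the business rule script (an after-insert rule on the staging table, assuming YOUR.EVENT.NAME is the event you registered):

(function executeRule(current, previous /* null when async */) {
    // Queue the event; the script action in Step 5 will pick it up
    gs.eventQueue('YOUR.EVENT.NAME', current, gs.getUserID(), '');
})(current, previous);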
Step 5
Now we’re going to create a script action. It is invoked by the eventQueue call you added in the step above, and it is what triggers our scheduled data import (created in Step 3), which has our data source and transform map tied to it. Paste the following code into a new script action to invoke the scheduled data import on command, and make sure the script action listens for the same event you created in Step 4:
// Copy the email attachment from the staging record onto the data source
var dataSource = new GlideRecord('sys_data_source');
dataSource.get('SYS_ID_OF_DATA_SOURCE');
GlideSysAttachment.copy('U_STAGING_TABLE', current.sys_id.toString(), 'sys_data_source', dataSource.sys_id.toString());

// Look up the scheduled import by name and run it immediately
var importSet = new GlideRecord('scheduled_import_set');
importSet.get('name', 'SCHEDULED_IMPORT_SET_NAME');
if (typeof SncTriggerSynchronizer != 'undefined') {
    SncTriggerSynchronizer.executeNow(importSet);
}
The SncTriggerSynchronizer line is what actually invokes the import job. The GlideSysAttachment.copy portion copies the attachment from the staging record onto the data source (the old attachment was already deleted by the inbound action back in Step 3). Make sure you put your scheduled data import’s name where “SCHEDULED_IMPORT_SET_NAME” is indicated, your staging table’s name where “U_STAGING_TABLE” is, and your data source’s sys_id where “SYS_ID_OF_DATA_SOURCE” appears.
That completes the import process! If anything goes wrong, reach out to us here and we will help!