Wednesday, June 24, 2015

QA testing of ServiceNow Data Sources, Import Sets and Transform Maps


Technically, Data Sources in ServiceNow are used to create intermediate import sets. Import set table data is then mapped to a production table; however, the import set data can be processed before it is mapped. Data for data sources can come in three ways: 1) a recognized file format, 2) data accessed via JDBC, and 3) data from an LDAP server.


Because the import set table acts as a staging area for records imported from data sources, its data can be manipulated to test the intended data before it is written to the production table via the respective transform map field mappings. To a ServiceNow expert these points look familiar and might seem simple. However, it has been observed many times that QA testers find these simple features difficult when testing newly created (or modified) data sources in ServiceNow.

Mostly, QA testers prefer to ask ServiceNow developers or admins to run the data source and its respective transform map, and then validate the results on the destination table using reports, list views, or the form layout. They may find testing the data source complex because:

·        Import sets can contain thousands of rows and can take a long time to execute

·        QA testers need to run the data source multiple times to validate different test cases and scenarios, which can take many hours to complete. Developers, by contrast, have more platform knowledge and know tricks to run and test only the intended rows quickly.

·        QA testers might need the admin or security_admin role to perform some of the steps, so they may prefer to have developers or admins carry out those steps instead


QA testers may find the steps below useful for testing ServiceNow data sources, provided they have elevated security_admin rights:

1.      Note down the current state of the tables or forms involved in the test, in the form of reports or screenshots

2.     Open the data source and select “Load All Records” (under Related Links) to run the data source and load all data rows

3.     If the data source involves a large amount of data, it takes more time to process. Once the State is Complete and the Completion Code is Success, go to Import Sets and note down the Import Set number.
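The row count for the loaded Import Set in this step can also be checked from “Scripts – Background”. A minimal sketch, assuming the hypothetical table name and Import Set number used later in this article:

```javascript
// Count the rows loaded into the import set table for one Import Set.
// 'u_test_import_assets' and 'ISET1234567' are the example values from this article.
var count = new GlideAggregate('u_test_import_assets'); // Import Set table name
count.addQuery('sys_import_set.number', 'ISET1234567'); // Import Set number
count.addAggregate('COUNT');
count.query();
if (count.next()) {
    gs.print('Rows loaded: ' + count.getAggregate('COUNT'));
}
```

The same count can be obtained from the list view, but a script avoids paging through thousands of rows.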



4.     Identify the rows in the Import Set that you want to test further.


5.     Once the rows from the Import Set have been identified, delete the other Import Set rows with the script below using “Scripts – Background”.  Note: prior to this, the security_admin elevated privilege needs to be activated.

For example, suppose the Data Source name is "Test Import Assets",
the Import Set table name is "u_test_import_assets",
and the Import Set number is "ISET1234567".

var gr = new GlideRecord('u_test_import_assets'); // Import Set table name
gr.addQuery('sys_import_set.number', 'ISET1234567'); // Import Set number
gr.addQuery('sys_import_state', 'pending');
// Delete Import Set rows except row numbers 944, 5612 and 8881
gr.addQuery('sys_import_row', 'NOT IN', '944,5612,8881');
gr.deleteMultiple();


6.     With the above script, unwanted rows are deleted from the Import Set table; the respective transform map for that Import Set can then be run to complete the data update in the production table.
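The transform in this step is normally run from the Transform Map form, but it can also be triggered from “Scripts – Background” using GlideImportSetTransformer. A sketch, assuming the same example Import Set number:

```javascript
// Transform the remaining staged rows of the example Import Set.
var importSet = new GlideRecord('sys_import_set');
if (importSet.get('number', 'ISET1234567')) { // the example Import Set number
    var transformer = new GlideImportSetTransformer();
    transformer.transformAllMaps(importSet); // runs every transform map for this import set table
    gs.print('Transform reported error: ' + transformer.isError());
}
```

transformAllMaps runs all transform maps defined for the import set table, which matches the UI behavior of the “Transform” related link.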

7.     Import Set row data can also be changed or manipulated to perform positive or negative testing, as the scenario requires.
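For example, a single staged row can be edited before the transform runs to simulate bad input. A sketch assuming the same example table, with u_asset_tag as a hypothetical source column on the import set table:

```javascript
// Overwrite one staged row to create a negative test case.
// 'u_asset_tag' is a hypothetical source column used for illustration.
var row = new GlideRecord('u_test_import_assets');
row.addQuery('sys_import_set.number', 'ISET1234567'); // Import Set number
row.addQuery('sys_import_row', 944); // one of the rows kept for testing
row.query();
if (row.next()) {
    row.u_asset_tag = ''; // blank out a mandatory field (negative test)
    row.update();
}
```

Running the transform afterwards should then exercise the transform map's handling of the bad value (for example, an error or ignored row).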

8.    With this approach, testing can be done with a smaller number of rows, so QA testers can run multiple tests in less time. Otherwise, the source data itself would need to be small for the data source run to execute and complete quickly.
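After the transform completes, the outcome of the kept rows can be spot-checked by counting import set rows per state. A sketch using the same example values:

```javascript
// Count how many rows of the example Import Set reached a given state
// after the transform ('inserted'; 'updated', 'ignored' and 'error' can be checked the same way).
var done = new GlideAggregate('u_test_import_assets');
done.addQuery('sys_import_set.number', 'ISET1234567'); // Import Set number
done.addQuery('sys_import_state', 'inserted');
done.addAggregate('COUNT');
done.query();
if (done.next()) {
    gs.print('Rows inserted: ' + done.getAggregate('COUNT'));
}
```

Comparing these counts against the reports or screenshots taken in step 1 closes the loop on the test.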

