gflewis / sndml3
ServiceNow Data Mart Loader: an application to load SQL databases from ServiceNow
License: MIT License
yaml file: wm_order.yaml
tables:
Command: java -jar sndml-3.2.0-mysql.jar -p profile.txt -y wm_order.yaml
23:58:57 INFO [main] RestTableAPI wm_order PROCESS: getStats count=7375 query="sys_created_on<2020-11-15 22:58:55" createdRange=[2020-05-05 07:28:38,2020-11-13 10:36:05]
23:58:57 INFO [main] LoaderJob wm_order INIT: partition=DatePartition[interval=WEEK size=29 min=2020-04-30 max=2020-11-19]
23:58:57 INFO [main] LoaderJob wm_order INIT: begin sync wm_order (7375 rows)
23:58:57 INFO [main] Synchronizer wm_order.W2020-11-12 INIT: begin compare
23:58:57 INFO [main] RestTableAPI wm_order.W2020-11-12 PROCESS: getStats count=5 query="sys_created_on>=2020-11-12^sys_created_on<2020-11-19"
23:58:57 INFO [main] Synchronizer wm_order.W2020-11-12 INIT: compare identified 0 inserts, 5 updates, 0 deletes, 0 skips
23:58:57 INFO [main] Synchronizer wm_order.W2020-11-12 PROCESS: Inserting 0 rows
23:58:57 INFO [main] Synchronizer wm_order.W2020-11-12 PROCESS: Updating 5 rows
23:58:58 INFO [main] DatabaseUpdateWriter wm_order PROCESS: loaded 5 / 5 (5 / 5)
23:58:58 INFO [main] Synchronizer wm_order PROCESS: Deleting 0 rows
23:58:58 INFO [main] Synchronizer wm_order.W2020-11-05 INIT: begin compare
23:58:58 INFO [main] RestTableAPI wm_order.W2020-11-05 PROCESS: getStats count=6 query="sys_created_on>=2020-11-05^sys_created_on<2020-11-12"
23:58:59 INFO [main] Synchronizer wm_order.W2020-11-05 INIT: compare identified 0 inserts, 6 updates, 0 deletes, 0 skips
23:58:59 INFO [main] Synchronizer wm_order.W2020-11-05 PROCESS: Inserting 0 rows
23:58:59 INFO [main] Synchronizer wm_order.W2020-11-05 PROCESS: Updating 6 rows
23:58:59 INFO [main] DatabaseUpdateWriter wm_order PROCESS: loaded 6 / 6 (6 / 6)
These columns are not included when the table is created. However, if they are added manually (via ALTER TABLE), then they will be populated.
I am trying to load our cmdb_ci table, which has around 11M records, into the DB. We are now using multiple concurrent instances of SNDML to load parts of the table, and each instance is also partitioned by week/day. The load speed is fine, but when it finishes it has not loaded all the records: only about 8.7M of them.
This is an example of our YAML file (there are more YAML files; only the created parameter differs):
metrics: Path\To\Metrics\File.metrics
tables:
- {name: cmdb_ci, target: cmdb_ci_load, pagesize: 500, action: insert, created: [2020-01-01, 2020-04-01], partition: day, threads: 64}
For random queries against the table we get this RestTableReader warning in the log:
22:11:03 WARN RestTableReader [pool-3-thread-17] cmdb_ci.D2023-12-15 PROCESS: Expected 26143 rows but processed 26119 rows
22:11:03 WARN DatePartitionedTableReader [pool-3-thread-17] cmdb_ci.D2023-12-15 FINISH: Expected 26143 rows but only processed 26119 rows
These warnings also appear at random in every running instance of SNDML.
I also get a random JsonRequest warning when there is a "Time limit exceeded" error in the response. This is an example of the query that triggers the error:
sys_created_on%3E%3D2023-11-19%5Esys_created_on%3C2023-11-20%5Esys_id%3E4fc2139287aa355007ff542bbbbb35ca%5EORDERBYsys_id
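For readability, the percent-encoded query above can be decoded with the Java standard library. This is just a convenience sketch for inspecting the log, not part of SNDML:

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

public class QueryDecode {
    // Decodes a percent-encoded sysparm_query string back into ServiceNow's
    // encoded-query syntax (^ separators, > and < operators).
    public static String decode(String encoded) {
        return URLDecoder.decode(encoded, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String q = "sys_created_on%3E%3D2023-11-19%5Esys_created_on%3C2023-11-20"
                 + "%5Esys_id%3E4fc2139287aa355007ff542bbbbb35ca%5EORDERBYsys_id";
        // Prints: sys_created_on>=2023-11-19^sys_created_on<2023-11-20^sys_id>4fc2139287aa355007ff542bbbbb35ca^ORDERBYsys_id
        System.out.println(decode(q));
    }
}
```

The decoded form shows this is a keyset-paging query ordered by sys_id, which matters when diagnosing why "Time limit exceeded" responses cause rows to be skipped.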
What can I do in this situation so that it loads all the records from the table? Loading just the majority is not enough for us.
The delete record count in the metrics file is inaccurate following a sync
Exception in thread "main" org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00 at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2422) at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2167) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306) at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441) at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365) at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155) at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132) at servicenow.datamart.DatabaseUpdateStatement.update(DatabaseUpdateStatement.java:38) at servicenow.datamart.DatabaseUpdateWriter.writeRecord(DatabaseUpdateWriter.java:29) at servicenow.datamart.DatabaseTableWriter.processRecords(DatabaseTableWriter.java:47) at servicenow.api.RestTableReader.call(RestTableReader.java:71) at servicenow.api.RestTableReader.call(RestTableReader.java:6) at servicenow.api.DatePartitionedTableReader.call(DatePartitionedTableReader.java:142) at servicenow.api.DatePartitionedTableReader.call(DatePartitionedTableReader.java:15) at servicenow.datamart.LoaderJob.call(LoaderJob.java:172) at servicenow.datamart.Loader.loadTables(Loader.java:95) at servicenow.datamart.Loader.main(Loader.java:60)
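PostgreSQL rejects text containing the NUL character (U+0000), which some ServiceNow string fields can carry. A minimal pre-processing sketch (not SNDML code; whether SNDML offers such filtering is not confirmed here) that strips NULs before binding a value:

```java
public class NulStrip {
    // PostgreSQL text columns cannot store U+0000; removing it before the
    // INSERT/UPDATE avoids the
    // "invalid byte sequence for encoding UTF8: 0x00" error shown above.
    static String stripNul(String value) {
        return value == null ? null : value.replace("\u0000", "");
    }

    public static void main(String[] args) {
        System.out.println(stripNul("corrupt\u0000value")); // corruptvalue
    }
}
```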
As per the recommendation, we have created the data pump daemon user with a GMT timezone. However, the date-time values in the MSSQL tables are in GMT, while our ServiceNow instance uses the local time zone 'Asia/Singapore'.
We would like to standardize on the local timezone. Is it possible for the data pump to copy the date-time data from ServiceNow in the local timezone rather than in GMT?
Looking at the timezone documentation for the data pump, there does not seem to be an MSSQL equivalent of set time_zone = 'US/Eastern';.
Please advise
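If the loader itself only writes GMT, one workaround is converting the stored values on read or in a post-processing step. A sketch of the conversion using java.time (an illustration only, not a data pump feature):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class TzConvert {
    static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Reinterprets a GMT timestamp string (ServiceNow's storage format)
    // as the equivalent wall-clock time in Asia/Singapore (GMT+8, no DST).
    static String gmtToSingapore(String gmt) {
        LocalDateTime ldt = LocalDateTime.parse(gmt, FMT);
        ZonedDateTime local = ldt.atZone(ZoneId.of("GMT"))
                                 .withZoneSameInstant(ZoneId.of("Asia/Singapore"));
        return local.format(FMT);
    }

    public static void main(String[] args) {
        System.out.println(gmtToSingapore("2023-01-01 00:00:00")); // 2023-01-01 08:00:00
    }
}
```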
MinRows is not working. This was working in 3.2.0 but is broken in 3.4.0.
$ java -ea -jar $jar -t x_mppc2_intake_service_symptom -p xxxprodpg.profile
01:00:57 INFO [main] Globals GLOBAL INIT: loadProperties /home/ec2-user/sndml3/xxx/xxxprodpg.profile
01:00:57 INFO [main] Session GLOBAL INIT: instance=https://xxx.service-now.com/ user=xxxxxxxx
01:00:57 INFO [main] Database GLOBAL INIT: database=jdbc:postgresql://sndm-pg.cwp63xqkanus.us-east-2.rds.amazonaws.com:5432/sndm user=xxxxxx schema=xxx
01:00:57 INFO [main] Generator GLOBAL INIT: dialect=pg schema=xxx namecase=LOWER namequotes=DOUBLE
01:00:57 INFO [main] Database GLOBAL INIT: SET TIME ZONE 'UTC'
Exception in thread "main" java.lang.AssertionError
at servicenow.datamart.Globals.getStart(Globals.java:92)
at servicenow.datamart.DateTimeFactory.getStart(DateTimeFactory.java:52)
at servicenow.datamart.JobConfig.getDefaultRange(JobConfig.java:252)
at servicenow.datamart.JobConfig.getCreated(JobConfig.java:180)
at servicenow.datamart.Job.call(Job.java:60)
at servicenow.datamart.Loader.loadTables(Loader.java:95)
at servicenow.datamart.Loader.main(Loader.java:51)
I am trying to copy the task table using datapump. This time I selected a limited number of columns but still get this error. I would really like datapump to skip the error records and continue instead of failing hard; is it not possible to configure this?
"error":{"message":"Transaction cancelled: maximum execution time exceeded","detail":"Transaction cancelled: maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing"},"status":"failure"}
Exception in thread "DPRUN0001020" java.lang.ClassCastException: class com.fasterxml.jackson.databind.node.TextNode cannot be cast to class com.fasterxml.jackson.databind.node.ObjectNode (com.fasterxml.jackson.databind.node.TextNode and com.fasterxml.jackson.databind.node.ObjectNode are in unnamed module of loader 'app')
at sndml.servicenow.RecordList.&lt;init&gt;(RecordList.java:30)
at sndml.servicenow.RestTableAPI.getRecords(RestTableAPI.java:116)
at sndml.servicenow.RestTableReader.call(RestTableReader.java:76)
at sndml.datamart.JobRunner.runLoad(JobRunner.java:223)
at sndml.datamart.JobRunner.call(JobRunner.java:99)
at sndml.daemon.AppJobRunner.call(AppJobRunner.java:85)
at sndml.daemon.AppJobRunner.run(AppJobRunner.java:60)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Hi,
First of all, thank you for this beautiful software; it has helped a lot.
We have a huge table with 46 million records. I am trying to pull the data, but it times out. Is there any way we could add partition by minute to the list as well? Thanks
Hi,
Thanks for letting me use this product.
I am trying to set up a properties file to connect to the Oracle ADW database and am getting the following error.
Any help will be appreciated.
Properties file:
servicenow.instance=Dev564343
servicenow.username=testsync
servicenow.password=XXXXXXXXXXXXXXXXXXXXXXXX
datamart.url=jdbc:oracle:thin:@snowsyncpdb_medium?TNSADMIN=/home/opc/snowsync/sndml3/Wallet_SNOWSYNCDB/
datamart.username=admin
datamart.password=XXXXXXXXXXXXXXXXXXXXXXXX
Error:
[opc@sarapp sndml3]$ java -jar sndml-3.0.2-ora.jar -p rc.properties -y /home/opc/rcsync/sndml3/RC/99-User/sys_user.yaml
23:29:23 INFO [main] Globals GLOBAL INIT: loadProperties /home/opc/rcsync/sndml3/rc.properties
23:29:23 INFO [main] Session GLOBAL INIT: instance=https://Dev564343.service-now.com/ user=rrreportdb
23:29:23 INFO [main] Database GLOBAL INIT: database=jdbc:oracle:thin:@snowsyncpdb_medium?TNSADMIN=/home/opc/rcsync/sndml3/Wallet_SNOWSYNCDB/ user=admin
Exception in thread "main" java.sql.SQLRecoverableException: IO Error: Invalid connection string format, a valid format is: "host:port:sid"
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:743)
at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:666)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:32)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:566)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at servicenow.datamart.Database.&lt;init&gt;(Database.java:48)
at servicenow.datamart.Loader.main(Loader.java:42)
Caused by: oracle.net.ns.NetException: Invalid connection string format, a valid format is: "host:port:sid"
at oracle.net.resolver.AddrResolution.resolveSimple(AddrResolution.java:554)
at oracle.net.resolver.AddrResolution.resolveAndExecute(AddrResolution.java:482)
at oracle.net.ns.NSProtocol.establishConnection(NSProtocol.java:595)
at oracle.net.ns.NSProtocol.connect(NSProtocol.java:230)
at oracle.jdbc.driver.T4CConnection.connect(T4CConnection.java:1452)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:496)
... 7 more
[opc@sarapp sndml3]$
Thanks
Raj
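One thing worth checking (an assumption based on Oracle's wallet-based connection syntax, not something the log confirms): the Oracle thin driver expects the wallet directory parameter to be spelled TNS_ADMIN, with an underscore, whereas the profile above uses TNSADMIN. Something like:

```properties
datamart.url=jdbc:oracle:thin:@snowsyncpdb_medium?TNS_ADMIN=/home/opc/snowsync/sndml3/Wallet_SNOWSYNCDB/
```

Without a resolvable TNS alias, the driver falls back to parsing the string as host:port:sid, which would produce exactly the "Invalid connection string format" error shown above.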
We get the following error in the logs:
RestTableAPI [main] u_esm_ci_assoc PROCESS: getStats count=228446 query=
Exception in thread "main" com.fasterxml.jackson.core.JsonParseException: Unexpected character ('}' (code 125)): was expecting a colon to separate field name and value
at [Source: (String)"{"result":[{"sys_id":"000a51868779355083d10f6e8bbb35ed","sys_created_on":"2023-10-11 19:47:28","sys_updated_on":"2023-10-11
I'm using this config:
tables:
- source: table_name
action: sync
partition: hour
pagesize: 50
However, I see this request being issued with a 10k limit:
20:09:43 WARN JsonRequest [main] table_name RESPONSE: GET https://<servicnow-url>/api/now/table/<table_name>?sysparm_limit=10000&sysparm_exclude_reference_link=true&sysparm_display_value=false&sysparm_fields=sys_id%2Csys_created_on%2Csys_updated_on&sysparm_query=.....
I see in Synchronizer.java that this limit might be hardcoded, not respecting the pagesize option.
I did not see this same issue when using since: last
Hi,
Thank you for the great work.
It worked with a MySQL database by using the command:
java -jar sndml-3.0.2-mysql.jar -p properties.conf -t task_load.yaml
Is it possible to use an Azure SQL Server instead of MySQL?
What would the datamart.url parameter be:
datamart.url=jdbc:sqlserver://xxx.database.windows.net:1433/database_name ?
If I execute: java -jar sndml-3.0.2.jar -p properties_sql.conf -t sys_user
I get: "Error: Could not find or load main class fully.qualified.MainClass"
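Two observations, both hedged. The "Could not find or load main class" message comes from running the plain sndml-3.0.2.jar; the runnable artifacts elsewhere in these issues are the dialect-specific jars (e.g. -mssql, -mysql). And for Azure SQL, the Microsoft JDBC driver expects the database as a semicolon-delimited property rather than a path segment, along the lines of:

```properties
datamart.url=jdbc:sqlserver://xxx.database.windows.net:1433;database=database_name
```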
So I have been using this to sync key tables for reporting for over 3 years, and it has helped us so much by allowing us to create the advanced reports we need to manage our system. Today I tried to add the kb_knowledge table, as we just started using the Knowledge module in February. When I add the table to the config, it creates the table in my SQL Server and sees how many records are in the table, but then just stops.
Here is one of the other tables that doesn't have an issue:
09:28:16 INFO [main] LoaderJob sc_task INIT: getKeys sys_created_on<2020-04-22 13:20:02^sys_updated_on>=2020-04-20 13:20:02^ORDERBYsys_created_on^ORDERBYsys_id
09:28:16 INFO [main] LoaderJob sc_task INIT: begin load sc_task (13 rows)
09:28:17 INFO [main] DatabaseUpdateWriter sc_task PROCESS: loaded 13 / 13
09:28:17 INFO [main] LoaderJob sc_task FINISH: end load sc_task (13 rows)
Then here is what happens with the kb_knowledge table:
09:28:17 INFO [main] RestTableAPI kb_knowledge PROCESS: getStats count=712 query="sys_created_on<2020-04-22 13:20:02^ORDERBYsys_created_on^ORDERBYsys_id"
09:28:17 INFO [main] LoaderJob kb_knowledge INIT: begin load kb_knowledge (712 rows)
and it just stops there.
I saw you released 3.1.0, so I upgraded and tried that, with the same results.
When loading the cmdb_ci table (approx. 8 million records), it takes 72 hours to sync the table.
We are using following configuration: - {name: cmdb_ci, action: sync, partition: week, pagesize: 1000}
Is there a faster way to sync the data?
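One option worth trying, as a sketch only: the cmdb_ci load example elsewhere in these issues uses a threads parameter to parallelize partitions; whether it combines with action: sync is an assumption, not confirmed here.

```yaml
tables:
- {name: cmdb_ci, action: sync, partition: week, pagesize: 1000, threads: 8}
```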
While attempting to implement the above statement, it does not reflect the deleted records on the target table. Version: sndml3-master. The table on which I'm trying to track record changes is Problem, which has auditing enabled by the system by default.
After an import from a legacy system, we're finding some records have no value for sys_created_on, which causes exceptions in both sndml2 and sndml3.
Exception in thread "main" servicenow.core.InvalidDateTimeException:
at servicenow.core.DateTime.<init>(DateTime.java:76)
at servicenow.core.DateTime.<init>(DateTime.java:57)
at servicenow.rest.RestTableAPI.getStats(RestTableAPI.java:60)
at servicenow.rest.MultiDatePartReader.initialize(MultiDatePartReader.java:79)
at servicenow.datamart.TableLoader.call(TableLoader.java:129)
at servicenow.datamart.Loader.loadTables(Loader.java:94)
at servicenow.datamart.Loader.main(Loader.java:59)
Below is an example SOAP response from debug logging on SNDML2 (not sure how to do debug logging in sndml3 yet; my log4j settings file isn't working)
<sys_class_path>/!!/!$/#^</sys_class_path>
<sys_created_by>SalesForce</sys_created_by>
<sys_created_on />
<sys_domain>f4d05ca0db58434041d8d7795e9619ca</sys_domain>
<sys_domain_path>!!!/!!)/!!#/!!$/</sys_domain_path>
<sys_id>343263c4dba85b00a43689584b9619b1</sys_id>
java -jar -Dlog4j.configuration=file:/home/user/sndml3/log4j.debug sndml-3.0.1-b.4-mssql.jar -p properties.prpty -y tables.yaml
servicenow.instance=https://*****.service-now.com/
servicenow.username=*****
servicenow.password=*****
servicenow.limit=200
servicenow.getkeys_limit=20000
datamart.url=jdbc:sqlserver://*****:1433;database=ServiceNow
datamart.username=*****
datamart.password=*****
datamart.dialect=mssql
datamart.schema=*****
datamart.check_readable=false
metrics: checkpoint_allscripts.txt
tables:
- {source: cmdb_ci, truncate: true, partition: month}
- {source: cmdb_ci_service, truncate: true, partition: month}
log4j.rootLogger=DEBUG, Console
log4j.appender.Console=org.apache.log4j.ConsoleAppender
log4j.appender.Console.layout=org.apache.log4j.PatternLayout
log4j.appender.Console.layout.ConversionPattern=%d{HH:mm:ss} [%t] %p %c %m%n
# This is what I used in v2. Just started with v3 this morning; haven't dug into how logging is different yet.
Does version 3 have the same option?
I ran the process for an Oracle sink. I used the last two template jars for Oracle (sndml 3.0.3 and 3.0.2), and it did not map correctly or pull out any data. I am wondering what the best way to debug this problem is.
Since the release of the Log4j exploit, I have seen that sndml3 also uses Log4j. Is there a plan for how Log4j will be upgraded to 2.16?
Will there be a new release of sndml3?
Getting this error; the job aborts from the app, but in the UI it stays in Running status. Is there a way the record can be skipped so the job continues?
03:58:50 ERROR DatabaseUpdateStatement [DPRUN0001004] DPRUN0001004 PROCESS: bindField time_in_seconds="17763811500"
Exception in thread "DPRUN0001004" java.lang.NumberFormatException: For input string: "17763811500"
at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:67)
at java.base/java.lang.Integer.parseInt(Integer.java:668)
at java.base/java.lang.Integer.parseInt(Integer.java:784)
at sndml.datamart.DatabaseStatement.bindField(DatabaseStatement.java:236)
at sndml.datamart.DatabaseStatement.bindField(DatabaseStatement.java:69)
at sndml.datamart.DatabaseUpdateStatement.update(DatabaseUpdateStatement.java:34)
at sndml.datamart.DatabaseUpdateWriter.writeRecord(DatabaseUpdateWriter.java:34)
at sndml.datamart.DatabaseTableWriter.processRecords(DatabaseTableWriter.java:76)
at sndml.servicenow.RestTableReader.call(RestTableReader.java:80)
at sndml.datamart.JobRunner.runLoad(JobRunner.java:223)
at sndml.datamart.JobRunner.call(JobRunner.java:99)
at sndml.daemon.AppJobRunner.call(AppJobRunner.java:85)
at sndml.daemon.AppJobRunner.run(AppJobRunner.java:60)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
04:00:45 INFO AgentScanner [scanner] main INIT: Nothing ready
04:02:45 INFO AgentScanner [scanner] main INIT: Nothing ready
04:14:43 INFO AgentScanner [scanner] main INIT: Nothing ready
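The failing value 17763811500 exceeds Integer.MAX_VALUE (2147483647), so Integer.parseInt throws; the same string parses fine as a long. A minimal reproduction (not SNDML code) of the NumberFormatException above:

```java
public class DurationParse {
    public static void main(String[] args) {
        String value = "17763811500";
        try {
            Integer.parseInt(value); // throws: value > Integer.MAX_VALUE
        } catch (NumberFormatException e) {
            System.out.println("int overflow for " + value);
        }
        // Parsing as a 64-bit long succeeds
        System.out.println(Long.parseLong(value)); // 17763811500
    }
}
```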
Here is the error:
C:\ServiceNowDatamartLoader\sndml-3.2.0>java -Xmx1048M -jar sndml-3.0.2-mssql.jar -p profile.txt -y cmdb_ci_full.yml
15:00:24 INFO [main] Globals GLOBAL INIT: loadProperties C:\ServiceNowDatamartLoader\sndml-3.2.0\profile.txt
15:00:24 INFO [main] Session GLOBAL INIT: instance=
15:00:24 INFO [main] Database GLOBAL INIT: database=;databaseName=
15:00:25 INFO [main] Generator GLOBAL INIT: dialect=mssql schema=null namecase=LOWER namequotes=SQUARE
15:00:25 INFO [main] LoaderConfig GLOBAL INIT:
metrics: cmdb_ci.metrics
tables:
I was trying to replicate the data of a database view table into the MSSQL database but am getting the following error.
====
05:06:35 ERROR AppJobRunner [DPRUN0001055] DPRUN0001055 ERROR: sndml.daemon.AppJobRunner.call: sndml.servicenow.NoContentException
sndml.servicenow.NoContentException: GET https://dev109575.service-now.com/api/x_108443_sndml/gettableschema/incident_time_worked
HTTP/1.1 404 Not Found
RESPONSE:
at sndml.servicenow.JsonRequest.executeRequest(JsonRequest.java:134) ~[sndml-3.4.4-mssql.jar:?]
at sndml.servicenow.JsonRequest.execute(JsonRequest.java:44) ~[sndml-3.4.4-mssql.jar:?]
at sndml.daemon.AppSchemaFactory.getSchema(AppSchemaFactory.java:30) ~[sndml-3.4.4-mssql.jar:?]
at sndml.servicenow.Session.getSchema(Session.java:197) ~[sndml-3.4.4-mssql.jar:?]
at sndml.servicenow.Table.getSchema(Table.java:115) ~[sndml-3.4.4-mssql.jar:?]
at sndml.datamart.Generator.getCreateTable(Generator.java:262) ~[sndml-3.4.4-mssql.jar:?]
at sndml.datamart.Database.createTable(Database.java:256) ~[sndml-3.4.4-mssql.jar:?]
at sndml.datamart.Database.createMissingTable(Database.java:301) ~[sndml-3.4.4-mssql.jar:?]
at sndml.datamart.JobRunner.runLoad(JobRunner.java:190) ~[sndml-3.4.4-mssql.jar:?]
at sndml.datamart.JobRunner.call(JobRunner.java:99) ~[sndml-3.4.4-mssql.jar:?]
at sndml.daemon.AppJobRunner.call(AppJobRunner.java:85) [sndml-3.4.4-mssql.jar:?]
at sndml.daemon.AppJobRunner.run(AppJobRunner.java:60) [sndml-3.4.4-mssql.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
I have tried the following, but the error remains the same.
When trying to run an upsert job (after an insert job created the table in MSSQL successfully, with sys_id as the first column), I get the error below.
18:28:26 INFO Generator [DPRUN0001006] INIT: dialect=mssql schema=dbo namecase=LOWER namequotes=SQUARE autocommit=false
Exception in thread "DPRUN0001006" java.lang.RuntimeException: expected 'sys_id', found 'priority' in first column of table 'incident'
at sndml.datamart.ColumnDefinitions.&lt;init&gt;(ColumnDefinitions.java:67)
at sndml.datamart.DatabaseTableWriter.open(DatabaseTableWriter.java:51)
at sndml.datamart.DatabaseUpdateWriter.open(DatabaseUpdateWriter.java:24)
at sndml.datamart.DatabaseUpdateWriter.open(DatabaseUpdateWriter.java:12)
at sndml.datamart.JobRunner.runLoad(JobRunner.java:200)
at sndml.datamart.JobRunner.call(JobRunner.java:99)
at sndml.daemon.AppJobRunner.call(AppJobRunner.java:85)
at sndml.daemon.AppJobRunner.run(AppJobRunner.java:60)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
When using a sync configuration combined with a created filter, and no results are returned from ServiceNow, a NullPointerException is thrown because the started and finished metrics are never set.
Note that update works fine and sync works if you remove the partition option.
Example config item
threads: 1
metrics: C:/temp/all_tables.metric
tables:
Error Stack Trace
Exception in thread "main" java.lang.NullPointerException
at servicenow.api.WriterMetrics.getElapsedSec(WriterMetrics.java:46)
at servicenow.api.WriterMetrics.write(WriterMetrics.java:128)
at servicenow.datamart.Loader.writeAllMetrics(Loader.java:109)
at servicenow.datamart.Loader.loadTables(Loader.java:101)
at servicenow.datamart.Loader.main(Loader.java:61)
Log Entries
12:25:40 INFO [main] LoaderConfig GLOBAL INIT:
threads: 1
metrics: C:/temp/all_tables.metric
tables:
While copying a large table, I am getting these warning messages. Do these mean the records are not mirrored? I checked a few and it looked like they were replicated. Can you please advise?
1:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dbec543872f3010a7eac9570cbbxx35b4
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dbf392487570d5x0a7eac9570cbb35d4
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc086dd68705c11019f110e83cbb35fe
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc09381877141101d9f110e83cbb35f1
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc16eb9871e0d10bd1ad32e80cbb3579
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc1b8a787dbc1d0a7eac9570dcbb35dc
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc22e64872389d0a7eac957a0cbb35e3
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc4c3c88753c550a7eac9570cabb35d0
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc4fe9a87777c10bd1d32e80cbxb3576
01:35:21 WARN DatabaseInsertWriter [DPRUN0001001] DPRUN0001001 PROCESS: Failed/Skipped 6dc52ae8872389da0a7eac9570cbb35c9
Getting error:
Exception in thread "main" servicenow.datamart.ResourceException: java.sql.SQLException: No suitable driver found for jdbc:mssql://servername.com/SNOW
running this from the command line:
C:\sndml3-master\sndml-3.1.2-bin\sndml-3.1.2>java -jar sndml-3.1.2-mssql.jar -p Profile.txt -t cmn_location
Profile is:
servicenow.instance=https://company.service-now.com
servicenow.username=snowacct
servicenow.password=snowpwd
datamart.url=jdbc:mssql://10.10.10.10;SNOW
datamart.username=SNOW_Admin
datamart.password=helloWorld
Any help much appreciated...
I have tried changing the SQL server to the FQDN, as well as including the port (i.e. jdbc:mssql://servername.com:1433/SNOW), and get the same error. I've also tried a semicolon before the dbname (i.e. 1433;SNOW), with the same error.
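The "No suitable driver found" message is usually about the URL prefix rather than the host or database: the Microsoft driver registers for jdbc:sqlserver://, not jdbc:mssql://, and takes the database as a semicolon-delimited property, as the working profiles elsewhere in these issues show. A rewrite of the profile line above along those lines:

```properties
datamart.url=jdbc:sqlserver://10.10.10.10:1433;database=SNOW
```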
Does sndml3 support a SQLite database/datasource? If yes, what is the procedure to install sndml3 on a Windows platform to communicate with a SQLite database?
Not sure if this is a bug or a weird data condition; it seemed worth reporting though.
01:39:06 INFO [main] Database customer_contact INIT: truncate table mydomain.customer_contact
01:39:06 INFO [main] Database customer_contact PROCESS: truncate table mydomain.customer_contact
01:39:08 INFO [main] RestTableAPI customer_contact PROCESS: getStats query="" count=731579
01:39:08 INFO [main] TableLoader customer_contact INIT: begin load customer_contact (731579 rows)
01:39:13 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 200 / 731579
01:39:18 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 400 / 731579
[...]
02:51:27 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 171000 / 731579
02:51:32 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 171200 / 731579
02:51:37 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 171400 / 731579
02:51:46 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 171600 / 731579
02:51:51 INFO [main] DatabaseInsertWriter customer_contact PROCESS: loaded 171800 / 731579
Exception in thread "main" com.microsoft.sqlserver.jdbc.SQLServerException: Violation of PRIMARY KEY constraint 'customer_contact_pk'. Cannot insert duplicate key in object 'mydomain.customer_contact'. The duplicate key value is (3b3b1450db90db00d4b85eea4b96198f).
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:217)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1635)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:426)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:372)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:6276)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1794)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:184)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:159)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeUpdate(SQLServerPreparedStatement.java:315)
at servicenow.datamart.DatabaseInsertStatement.insert(DatabaseInsertStatement.java:39)
at servicenow.datamart.DatabaseInsertWriter.writeRecord(DatabaseInsertWriter.java:31)
at servicenow.datamart.DatabaseTableWriter.processRecords(DatabaseTableWriter.java:38)
at servicenow.api.RestTableReader.call(RestTableReader.java:68)
at servicenow.api.RestTableReader.call(RestTableReader.java:8)
at servicenow.datamart.TableLoader.call(TableLoader.java:136)
at servicenow.datamart.Loader.loadTables(Loader.java:94)
at servicenow.datamart.Loader.main(Loader.java:59)
For reference, the record in the error was created and last updated days ago (not in the middle of the load)
curl -X GET \
'https://mydomain.service-now.com/api/now/table/customer_contact/3b3b1450db90db00d4b85eea4b96198f?sysparm_fields=sys_id,sys_created_on,sys_updated_on' \
-H 'Accept: application/json' \
-H 'Authorization: Basic *****' \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/json' \
{"result": {
"sys_id": "3b3b1450db90db00d4b85eea4b96198f",
"sys_created_on": "2018-02-11 05:18:13",
"sys_updated_on": "2018-02-23 18:48:03"
}}
servicenow.instance=https://mydomain.service-now.com/
servicenow.username=*****
servicenow.password=*****
servicenow.limit=200
servicenow.getkeys_limit=20000
datamart.url=jdbc:sqlserver://*****:1433;database=ServiceNow
datamart.username=*****
datamart.password=*****
datamart.dialect=mssql
datamart.schema=mydomain
metrics: checkpoint_mydomain.txt
tables:
- {source: change_request, truncate: true}
- {source: cmdb_ci, truncate: true}
- {source: cmdb_ci_service, truncate: true}
- {source: cmn_location, truncate: true}
- {source: cmn_skill, truncate: true}
- {source: core_company, truncate: true}
- {source: customer_account, truncate: true}
- {source: customer_contact, truncate: true}
- {source: incident, truncate: true}
- {source: kb_knowledge, truncate: true}
- {source: problem, truncate: true}
- {source: sc_cat_item_delivery_plan , truncate: true}
- {source: sc_cat_item_delivery_task, truncate: true}
- {source: sc_request, truncate: true}
- {source: sn_customerservice_case, truncate: true}
- {source: sys_user, truncate: true}
- {source: sys_user_group, truncate: true}
- {source: task, truncate: true}
- {source: u_application_instance, truncate: true}
- {source: u_patient_safety, truncate: true}
- {source: cmdb_ci_appl, truncate: true}
- {source: alm_asset, truncate: true}
- {source: csm_consumer, truncate: true}
- {source: service_entitlement, truncate: true}
- {source: sn_app_cs_social_social_profile, truncate: true}
java -ea -jar sndml-3.0.1-b.7-mssql.jar -p property.prpty -y load.yaml &> 20180227.05.log
Hi,
I load the task table.
My YAML looks like this:
metrics: task.metrics
tables:
- name: task
created: [2015-10-01, 2018-08-31]
truncate: true
partition : month
When I execute it, it creates the table and starts to load, but a lot of records are skipped:
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped bca068eedbc867c04ac4302f9d961919
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped f4a068eedbc867c04ac4302f9d96190b
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped f8a068eedbc867c04ac4302f9d961992
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped a1a0e8eedbc867c04ac4302f9d961914
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped b1a068e2db886b40dde43a4d7c9619cb
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped d6a0e8eedbc867c04ac4302f9d9619ea
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped cba02ceedbc867c04ac4302f9d961991
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped dfa02ceedbc867c04ac4302f9d9619cd
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped dfa02ceedbc867c04ac4302f9d9619e4
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped b7a06ceedbc867c04ac4302f9d96199f
03:09:37 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: Failed/Skipped 14b0e8e2db886b40dde43a4d7c961978
03:09:39 INFO [main] DatabaseInsertWriter task.M2018-08-01 PROCESS: loaded 13874 / 16405 (13874 / 303391)
What does that mean?
Came across this while copying the "task" table.
06:21:08 INFO RestTableAPI [DPRUN0001011] DPRUN0001011 PROCESS: getStats count=173211 query=
06:21:08 INFO AppProgressLogger [DPRUN0001011] DPRUN0001011 INIT: logStart 173211
06:21:08 INFO RestTableReader [DPRUN0001011] DPRUN0001011 INIT: Starting (173211 rows)
06:22:09 ERROR AppJobRunner [DPRUN0001011] DPRUN0001011 ERROR: sndml.daemon.AppJobRunner.call: com.fasterxml.jackson.core.JsonParseException
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('}' (code 125)): was expecting a colon to separate field name and value
at [Source: (StringReader); line: 1, column: 12623778]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:2337) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:710) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:635) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipColon2(ReaderBasedJsonParser.java:2255) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipColon(ReaderBasedJsonParser.java:2186) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextFieldName(ReaderBasedJsonParser.java:944) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.deser.std.BaseNodeDeserializer.deserializeObject(JsonNodeDeserializer.java:269) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.deser.std.JsonNodeDeserializer.deserialize(JsonNodeDeserializer.java:69) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.deser.std.JsonNodeDeserializer.deserialize(JsonNodeDeserializer.java:16) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.ObjectMapper._readTreeAndClose(ObjectMapper.java:4635) ~[sndml-3.4.6-mssql.jar:?]
at com.fasterxml.jackson.databind.ObjectMapper.readTree(ObjectMapper.java:3042) ~[sndml-3.4.6-mssql.jar:?]
at sndml.servicenow.JsonRequest.execute(JsonRequest.java:46) ~[sndml-3.4.6-mssql.jar:?]
at sndml.servicenow.RestTableAPI.getRecords(RestTableAPI.java:113) ~[sndml-3.4.6-mssql.jar:?]
at sndml.servicenow.RestTableReader.call(RestTableReader.java:76) ~[sndml-3.4.6-mssql.jar:?]
at sndml.datamart.JobRunner.runLoad(JobRunner.java:223) ~[sndml-3.4.6-mssql.jar:?]
at sndml.datamart.JobRunner.call(JobRunner.java:99) ~[sndml-3.4.6-mssql.jar:?]
at sndml.daemon.AppJobRunner.call(AppJobRunner.java:85) [sndml-3.4.6-mssql.jar:?]
at sndml.daemon.AppJobRunner.run(AppJobRunner.java:60) [sndml-3.4.6-mssql.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
06:22:09 ERROR AppStatusLogger [DPRUN0001011] DPRUN0001011 PROCESS: logError JsonParseException
06:22:09 ERROR AgentDaemon [DPRUN0001011] GLOBAL FINISH: Aborting the daemon
It seems that source names containing digits ([0-9]) aren't allowed:
if (!Pattern.compile("[a-z_]+").matcher(source).matches())
configError("Invalid source: " + source);
I couldn't see a workaround for this. Is there a reason it isn't supported?
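For illustration, a relaxed check that also permits digits could be as simple as adding 0-9 to the character class. This is only a sketch of the idea, not the project's actual code; the class and method names here are hypothetical:

```java
import java.util.regex.Pattern;

public class SourceNameCheck {
    // Current pattern (as quoted above): rejects any source name containing digits.
    static final Pattern STRICT = Pattern.compile("[a-z_]+");
    // Hypothetical relaxed pattern: also accepts digits 0-9.
    static final Pattern RELAXED = Pattern.compile("[a-z0-9_]+");

    public static void main(String[] args) {
        System.out.println(STRICT.matcher("cmdb_ci").matches());   // true
        System.out.println(STRICT.matcher("u_table2").matches());  // false: digit rejected
        System.out.println(RELAXED.matcher("u_table2").matches()); // true
    }
}
```

Custom ServiceNow tables often end in digits (e.g. `u_table2`), so the stricter pattern rejects names that are otherwise valid.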
19:07:55 INFO [main] Generator GLOBAL INIT: dialect=pg schema=mittest namecase=LOWER namequotes=DOUBLE
19:07:55 INFO [main] Database GLOBAL INIT: SET TIME ZONE 'UTC'
19:07:55 INFO [main] LoaderConfig GLOBAL INIT: metrics: task.metrics tables: - {name: task, action: sync, partition: month}
19:07:57 INFO [main] RestTableAPI task PROCESS: getStats count=677389 query="sys_created_on<2020-01-07 19:07:55" createdRange=[2013-09-20 21:00:42,2020-01-07 19:07:01]
19:07:57 INFO [main] LoaderJob task INIT: partition=DatePartition[interval=MONTH size=77 min=2013-09-01 max=2020-02-01]
19:07:57 INFO [main] LoaderJob task INIT: begin sync task (677389 rows)
19:07:57 INFO [main] Synchronizer task.M2020-01-01 INIT: begin compare
Exception in thread "main" java.lang.AssertionError: duplicate key: 75f6096adb8a0c900b3573e1ba96195a
at servicenow.datamart.Synchronizer.initialize(Synchronizer.java:69)
at servicenow.datamart.Synchronizer.initialize(Synchronizer.java:39)
at servicenow.api.DatePartitionedTableReader.call(DatePartitionedTableReader.java:142)
at servicenow.api.DatePartitionedTableReader.call(DatePartitionedTableReader.java:15)
at servicenow.datamart.LoaderJob.call(LoaderJob.java:126)
at servicenow.datamart.Loader.loadTables(Loader.java:95)
at servicenow.datamart.Loader.main(Loader.java:60)
Is there any option to load domain-separated data into different databases, so that each domain has its own data mart DB?