
etcopydata's People

Contributors

dependabot[bot], eltoroap, eltoroit, j-fischer, js4iot, oscarscholten


etcopydata's Issues

INVALID_QUERY_LOCATOR error during export

When exporting many SObjects that each have many records, it is possible to run into 'INVALID_QUERY_LOCATOR' errors. An explanation of the general problem can be found here:

https://help.salesforce.com/articleView?id=000323582&language=en_US&type=1&mode=1

If you are exporting 20 SObjects with 10,000 records each, you will likely run into this problem. I have implemented a fix in this commit, for which I'd gladly create a pull request, but I would like your feedback on the following:

My fix introduces a limiter on the total number of open connections/queries. It resolves the error in every case, but exports of 50+ SObjects with just a handful of records each are likely to be a bit slower due to the lower concurrency; the slowdown I've measured is around 20%.

So my question is: which do you prefer, a setting to switch the limiter on/off, or the default limiter for each and every project? My two cents: don't make this configurable and use the default limiter for all projects; this makes sure new users don't run into this problem. In our case, the "default" export went from 1 minute to 1 minute 10 seconds, which is not a noticeable difference in my opinion.
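For illustration, a minimal sketch of what such a limiter could look like (this is not the code from the commit; the class and method names are hypothetical):

```typescript
// Hypothetical sketch of a query-concurrency limiter, not the actual commit.
// At most `limit` tasks run at once; the rest wait in a FIFO queue, which
// keeps the number of simultaneously open query locators bounded.
class QueryLimiter {
	private active = 0;
	private waiters: Array<() => void> = [];

	constructor(private readonly limit: number) {}

	public async run<T>(task: () => Promise<T>): Promise<T> {
		if (this.active >= this.limit) {
			// Wait until a running task completes and releases a slot.
			await new Promise<void>((resolve) => this.waiters.push(resolve));
		}
		this.active++;
		try {
			return await task();
		} finally {
			this.active--;
			// Wake up the oldest waiter, if any.
			this.waiters.shift()?.();
		}
	}
}
```

Every query would then be wrapped in `limiter.run(() => ...)`, so a project with 50+ SObjects simply queues the excess queries instead of opening them all at once.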

Cheers, Oscar

Support copy to production and delete destination data in CI/CD environments

Hi @eltoroit ,

I've been playing around with your plugin quite a bit, and I can see its potential as a deployment tool for data-driven configuration. To use it as part of a CI/CD process, however, it would be great to have a way to flag the plugin invocation as a production deployment without getting another prompt on the command line. In addition, there are cases where a complete removal of the configuration records prior to the load might be desired, so it would be good if the following condition could be "overruled" through a command-line flag of some sort.

I understand why those guards are in place, given that this plugin was designed to load training/test data in lower environments. However, would you be open to consider the alternative use case?

I could also see that an explicit deletion on an object level could be useful (clear some object records but not others - see upsert PR), and I'd be happy to look into the PR for that.

Please let me know if you have any questions.

Thanks for considering those changes.

Cheers

// ETCopyData.ts, line 391
if (data.settings.deleteDestination) {
	const msg = "Destination Org can not be production because this app deletes data! (2)";
	Util.writeLog(msg, LogLevel.FATAL);
	Util.throwError(msg);
	reject(msg);
} else { //...

Import Fails: TypeError: Cannot read property 'get' of undefined

I was able to successfully export a number of our custom objects. However, the import blew up:

14:51:49.675Z   126     TRACE   Importer.js:153         [PTDataDestination2] Importing Sobject: [PT__Section__c] (8 of 12)
14:51:49.677Z   127     FATAL   Util.js:46              *** Abort Counter: 1 ***
14:51:49.677Z   128     FATAL   Util.js:47              "TypeError: Cannot read property 'get' of undefined\n    at orgDestination.discovery.getSObjects.get.parents.forEach (C:\\Users\\xxxx\\AppData\\Local\\sfdx\\plugins\\node_modules\\etcopydata\\lib\\@ELTOROIT\\Importer.js:181:86)\n    at Array.forEach (<anonymous>)\n    at records.forEach (C:\\Users\\xxxx\\AppData\\Local\\sfdx\\plugins\\node_modules\\etcopydata\\lib\\@ELTOROIT\\Importer.js:179:82)\n    at Array.forEach (<anonymous>)\n    at orgSource.settings.readFromFile.then (C:\\U
14:51:49.677Z   128.2   FATAL   Util.js:47              sers\\xxxx\\AppData\\Local\\sfdx\\plugins\\node_modules\\etcopydata\\lib\\@ELTOROIT\\Importer.js:177:25)\n    at <anonymous>"
 !    { Error
 !    at Function.throwError
 !    (C:\Users\xxxx\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Util.js:49:15)
 !    at orgSource.settings.readFromFile.then.catch
 !    (C:\Users\xxxx\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:244:47)
 !    at <anonymous>
 !    cause: TypeError: Cannot read property 'get' of undefined
 !    at orgDestination.discovery.getSObjects.get.parents.forEach
 !    (C:\Users\xxxx\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:181:86)
 !    at Array.forEach (<anonymous>)
 !    at records.forEach
 !    (C:\Users\xxxx\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:179:82)
 !    at Array.forEach (<anonymous>)
 !    at orgSource.settings.readFromFile.then
 !    (C:\Users\xxxx\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:177:25)
 !    at <anonymous>,
 !    name: 'Error',
 !    actions: null,
 !    exitCode: -1 }

If you need the export files I will need to send them to you privately.

Provide Support for Setting Bulk API Batch Size

Hi,

Seeing the following error when importing large amounts of data:

ERROR running ETCopyData:import: Failed to read request. Exceeded max size limit of 10000000

Is there any way to set the Bulk API Batch Size?
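As far as I know there is no batch-size flag today. One workaround, sketched below, would be to split the records into fixed-size slices before submitting each slice as its own Bulk API batch; the `batchSize` knob is an assumption for illustration, not an existing ETCopyData option:

```typescript
// Hedged sketch: split records into batches before submitting each one to
// a Bulk API job, so that no single request exceeds Salesforce's size limit.
function chunkRecords<T>(records: T[], batchSize: number): T[][] {
	const batches: T[][] = [];
	for (let i = 0; i < records.length; i += batchSize) {
		batches.push(records.slice(i, i + batchSize));
	}
	return batches;
}
```

Each resulting batch would then be submitted separately, keeping every request body under the 10,000,000-byte limit reported in the error.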

OrgWideEmailAddress not pulled in?

Is there a reason OrgWideEmailAddress data would not be pulled in?

It looks like it's ignored altogether.

Here is what is in my json for it:

    "sObjectsData": [
        {
            "name": "OrgWideEmailAddress",
            "ignoreFields": "OwnerId,CreatedBy,CreatedDate,CurrencyIsoCode",
            "maxRecords": -1,
            "orderBy": "Address",
            "where": null
        }
    ],

Need the ability to copy between orgs with/without namespacing

I love this plugin, it's been so valuable! However, I'm running into an issue where I cannot copy data from a non-namespaced sandbox/scratch org to a namespaced one (as the namespaced org's objects have a prefix). It would be very helpful for ISV development if we could provide a namespace parameter for the orgSource and orgDestination, so that we can copy between non-namespaced/namespaced orgs, or even two orgs with different namespaces. Thanks!
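To illustrate the request, here is a hypothetical helper that remaps custom-object API names between namespaces; neither parameter exists in ETCopyData today, this is only a sketch of the requested behavior:

```typescript
// Hypothetical helper: map a source API name onto a destination namespace,
// e.g. "Invoice__c" -> "myns__Invoice__c". The sourceNs/destNs parameters
// are the proposed (not existing) orgSource/orgDestination namespace options.
function applyNamespace(apiName: string, sourceNs: string, destNs: string): string {
	// Strip the source namespace prefix, if present.
	const bare = sourceNs && apiName.startsWith(sourceNs + "__")
		? apiName.slice(sourceNs.length + 2)
		: apiName;
	// Standard objects are never namespace-prefixed.
	if (!bare.endsWith("__c")) return bare;
	return destNs ? `${destNs}__${bare}` : bare;
}
```

The same mapping would have to be applied to custom field names on each SObject, since those are prefixed as well.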

Beta NPM module is broken - Cannot find module 'lodash/core'

We use the beta channel of ETCopyData in our CI/CD workflows.

Starting between late last night and this morning, we started seeing the following build failures:

There was a problem executing your sfdx command: Error: Cannot find module 'lodash/core'
     [java]     at Function._load (/usr/local/lib/sfdx/node_modules/@salesforce/lazy-require/lib/LazyLoader.js:89:24)
     [java]     at require (/usr/local/lib/sfdx/node_modules/v8-compile-cache/v8-compile-cache.js:161:20)
     [java]     at Object.<anonymous> (~/.local/share/sfdx/node_modules/jsforce/lib/api/analytics.js:8:9)
     [java]     at Module._compile (/usr/local/lib/sfdx/node_modules/v8-compile-cache/v8-compile-cache.js:192:30)
     [java] 
     [java]  Command: sfdx ETCopyData:import -c data/etcopydata -d [email protected] -s [email protected] --loglevel=trace --json

We made no changes to our application code and have reviewed the SFDX CLI release notes (and plugin versions) and the environment of our CI/CD container to make sure the versions haven't changed.

The one change is that the ETCopyData plugin was updated to 0.5.9b:

Installing plugin etcopydata... installed v0.5.9b

The last successful build shows it was 0.5.8b:

Installing plugin etcopydata... installed v0.5.8b

What was in 0.5.9b that could've broken lodash?

Also, I need to point out how sloppy this release process is: there's no git tag to match the NPM version and no GitHub issue to track the work.

Custom Settings record migration

Is there any option to migrate Custom Setting records? I can migrate them using the CLI commands sfdx force:data:tree:export and sfdx force:data:tree:import, but it would be nice to have this option here as well.

Export without specifying destination?

As I mentioned in #4, when we do an export we won't know the destination yet. The imports will be done at a later time against an org that is yet to be created. Is there any reason you don't allow the destination to be blank when running an export (as of version 0.4.3)?

Export process runs indefinitely with "includeAllCustom": true

Last night I tried running an export with "includeAllCustom": true

This morning when I came in the process was still running and there were no output files created.

I ran the command with loglevel=trace and the last message printed out was about schema discovery:

13:43:38.670Z   146     TRACE   SchemaDiscovery.js:229  [PTDataDestination1] Found sObject [MyCustomObject__c].

Our schema is fairly complex (multiple levels deep). Any thoughts on debugging this (or is there a known issue)? My next step will be to start manually adding custom objects to sObjectsData and see how far I can get without it "hanging."

Never ending export when exporting account and opportunity

Hi,

I'm pretty new to Salesforce and am trying to figure out object relationships.
I can't export Opportunity and Account at the same time (I saw a related post about it but it didn't get an answer).
I tried playing with the ignore fields and reference fields but couldn't figure it out.

Here are my JSON file and trace logs:

{
    "now": "2020-03-09T15:16:05.247Z",
    "orgSource": "DevHub",
    "orgDestination": "dreamhouse-org",
    "sObjectsData": [
        {
            "name": "Opportunity",
            "fieldsToExport": "Name",
            "ignoreFields": "CreatedDate,CurrencyIsoCode,OwnerId,CreatedBy",
            "maxRecords": -1,
            "orderBy": "Name",
            "twoPassReferenceFields": "AccountId",
            "where": "Name = 'Banana sells'"
        },
        {
            "name": "Account",
            "fieldsToExport": "Name",
            "ignoreFields": "DandbCompanyId,CreatedDate,CurrencyIsoCode,OwnerId,CreatedBy",
            "maxRecords": -1,
            "orderBy": "Name",
            "twoPassReferenceFields": "SBQQ__DefaultOpportunity__c",
            "where": null
        }
    ],
    "sObjectsMetadata": [],
    "rootFolder": "./ETCopyData",
    "includeAllCustom": false,
    "stopOnErrors": false,
    "ignoreFields": "OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode",
    "maxRecordsEach": null,
    "deleteDestination": true,
    "pollingTimeout": 10000
}

Timestamp # Level Line Number Description
15:16:05.214Z 1 TRACE ETCopyData.js:17 Log level: trace
15:16:05.218Z 2 INFO ETCopyData.js:19 ETCopyData:Full Process Started
15:16:05.219Z 3 TRACE ETCopyData.js:33 Parameter: source [DevHub]
15:16:05.220Z 4 TRACE ETCopyData.js:37 Parameter: destination [dreamhouse-org]
15:16:05.228Z 5 INFO Settings.js:202 Configuration value for [orgSource] read from command line: DevHub
15:16:05.229Z 6 INFO Settings.js:214 Configuration value for [orgDestination] read from command line: dreamhouse-org
15:16:05.230Z 7 INFO Settings.js:289 Configuration value for [rootFolder]: ./ETCopyData
15:16:05.239Z 8 INFO Settings.js:289 Configuration value for [ignoreFields]: OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode
15:16:05.240Z 9 INFO Settings.js:289 Configuration value for [deleteDestination]: true
15:16:05.241Z 10 INFO Settings.js:289 Configuration value for [pollingTimeout]: 10000
15:16:05.242Z 11 INFO Settings.js:225 Configuration value for [sObjectsData]: 2 sObjects found.
15:16:05.243Z 12 INFO Settings.js:232 Configuration value for [sObjectsMetadata]: 0 sObjects found.
15:16:05.260Z 13 TRACE ETCopyData.js:271 Configuration settings read.
15:16:05.461Z 14 INFO OrgManager.js:27 [DevHub] Alias for username: [[email protected]]
15:16:06.954Z 15 TRACE SchemaDiscovery.js:229 [DevHub] Found sObject [Account].
15:16:06.956Z 16 TRACE SchemaDiscovery.js:229 [DevHub] Found sObject [Opportunity].
15:16:07.528Z 17 INFO OrgManager.js:27 [dreamhouse-org] Alias for username: [[email protected]]
15:16:08.577Z 18 TRACE SchemaDiscovery.js:229 [dreamhouse-org] Found sObject [Account].
15:16:08.580Z 19 TRACE SchemaDiscovery.js:229 [dreamhouse-org] Found sObject [Opportunity].
15:16:09.080Z 20 WARN ETCopyData.js:243 There are some field differences between the orgs.
15:16:09.082Z 21 WARN ETCopyData.js:247 If the following fields do exist in the org, then check security (FLS) because they could be hidden, but for now those fields will be ignored.
15:16:09.083Z 22 WARN ETCopyData.js:252 [DevHub] Field [Account.CustomerPriority__c] does not exist in [dreamhouse-org].
15:16:09.091Z 23 WARN ETCopyData.js:252 [DevHub] Field [Account.SLA__c] does not exist in [dreamhouse-org].
15:16:09.093Z 24 WARN ETCopyData.js:252 [DevHub] Field [Account.Active__c] does not exist in [dreamhouse-org].
15:16:09.094Z 25 WARN ETCopyData.js:252 [DevHub] Field [Account.NumberofLocations__c] does not exist in [dreamhouse-org].
15:16:09.095Z 26 WARN ETCopyData.js:252 [DevHub] Field [Account.UpsellOpportunity__c] does not exist in [dreamhouse-org].
15:16:09.096Z 27 WARN ETCopyData.js:252 [DevHub] Field [Account.SLASerialNumber__c] does not exist in [dreamhouse-org].
15:16:09.097Z 28 WARN ETCopyData.js:252 [DevHub] Field [Account.SLAExpirationDate__c] does not exist in [dreamhouse-org].
15:16:09.104Z 29 WARN ETCopyData.js:243 There are some field differences between the orgs.
15:16:09.105Z 30 WARN ETCopyData.js:247 If the following fields do exist in the org, then check security (FLS) because they could be hidden, but for now those fields will be ignored.
15:16:09.107Z 31 WARN ETCopyData.js:252 [DevHub] Field [Opportunity.DeliveryInstallationStatus__c] does not exist in [dreamhouse-org].
15:16:09.108Z 32 WARN ETCopyData.js:252 [DevHub] Field [Opportunity.TrackingNumber__c] does not exist in [dreamhouse-org].
15:16:09.110Z 33 WARN ETCopyData.js:252 [DevHub] Field [Opportunity.OrderNumber__c] does not exist in [dreamhouse-org].
15:16:09.111Z 34 WARN ETCopyData.js:252 [DevHub] Field [Opportunity.CurrentGenerators__c] does not exist in [dreamhouse-org].
15:16:09.112Z 35 WARN ETCopyData.js:252 [DevHub] Field [Opportunity.MainCompetitors__c] does not exist in [dreamhouse-org].
15:16:09.114Z 36 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.CustomerPriority__c] ignored because Org mismatch
15:16:09.119Z 37 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.SLA__c] ignored because Org mismatch
15:16:09.120Z 38 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.Active__c] ignored because Org mismatch
15:16:09.122Z 39 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.NumberofLocations__c] ignored because Org mismatch
15:16:09.123Z 40 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.UpsellOpportunity__c] ignored because Org mismatch
15:16:09.124Z 41 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.SLASerialNumber__c] ignored because Org mismatch
15:16:09.124Z 42 TRACE SchemaDiscovery.js:94 [DevHub] Field [Account.SLAExpirationDate__c] ignored because Org mismatch
15:16:09.125Z 43 TRACE SchemaDiscovery.js:94 [DevHub] Field [Opportunity.DeliveryInstallationStatus__c] ignored because Org mismatch
15:16:09.127Z 44 TRACE SchemaDiscovery.js:94 [DevHub] Field [Opportunity.TrackingNumber__c] ignored because Org mismatch
15:16:09.127Z 45 TRACE SchemaDiscovery.js:94 [DevHub] Field [Opportunity.OrderNumber__c] ignored because Org mismatch
15:16:09.128Z 46 TRACE SchemaDiscovery.js:94 [DevHub] Field [Opportunity.CurrentGenerators__c] ignored because Org mismatch
15:16:09.129Z 47 TRACE SchemaDiscovery.js:94 [DevHub] Field [Opportunity.MainCompetitors__c] ignored because Org mismatch

Regards,
Olivier

Help identifying the circular reference.

I'm having problems finding the cause of the Import Order Deadlock.
It would be great if the commands could output an indication of which fields are causing the conflict.

When specifying --json the only output should be the json results

When I put the --json option in the command, the only output should be JSON. I have an automated system that executes sfdx commands and parses the output (expecting only JSON).

If I don't specify a --loglevel it appears to be giving me TRACE level logging. Is there no way to turn logging completely off?

Also, if I specify --loglevel=error I get no JSON output. All I get is:

ETCopyData:Import... /

^ BTW I don't want the above message to appear at all when I specify --json.

Record Type Not Included?

I just did a simple export/import on account:

{
    "now": "2018-12-05T12:40:35.780Z",
    "orgSource": "PTDataSource1",
    "orgDestination": "PTDataDestination1",
    "sObjectsData": [
    	{
            "name": "Account",
            "ignoreFields": "OwnerId",
            "maxRecords": 20,
            "orderBy": "Name"
        }],
    "sObjectsMetadata": [],
    "rootFolder": "./ETDataFiles",
    "includeAllCustom": false,
    "stopOnErrors": true,
    "ignoreFields": "OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode",
    "maxRecordsEach": -1,
    "deleteDestination": false,
    "pollingTimeout": 100000
}

The export worked but when i went to do an import each account record errored out with something like:

13:08:05.814Z   33      ERROR   Importer.js:229         *** [PTDataDestination1] Error importing [Account] record #13. CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:record type missing for: Account:--

Missing fields from CSV in bulk API

I have the same fields excluded consistently from the CSV import file when using the bulk API

  • I do a comparison of both orgs and the fields are there.
  • Then I export, and the field has data in the JSON.
  • I import using the Bulk API, and those fields are not populated.
  • I've pulled down the request.csv, and those fields are not in the CSV.
  • The same command using the standard API uploads those fields fine.

It appears that if the first record in the sobject__c.json has null or false as the value, then that field is not loaded for ANY record.
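That behavior is consistent with the CSV header being derived from the first record only, although I haven't verified the plugin's serializer. A hedged sketch of a fix (not the plugin's actual code) would build the column list from every record:

```typescript
// Sketch: build the CSV column list from every record, not just the first,
// so that fields that are null/absent on record #1 still make it into the
// header row and get loaded for the records where they do have values.
function csvColumns(records: Array<Record<string, unknown>>): string[] {
	const columns = new Set<string>();
	for (const record of records) {
		for (const field of Object.keys(record)) {
			// Include the field if ANY record carries a usable value for it
			// (false counts as a value; only null/undefined are skipped).
			if (record[field] !== null && record[field] !== undefined) {
				columns.add(field);
			}
		}
	}
	return Array.from(columns);
}
```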

"./node_modules/sfdx-cli/bin/run" can't find the ETCopyData plugin

Hi, I've followed the instructions to run ETCopyData using sfdx. In other words, I installed via npm and then used the suggested import command: "./node_modules/sfdx-cli/bin/run ETCopyData import --configfolder ./xxxx/data --loglevel trace --json"

However, I'm getting a warning stating "ETCopyData import is not a sfdx command". I also noticed I could not detect the installed plugin from sfdx (i.e. ./node_modules/sfdx-cli/bin/run plugins); it returned a "No plugins installed" message. I was able to verify the ETCopyData plugin was installed using "sf plugins".

Update: Fixed the issue by uninstalling the plugin from sf and installing it in sfdx.

Any way to dynamically assign the destination?

For automated testing we are writing scripts to:

  • Create a new scratch org (via sfdx)
  • Install packages into the new org
  • Load data into the new org (this is where ETCopyData hopefully comes in)

The problem I am seeing is that you have to have the source and destination in the ETCopyData.json. It would be great if there were a way to specify the destination on the command line.

Also, I'd like to be able to do an export without knowing what the destination is going to be. For example, I want to manually run the export to get the files ready for some org that won't exist until we are ready to run tests. Then, when we spin up the org (in our script) for testing, we import the data that we have on hand from the previous export. Any thoughts on being able to do something like this?

BTW - your post seems to indicate this is doable but I don't see how:

(https://medium.com/@ElToroIT/etcopydata-dd190f4e85f0)

ETCopyData, which is fully configurable, has several steps: compare, export, delete and import. You can execute these steps at once, or if you prefer, you can export the data from one org, keep the data around and import it into multiple scratch orgs or sandboxes.

PricebookEntry

I'm using the sfdx ETCopyData:full command with the following config, but the PricebookEntry won't get exported.
Any way to fix this?

Thanks

{ "name": "Product2", "ignoreFields": "", "maxRecords": -1, "orderBy": null, "where": null },
{ "name": "Pricebook2", "ignoreFields": "", "maxRecords": -1, "orderBy": null, "where": null },
{ "name": "PricebookEntry", "ignoreFields": "", "maxRecords": -1, "orderBy": null, "where": null }

Exit Code Zero even if there is an ETCopyData Failure

FYI - I posted the same question in the Salesforce DX Chatter group.

I found that the exit code from sfdx is sometimes zero even if there was a failure. I'm using the ETCopyData plugin and I can't be sure if the problem is with the plugin or with sfdx itself. What I know is that in version 6.52.2 of sfdx-cli I get an exit code of "1" when there is an error in ETCopyData. With sfdx-cli 7.14 I get an exit code of zero. Any thoughts?

Timeouts During Import

Since installing version 0.4.3 I have had several occasions where the import times out. It's not always on the same table. Usually after retrying the import a couple of times, it goes through. I never had this happen with the previous version (maybe I was just lucky?).

Here is the output from a couple of the timeouts:

18:12:45.441Z   73      TRACE   Importer.js:155         [PTDataDestination9] Importing Sobject: [PTPay__PGSettings__c] (1 of 13)
18:14:25.889Z   74      FATAL   Util.js:46              *** Abort Counter: 1 ***
18:14:25.890Z   75      FATAL   Util.js:47              "PollingTimeout: Polling time out. Job Id = 750L0000003vQNWIA2 , batch Id = 751L0000003CzxUIAS\n    at Timeout.poll [as _onTimeout] (C:\\Users\\me\\AppData\\Local\\sfdx\\plugins\\node_modules\\jsforce\\lib\\api\\bulk.js:573:17)\n    at ontimeout (timers.js:475:11)\n    at tryOnTimeout (timers.js:310:5)\n    at Timer.listOnTimeout (timers.js:270:5)"
 !    { Error
 !    at Function.throwError
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Util.js:49:15)
 !    at orgDestination.conn.bulk.load
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:216:41)
 !    at C:\Users\me\AppData\Local\sfdx\plugins\node_modules\jsforce\lib\api\bulk.js:511:9
 !    at _combinedTickCallback (internal/process/next_tick.js:131:7)
 !    at process._tickCallback (internal/process/next_tick.js:180:9)
 !    cause:
 !    { PollingTimeout: Polling time out. Job Id = 750L0000003vQNWIA2 , batch Id = 751L0000003CzxUIAS
 !    at Timeout.poll [as _onTimeout]
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\jsforce\lib\api\bulk.js:573:17)
 !    at ontimeout (timers.js:475:11)
 !    at tryOnTimeout (timers.js:310:5)
 !    at Timer.listOnTimeout (timers.js:270:5)
 !    name: 'PollingTimeout',
 !    jobId: '750L0000003vQNWIA2',
 !    batchId: '751L0000003CzxUIAS' },
 !    name: 'Error',
 !    actions: null,
 !    exitCode: -1 }
 
 18:16:53.924Z   82      TRACE   Importer.js:155         [PTDataDestination9] Importing Sobject: [PTD__OSettings__c] (4 of 13)
18:18:35.107Z   83      FATAL   Util.js:46              *** Abort Counter: 1 ***
18:18:35.109Z   84      FATAL   Util.js:47              "PollingTimeout: Polling time out. Job Id = 750L0000003vQNqIAM , batch Id = 751L0000003CzxoIAC\n    at Timeout.poll [as _onTimeout] (C:\\Users\\me\\AppData\\Local\\sfdx\\plugins\\node_modules\\jsforce\\lib\\api\\bulk.js:573:17)\n    at ontimeout (timers.js:475:11)\n    at tryOnTimeout (timers.js:310:5)\n    at Timer.listOnTimeout (timers.js:270:5)"
 !    { Error
 !    at Function.throwError
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Util.js:49:15)
 !    at orgDestination.conn.bulk.load
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\etcopydata\lib\@ELTOROIT\Importer.js:216:41)
 !    at C:\Users\me\AppData\Local\sfdx\plugins\node_modules\jsforce\lib\api\bulk.js:511:9
 !    at _combinedTickCallback (internal/process/next_tick.js:131:7)
 !    at process._tickCallback (internal/process/next_tick.js:180:9)
 !    cause:
 !    { PollingTimeout: Polling time out. Job Id = 750L0000003vQNqIAM , batch Id = 751L0000003CzxoIAC
 !    at Timeout.poll [as _onTimeout]
 !    (C:\Users\me\AppData\Local\sfdx\plugins\node_modules\jsforce\lib\api\bulk.js:573:17)
 !    at ontimeout (timers.js:475:11)
 !    at tryOnTimeout (timers.js:310:5)
 !    at Timer.listOnTimeout (timers.js:270:5)
 !    name: 'PollingTimeout',
 !    jobId: '750L0000003vQNqIAM',
 !    batchId: '751L0000003CzxoIAC' },
 !    name: 'Error',
 !    actions: null,
 !    exitCode: -1 }

Import without Authenticating the orgSource?

Sort of related to #11, in our use case the code executing our imports may not have access to the org specified as the orgSource. If I try to import and do not have access to the orgSource I get an error like:

No AuthInfo found for name SourceData

Is there a reason you try to auth the orgSource during import? If there is a reason, would it be possible to have an option to forego that auth?

Again, in our use case we know both the source and destination will have the same schema.

ERROR: DELETE_FAILED: "Your attempt to delete Sample Account for Entitlements could not be completed because it is associated with the following entitlements.: Sample Entitlement"

I discovered that if your Dev Hub is a Spring '21 pre-release version, you get this error when loading the data if you want to delete existing data first.

DELETE_FAILED: "Your attempt to delete Sample Account for Entitlements could not be completed because it is associated with the following entitlements.: Sample Entitlement"

The problem is that Salesforce is now creating an entitlement record with a related account record as explained in this knowledge article.

The workaround is to clear the entitlements before the data is loaded. You can do this, using this trick:

  1. Create a new file named Entitlement.json with these contents:
{
    "fetched": 0,
    "records": [],
    "total": 0
}
  2. Modify the file named ETCopyData.json to include the Entitlement standard sObject. Under sObjectsData, add this line:
{ "name": "Entitlement" }

Copying data from a managed package custom object

Hello,

First of all, congratulation, this plugin is awesome. Simple to configure. Very Quick execution.

However, I am facing a problem when trying to copy a specific object (a managed package one) from source to target.

The issue occurs on export for the moment (I replaced sensitive information with "<>" and "..."):

{
"status": 1,
"name": "SfdxError",
"message": "Requested sObject [] was not found in the Org",
"exitCode": 1,
"commandName": "Export",
"stack": "SfdxError: Requested sObject [] was not found in the Org\n at Function.wrap (.../sfdxError.js:149:20)\n at Export.catch (.../ETCopyData/node_modules/@salesforce/command/lib/sfdxCommand.js:269:67)\n at process._tickCallback (internal/process/next_tick.js:68:7)",
"warnings": []
}

Also, I can confirm that my users have access to these objects in both source and target, as I can query them with the Developer Console.

Thank you.

Regards,
Franck

TypeError: Cannot read properties of undefined (reading 'prototype')

Hi, thanks for such an amazing tool. We have been using it for a while. However, very recently we have not been able to successfully execute any import or export operation because of this error.

I have sfdx and this plugin fully updated; the etcopydata version is 2.1.1.

% sfdx ETCopyData import -c ETCopyData\DigitalProducts -s OUAT -d ODev --loglevel debug
TypeError: Cannot read properties of undefined (reading 'prototype')

Any idea on how to rectify this, please?

Including 'Opportunity' object results in never-ending export

Hi,

Great plugin, great potential. Wow.

But... I'm trying to export Opportunity, and the export hangs as soon as I include the Opportunity object alongside anything else. If I've got only the Opportunity object included, then it exports just fine.

But when I add e.g. Account (it can be anything), it hangs and takes full CPU.

{
    "now": "2019-06-18T23:34:16.390Z",
    "orgSource": "xxxx--core2",
    "orgDestination": "xxxx--dev1",
    "sObjectsData": [
        {
            "name": "Account",
            "ignoreFields": "",
            "maxRecords": 500,
            "orderBy": null,
            "where": null
        },
        {
            "name": "Opportunity",
            "ignoreFields": "Pricebook2Id,TF_NumberOfBlankRoles__c",
            "maxRecords": 1,
            "orderBy": null,
            "where": null
        }
    ],
    "sObjectsMetadata": [],
    "rootFolder": "./ETData",
    "includeAllCustom": false,
    "stopOnErrors": true,
    "ignoreFields": null,
    "maxRecordsEach": -1,
    "deleteDestination": false,
    "pollingTimeout": 100000
}


ltm4:EtCopyData jburgers$ sfdx ETCopyData:export --loglevel=trace
WARNING: apiVersion configuration overridden at "45.0"
Timestamp	#	Level	Line Number	Description
23:34:16.362Z	1	TRACE	ETCopyData:42       	Log level: trace
23:34:16.364Z	2	INFO	ETCopyData:44       	ETCopyData:Export Process Started
23:34:16.385Z	3	INFO	Settings:402        	Configuration value for [orgSource]: xxxx--core2
23:34:16.386Z	4	INFO	Settings:402        	Configuration value for [orgDestination]: xxx--dev1
23:34:16.386Z	5	INFO	Settings:402        	Configuration value for [stopOnErrors]: true
23:34:16.387Z	6	INFO	Settings:402        	Configuration value for [rootFolder]: ./ETData
23:34:16.387Z	7	INFO	Settings:402        	Configuration value for [maxRecordsEach]: -1
23:34:16.387Z	8	INFO	Settings:402        	Configuration value for [pollingTimeout]: 100000
23:34:16.387Z	9	INFO	Settings:311        	Configuration value for [sObjectsData]: 2 sObjects found.
23:34:16.388Z	10	INFO	Settings:321        	Configuration value for [sObjectsMetadata]: 0 sObjects found.
23:34:16.392Z	11	TRACE	ETCopyData:318      	Configuration settings read.
23:34:16.454Z	12	INFO	OrgManager:34       	[xxxx--core2] Alias for username: [[email protected]]
23:34:17.725Z	13	TRACE	SchemaDiscovery:262 	[xxxx--core2] Found sObject [Account].
23:34:17.727Z	14	TRACE	SchemaDiscovery:262 	[xxxx--core2] Found sObject [Opportunity].
23:34:19.087Z	15	INFO	OrgManager:34       	[xxxx--dev1] Alias for username: [[email protected]]
23:34:20.189Z	16	TRACE	SchemaDiscovery:262 	[xxxx--dev1] Found sObject [Account].
23:34:20.191Z	17	TRACE	SchemaDiscovery:262 	[xxxx--dev1] Found sObject [Opportunity].
23:34:21.650Z	18	WARN	ETCopyData:289      	There are some *field* differences between the orgs.
23:34:21.652Z	19	WARN	ETCopyData:293      	If the following fields do exist in the org, then check security (FLS) because they could be hidden, but for now those fields will be ignored.
23:34:21.652Z	20	WARN	ETCopyData:298      	[xxxx--core2] Field [Account.OwnershipChangeStatus__c] does not exist in [xxxx--dev1].
23:34:21.653Z	21	TRACE	SchemaDiscovery:106 	[xxxx--core2] Field [Account.OwnershipChangeStatus__c] ignored because Org mismatch
```


Regards,

Jeroen

Add support for circular and self-references

Hi,

We've been evaluating ETCopyData for the development of our application and it seems to be a good fit! Two features that seem to be missing are the ability to import SObjects that have potentially circular references, and support for references within the same SObject that form a tree. In our data model we have several cases where SObject A has a reference to SObject B, and B has a reference back to A. We've not been able to import these types of fields.

Can you confirm that this is indeed not possible?

We're happy to contribute an initial feature to support this. Ideally, ETCopyData would detect these types of relationships automatically, but that could be quite expensive, so preferably you would run that analysis once and then configure the result. For now, we propose implementing just the configuration step, where the user lists the fields that are circular or self-referencing. ETCopyData will then first load all SObjects with all configured fields except those "special" reference fields, and once all SObjects are loaded, ETCopyData will upsert the relevant records to restore the circular and self-referencing relationships.

We propose the configuration field "twoPassReferenceFields".
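As a sketch of what that might look like in ETCopyData.json (the `twoPassReferenceFields` key is the proposal; the SObject and field names are made up for illustration):

```json
{
    "sObjectsData": [
        {
            "name": "SObjectA__c",
            "ignoreFields": "",
            "maxRecords": -1,
            "orderBy": null,
            "where": null,
            "twoPassReferenceFields": "PartnerB__c"
        },
        {
            "name": "SObjectB__c",
            "ignoreFields": "",
            "maxRecords": -1,
            "orderBy": null,
            "where": null,
            "twoPassReferenceFields": "PartnerA__c"
        }
    ]
}
```

On import, both SObjects would first be inserted with `PartnerA__c`/`PartnerB__c` left null, and a second pass would update those fields once the ID mappings for both sides are known.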

One possible future extension is to also support reference fields that are "required". The proposed solution does not support required reference fields as it first loads the fields with a "null" value.

Another future improvement is to optimize self-referencing fields that do not result in any cycles, these can be loaded in one step by first sorting the records in-memory and inserting them in the correct order.

Before we start on this work, do you have any feedback for us?

Cheers, Oscar Scholten

Export only relevant descendants

Hello,

This seems like a great plugin! But I'm aiming to export only relevant descendants and I'm not sure if that's possible. Example: I'd like to export one specific Account by its ID. But then, if I want to export only the Opportunities related to that Account, I don't see any configuration option for it. Of course, I could place `AccountId = ` in the Opportunity `where` clause, but that only works for the second level. What would I do if I wanted to export a third level (e.g. OpportunityContactRoles)? Am I missing something, or is it simply not supported yet?
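One partial workaround sketch, in case it helps: SOQL supports semi-joins, so a deeper level's `where` clause can filter on a query over its parent level. The Account ID below is a placeholder:

```json
{
    "name": "OpportunityContactRole",
    "ignoreFields": "",
    "maxRecords": -1,
    "orderBy": null,
    "where": "OpportunityId IN (SELECT Id FROM Opportunity WHERE AccountId = '<your-account-id>')"
}
```

Note that Salesforce does not allow nesting a semi-join inside another semi-join, so this does not generalize to arbitrary depth.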

Many thanks!

Not exporting master-detail when sObject is not enabled for Search

I'm trying to export two objects:

dca_mass_action__Mass_Action_Configuration__c and dca_mass_action__Mass_Action_Mapping__c; the latter has a master-detail relationship to the former.

The records from dca_mass_action__Mass_Action_Configuration__c export fine, but none of the child records are exported. No data export appears in the trace, nor is the file created.

When I only have dca_mass_action__Mass_Action_Mapping__c in my ETCopyData.json, it errors out: `ERROR running Export: Cannot read property 'EXPORT' of undefined`

This is the ETCopyData.json

```json
{
    "now": "2019-07-01T15:48:02.190Z",
    "orgSource": "imcd--dev1",
    "orgDestination": "imcd--uat",
    "sObjectsData": [
        {
            "name": "dca_mass_action__Mass_Action_Configuration__c",
            "ignoreFields": "",
            "maxRecords": -1,
            "orderBy": null,
            "where": "Name LIKE 'Report Incorrect%'"
        },
        {
            "name": "dca_mass_action__Mass_Action_Mapping__c",
            "ignoreFields": "",
            "maxRecords": -1,
            "orderBy": null,
            "where": null
        }
    ],
    "sObjectsMetadata": [],
    "rootFolder": "./",
    "includeAllCustom": false,
    "stopOnErrors": true,
    "ignoreFields": null,
    "maxRecordsEach": -1,
    "deleteDestination": false,
    "pollingTimeout": 100000
}
```

Trace:

15:48:02.164Z  1     TRACE  ETCopyData:42         Log level: trace
15:48:02.165Z  2     INFO   ETCopyData:44         ETCopyData:Export Process Started
15:48:02.186Z  3     INFO   Settings:402          Configuration value for [orgSource]: imcd--dev1
15:48:02.186Z  4     INFO   Settings:402          Configuration value for [orgDestination]: imcd--uat
15:48:02.187Z  5     INFO   Settings:402          Configuration value for [stopOnErrors]: true
15:48:02.187Z  6     INFO   Settings:402          Configuration value for [rootFolder]: ./
15:48:02.187Z  7     INFO   Settings:402          Configuration value for [maxRecordsEach]: -1
15:48:02.187Z  8     INFO   Settings:402          Configuration value for [pollingTimeout]: 100000
15:48:02.188Z  9     INFO   Settings:311          Configuration value for [sObjectsData]: 2 sObjects found.
15:48:02.188Z  10    INFO   Settings:321          Configuration value for [sObjectsMetadata]: 0 sObjects found.
15:48:02.192Z  11    TRACE  ETCopyData:318        Configuration settings read.
15:48:02.249Z  12    INFO   OrgManager:34         [imcd--dev1] Alias for username: [[email protected]]
15:48:05.319Z  13    TRACE  SchemaDiscovery:262   [imcd--dev1] Found sObject [dca_mass_action__Mass_Action_Configuration__c].
15:48:05.747Z  14    INFO   OrgManager:34         [imcd--uat] Alias for username: [[email protected]]
15:48:06.946Z  15    TRACE  SchemaDiscovery:262   [imcd--uat] Found sObject [dca_mass_action__Mass_Action_Configuration__c].
15:48:07.268Z  16    TRACE  Exporter:39           [imcd--dev1] Querying Data sObject [dca_mass_action__Mass_Action_Configuration__c]
15:48:07.269Z  17    TRACE  Exporter:134          [imcd--dev1] Querying [dca_mass_action__Mass_Action_Configuration__c] with SOQL: [SELECT CurrencyIsoCode,Id,Name,dca_mass_action__Active__c,dca_mass_action__Batch_Size__c,dca_mass_action__Description__c,dca_mass_action__DeveloperName__c,dca_mass_action__Last_Run_Completed_Date__c,dca_mass_action__Last_Run_Completed_With_Errors__c,dca_mass_action__Named_Credential__c,dca_mass_action__Schedule_Cron__c,dca_mass_action__Schedule_DayOfMonth__c,dca_mass_action__Schedule_DayOfWeek__c,dca_mass_action__S
15:48:07.269Z  17.2  TRACE  Exporter:134          chedule_Frequency__c,dca_mass_action__Schedule_HourOfDay__c,dca_mass_action__Schedule_MinuteOfHour__c,dca_mass_action__Schedule_MonthOfYear__c,dca_mass_action__Schedule_SecondOfMinute__c,dca_mass_action__Source_Apex_Class__c,dca_mass_action__Source_List_View_ID__c,dca_mass_action__Source_Report_Column_Name__c,dca_mass_action__Source_Report_ID__c,dca_mass_action__Source_SOQL_Query__c,dca_mass_action__Source_Type__c,dca_mass_action__Target_Action_Name__c,dca_mass_action__Target_Apex_Script__c,dca_
15:48:07.269Z  17.3  TRACE  Exporter:134          mass_action__Target_SObject_Type__c,dca_mass_action__Target_Type__c FROM dca_mass_action__Mass_Action_Configuration__c WHERE Name LIKE 'Report Incorrect%' ]
15:48:07.543Z  18    INFO   Exporter:88           [imcd--dev1] Queried [dca_mass_action__Mass_Action_Configuration__c], retrieved 13 records
jburgers-ltm4:MASConfigurations jburgers$

When I use the compare, it doesn't seem to 'compare' the detail object?

jburgers-ltm4:MASConfigurations jburgers$ sfdx ETCopyData:compare --loglevel=trace
WARNING: apiVersion configuration overridden at "45.0"
Timestamp	#	Level	Line Number	Description
15:58:50.856Z	1	TRACE	ETCopyData:42       	Log level: trace
15:58:50.857Z	2	INFO	ETCopyData:44       	ETCopyData:Compare Process Started
15:58:50.876Z	3	INFO	Settings:402        	Configuration value for [orgSource]: imcd--dev1
15:58:50.876Z	4	INFO	Settings:402        	Configuration value for [orgDestination]: imcd--uat
15:58:50.877Z	5	INFO	Settings:402        	Configuration value for [stopOnErrors]: true
15:58:50.877Z	6	INFO	Settings:402        	Configuration value for [rootFolder]: ./
15:58:50.878Z	7	INFO	Settings:402        	Configuration value for [maxRecordsEach]: -1
15:58:50.878Z	8	INFO	Settings:402        	Configuration value for [pollingTimeout]: 100000
15:58:50.878Z	9	INFO	Settings:311        	Configuration value for [sObjectsData]: 2 sObjects found.
15:58:50.879Z	10	INFO	Settings:321        	Configuration value for [sObjectsMetadata]: 0 sObjects found.
15:58:50.882Z	11	TRACE	ETCopyData:318      	Configuration settings read.
15:58:50.938Z	12	INFO	OrgManager:34       	[imcd--dev1] Alias for username: [[email protected]]
15:58:52.100Z	13	TRACE	SchemaDiscovery:262 	[imcd--dev1] Found sObject [dca_mass_action__Mass_Action_Configuration__c].
15:58:52.469Z	14	INFO	OrgManager:34       	[imcd--uat] Alias for username: [[email protected]]
15:58:53.665Z	15	TRACE	SchemaDiscovery:262 	[imcd--uat] Found sObject [dca_mass_action__Mass_Action_Configuration__c].

If I query the object directly, it does return the data:

jburgers-ltm4:MASConfigurations jburgers$ sfdx force:data:soql:query -q "SELECT Id, Name FROM dca_mass_action__Mass_Action_Mapping__c" -u imcd--uat
WARNING: apiVersion configuration overridden at "45.0"
ID                  NAME
──────────────────  ──────────
a5R0Q0000009ZzMUAU  MAM-000064
a5R0Q0000009ZzNUAU  MAM-000065
a5R0Q0000009ZzOUAU  MAM-000066
a5R0Q0000009ZzPUAU  MAM-000067
a5R0Q0000009ZzQUAU  MAM-000068
a5R0Q0000009ZzRUAU  MAM-000069
a5R0Q0000009ZjaUAE  MAM-000002
a5R0Q0000009ZjbUAE  MAM-000003
a5R0Q0000009XM4UAM  MAM-000001
a5R0Q0000009fAZUAY  MAM-000071

ETCopyData:full exits unexpectedly after "sObjects should be processed in this order:"

I'm receiving this error when trying to run ETCopyData:full:

07:46:40.801Z 63 TRACE Importer.js:69 sObjects should be processed in this order:

This happens whether there are 5 custom sObjects (with relationships) or a single object with no relationships!

Here is the config file:

```json
{
    "now": "2019-07-31T07:49:30.982Z",
    "orgSource": "[email protected]",
    "orgDestination": "ecoenergy_scratch",
    "sObjectsData": [
        {
            "name": "District__c",
            "ignoreFields": "OwnerId,CreatedBy,CreatedDate,CurrencyIsoCode,LastModifiedBy,LastModifiedDate",
            "maxRecords": -1,
            "orderBy": null,
            "where": null
        }
    ],
    "sObjectsMetadata": [
        {
            "name": "District__c",
            "fieldsToExport": "Id,Name",
            "matchBy": "Name",
            "orderBy": null,
            "where": null
        }
    ],
    "rootFolder": "sfdx-data",
    "includeAllCustom": false,
    "stopOnErrors": true,
    "ignoreFields": "OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode, LastModifiedBy, LastModifiedDate",
    "maxRecordsEach": null,
    "deleteDestination": true,
    "pollingTimeout": 100000
}
```

And the output:

Timestamp      #   Level  Line Number           Description
─────────────  ──  ─────  ────────────────────  ──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
07:49:30.971Z  1   TRACE  ETCopyData.js:17      Log level: TRACE
07:49:30.974Z  2   INFO   ETCopyData.js:19      ETCopyData:Full Process Started
07:49:30.977Z  3   INFO   Settings.js:289       Configuration value for [orgSource]: [email protected]
07:49:30.977Z  4   INFO   Settings.js:289       Configuration value for [orgDestination]: ecoenergy_scratch
07:49:30.978Z  5   INFO   Settings.js:289       Configuration value for [stopOnErrors]: true
07:49:30.978Z  6   INFO   Settings.js:289       Configuration value for [rootFolder]: sfdx-data
07:49:30.978Z  7   INFO   Settings.js:289       Configuration value for [ignoreFields]: OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode, LastModifiedBy, LastModifiedDate
07:49:30.978Z  8   INFO   Settings.js:289       Configuration value for [deleteDestination]: true
07:49:30.978Z  9   INFO   Settings.js:289       Configuration value for [pollingTimeout]: 100000
07:49:30.979Z  10  INFO   Settings.js:225       Configuration value for [sObjectsData]: 1 sObjects found.
07:49:30.979Z  11  INFO   Settings.js:232       Configuration value for [sObjectsMetadata]: 1 sObjects found.
07:49:30.984Z  12  TRACE  ETCopyData.js:271     Configuration settings read.
07:49:32.768Z  13  INFO   OrgManager.js:27      [[email protected]] Alias for username: [[email protected]]
07:49:36.967Z  14  INFO   OrgManager.js:27      [ecoenergy_scratch] Alias for username: [[email protected]]
07:49:40.125Z  15  TRACE  Exporter.js:45        [ecoenergy_scratch] Querying Metadata sObject [District__c]
07:49:40.126Z  16  TRACE  Exporter.js:111       [ecoenergy_scratch] Querying [District__c] with SOQL: [SELECT Id,Name FROM District__c ]
07:49:41.402Z  17  INFO   Exporter.js:69        [ecoenergy_scratch] Queried [District__c], retrieved 0 records
07:49:41.412Z  18  TRACE  Exporter.js:45        [[email protected]] Querying Metadata sObject [District__c]
07:49:41.412Z  19  TRACE  Exporter.js:111       [[email protected]] Querying [District__c] with SOQL: [SELECT Id,Name FROM District__c ]
07:49:42.671Z  20  INFO   Exporter.js:69        [[email protected]] Queried [District__c], retrieved 15 records
07:49:42.675Z  21  TRACE  Importer.js:69        sObjects should be processed in this order:

Is this a configuration issue (my fault) or a bug?

Importing business and person accounts

I'm trying to solve this problem:
The account data I want to import includes both business and person accounts.
When I export Account data and then re-import it, the import tries to insert a person account using the Name field, which is not supported for DML on person accounts.
Is there a way around this?
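Not a proper fix, but a possible workaround sketch: when Person Accounts are enabled, Account exposes an `IsPersonAccount` field, so the two record kinds could be handled in separate runs. For the person-account run, `Name` would need to be ignored, since it is read-only there (composed from `FirstName`/`LastName`); whether ETCopyData picks up `FirstName`/`LastName` automatically is an assumption:

```json
{
    "name": "Account",
    "ignoreFields": "Name",
    "maxRecords": -1,
    "orderBy": null,
    "where": "IsPersonAccount = true"
}
```

A second run with `"where": "IsPersonAccount = false"` and an empty `ignoreFields` would then cover the business accounts.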

Installation dependency warnings

During install several dependency issues come up:

> sfdx plugins:install etcopydata
This plugin is not digitally signed and its authenticity cannot be verified. Continue installation y/n?: y
Finished digital signature check.
warning etcopydata > @salesforce/command > @oclif/test > fancy-test > @types/[email protected]: This is a stub types definition. nock provides its own type definitions, so you do not need this installed.
warning etcopydata > @salesforce/command > @salesforce/core > @salesforce/ts-sinon > sinon > @sinonjs/formatio > [email protected]: This package has been deprecated in favour of @sinonjs/samsam
warning "etcopydata > @oclif/[email protected]" has unmet peer dependency "@oclif/plugin-help@^2".
Installing plugin etcopydata... installed v0.4.4

Run on Windows 10 from within VSCode's terminal using PowerShell.

Subsequent runs directly in PowerShell only repeat the unmet peer dependency warning:

> sfdx plugins:install etcopydata
This plugin is not digitally signed and its authenticity cannot be verified. Continue installation y/n?: y
Finished digital signature check.
warning "etcopydata > @oclif/[email protected]" has unmet peer dependency "@oclif/plugin-help@^2".
Installing plugin etcopydata... installed v0.4.4

config.plugins.filter is not a function

Hi, I've been using this plugin without issue before. However, when I recently tried to execute the Export or Import command, I encountered this message:

```
{
  "status": 1,
  "name": "Type",
  "message": "config.plugins.filter is not a function",
  "exitCode": 1,
  "context": "Export",
  "stack": "Type: config.plugins.filter is not a function ....."
}
(node:16448) Warning: Deprecated config name: apiVersion. Please use org-api-version instead.
(Use node --trace-warnings ... to show where the warning was created)
```

Please note, I used v2.1.1

Thanks a lot.

Copying more than 10 000 rows

Hello,

Here's another issue I'm not sure how to deal with:

Is there a way to insert more than 10,000 rows, either at once or in several steps, while keeping all the lookups correctly set? Currently the import breaks down once it hits the 10,000-record limit.

For context, I need to insert more than 10,000 rows in one go because I'm trying to update an entire hierarchy (in my case, the Account object).

If I split the records into separate batches, I'm afraid the lookups won't be set correctly: some of the accounts would already be stored in Salesforce under new IDs, while others are still held in the run's in-memory cache under their source IDs...
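One mitigation sketch, assuming ETCopyData accepts the same SObject more than once in `sObjectsData` (which may not be the case): slice the hierarchy level by level via `where` clauses on `ParentId`, so each slice stays under the limit. The caveat raised above still applies, though: lookups that cross slices would reference source IDs that a later run can no longer map.

```json
[
    {
        "name": "Account",
        "ignoreFields": "",
        "maxRecords": -1,
        "orderBy": null,
        "where": "ParentId = null"
    },
    {
        "name": "Account",
        "ignoreFields": "",
        "maxRecords": -1,
        "orderBy": null,
        "where": "ParentId != null AND Parent.ParentId = null"
    }
]
```

Each entry exports one level of the hierarchy (roots first, then their direct children); deeper levels would extend the `Parent.ParentId` chain.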
