capgemini / xrm-datamigration
Export and import data for Microsoft Dataverse. Supports JSON and CSV.
License: MIT License
Is your feature request related to a problem? Please describe.
Use the newer Client API for newer features.
Describe the solution you'd like
Use: https://github.com/microsoft/PowerPlatform-DataverseServiceClient
Description
I'm trying to export data from several tables, and for one of the tables I added a FetchXML filter.
If the flag "OnlyActiveRecords" is set to true, the export process ignores my filter and exports all the active records from all the tables described in the schema file.
If I set that flag to false, the tool fails when it tries to export data for the filtered entity. Log:
23-Jun-2022 00:01:57 - Verbose:CrmFileDataExporter GetProcessors started
23-Jun-2022 00:01:57 - Verbose:CrmFileDataImporter GetProcessors finished
23-Jun-2022 00:01:57 - Info:GenericDataMigrator MigrateData started
23-Jun-2022 00:01:57 - Info:GenericDataMigrator PerformMigratePass started, passNo:1
23-Jun-2022 00:01:57 - Info:DataFileStoreWriter Reset performed
23-Jun-2022 00:01:57 - Verbose:DataCrmStoreReader Reset performed
23-Jun-2022 00:01:57 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:0, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_cannedmessage, page:1, query:0, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:1, page:0, totalCount:10
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:1 entities:10 FirstEntity:msdyn_cannedmessage
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:1
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:1 entities:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:1, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_msdyn_cannedmessage_liveworkstream, page:1, query:1, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:2, page:0, totalCount:20
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:2 entities:10 msdyn_msdyn_cannedmessage_liveworkstream
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:2
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:2 entities:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:2, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_msdyn_cannedmessage_msdyn_octag, page:1, query:2, retrievedCount:4, totalEntityCount:4
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:3, page:0, totalCount:24
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:3 entities:4 msdyn_msdyn_cannedmessage_msdyn_octag
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:4, batchNo:3
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:3 entities:4
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:3, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_liveworkstream, page:1, query:3, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:4, page:0, totalCount:34
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:4 entities:10 msdyn_liveworkstream
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:4
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:4 entities:10
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:4, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_octag, page:1, query:4, retrievedCount:3, totalEntityCount:3
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:5, page:0, totalCount:37
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:5 entities:3 msdyn_octag
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:3, batchNo:5
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:5 entities:3
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:5, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_livechatconfig, page:1, query:5, retrievedCount:2, totalEntityCount:2
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:6, page:0, totalCount:39
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:6 entities:2 msdyn_livechatconfig
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:2, batchNo:6
23-Jun-2022 00:02:00 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:02:00 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:6 entities:2
23-Jun-2022 00:02:00 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:6, page0
23-Jun-2022 00:02:54 - Error: Sql error: Generic SQL error. CRM ErrorCode: -2147204784 Sql ErrorCode: -2146232060 Sql Number: 156
The FetchXML was tested in the builder and works fine.
ExportConfig:
{
  "ExcludedFields": [],
  "CrmMigrationToolSchemaPaths": [
    "D:\\xx\\ConfigMigration\\schema.xml"
  ],
  "CrmMigrationToolSchemaFilters": {
    "msdyn_oclocalizationdata": "<fetch version=\"1.0\" output-format=\"xml-platform\" mapping=\"logical\" distinct=\"false\"><entity name=\"msdyn_oclocalizationdata\"><attribute name=\"msdyn_localizedtext\" /><attribute name=\"msdyn_customerlanguageid\" /><attribute name=\"statecode\" /><attribute name=\"msdyn_oclocalizationdataid\" /><order attribute=\"msdyn_localizedtext\" descending=\"false\" /><filter type=\"and\"><condition attribute=\"msdyn_oclocalizationdataid\" operator=\"in\" uitype=\"msdyn_oclocalizationdata\"><value>44ff8f8e-5ded-ec11-bb3d-000d3af4d379</value><value>a7527efe-73ed-ec11-bb3d-000d3af4d379</value></condition></filter></entity></fetch>"
  },
  "PageSize": 1000,
  "BatchSize": 1000,
  "TopCount": 10000,
  "OnlyActiveRecords": true,
  "JsonFolderPath": "D:\\xx\\ConfigMigration\\ExtractedData",
  "OneEntityPerBatch": true,
  "FilePrefix": "ExportedData",
  "SeperateFilesPerEntity": true,
  "FieldsToObfuscate": [],
  "LookupMapping": {}
}
Please correct me if I'm doing something wrong.
Thank you.
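As an aside on the config above: embedding FetchXML inside a JSON value requires escaping every inner quote, which is easy to get wrong by hand. A small sketch (plain Python, not part of the tool; the entity and attribute names are taken from the issue, the helper itself is hypothetical) shows how a JSON serializer does the escaping for you:

```python
import json

# Build the CrmMigrationToolSchemaFilters entry from a plain FetchXML string,
# letting json.dumps handle quote escaping instead of escaping by hand.
fetch_xml = (
    '<fetch version="1.0" output-format="xml-platform" mapping="logical">'
    '<entity name="msdyn_oclocalizationdata">'
    '<attribute name="msdyn_oclocalizationdataid" />'
    '<filter type="and">'
    '<condition attribute="statecode" operator="eq" value="0" />'
    '</filter></entity></fetch>'
)

filters = {"CrmMigrationToolSchemaFilters": {"msdyn_oclocalizationdata": fetch_xml}}

# Emits valid JSON with all inner double quotes escaped as \"
print(json.dumps(filters, indent=2))
```

Round-tripping the output through a JSON parser confirms the filter string survives intact.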
At the moment this is done manually, and not always for each change; it should be automated.
Add an example package for business units export/import
Describe the bug
Import fails when root business unit is included in the data
To Reproduce
Prepare a package with the root BU included, export it, and import it into the target environment; you will receive an error.
Expected behavior
The root BU should be mapped and updated.
Is your feature request related to a problem? Please describe.
The logging we currently have is extensive and detailed. This is great for understanding exactly what happened but is too complicated for most of our user base.
Describe the solution you'd like
It would be nice if the engine returned a summary of what happened.
This should be a typed object that the various interfaces can surface accordingly (e.g. the XrmToolBox visually displays).
Describe the bug
Versions 3.1.x are not packaged correctly; dependencies are missing.
To Reproduce
Install the NuGet package and observe that only the main DLL is included.
Expected behavior
All referenced DLLs should be included as well.
Additional context
Issue with the nuget pack step.
Is your feature request related to a problem? Please describe.
This isn't problem-related, but while refactoring powerapps-packagedeployertemplate I switched the services to use Microsoft's Logging Extensions. While integrating with this library, I wrote a wrapper for a very similar ILogger interface.
Describe the solution you'd like
My question/proposal is to update this library to use the same logging abstraction provided by Microsoft, for better integration with other tooling. For example, the respective XrmToolBox plugin could also be updated with a custom ILogger implementation.
Additional context
https://docs.microsoft.com/en-us/dotnet/core/extensions/logging
I've opened this as a discussion before we consider investing time in the refactor.
Both the PR build and CI build/release Azure DevOps pipelines are located in our internal Capgemini Reusable IP project.
These should both be removed from Capgemini Reusable IP and recreated in GitHub Support so that they are publicly available.
Is your feature request related to a problem? Please describe.
The Samples project, XrmToolbox plugin, and CLI tool are three ways to use the engine. Of course, we could use the nuget package in our own projects as well. Personally, I like the CLI since it's up to date with the latest engine and I could use it outside the context of the Samples project. Unfortunately, it means pulling the latest codebase and building manually.
Describe the solution you'd like
Build and publish the CLI tool as a dotnet global tool with the latest version of the engine.
It is also worth adding a small section on the tool to the README, although it should be self-documenting (via -help).
Raised by sonar cloud:
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1707&id=xrm-datamigration&open=AYHzLVJ58Fz3_Lg92U_2
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1707&id=xrm-datamigration&open=AYHzLVJ58Fz3_Lg92U_3
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1707&id=xrm-datamigration&open=AYHzLVJ58Fz3_Lg92U_4
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1707&id=xrm-datamigration&open=AYHzLVJ58Fz3_Lg92U_5
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1707&id=xrm-datamigration&open=AYHzLVJ58Fz3_Lg92U_6
Is your feature request related to a problem? Please describe.
The terminology and branding of this repository is a bit inconsistent. For example, we have:
And if we include the XrmToolBox plug-in repository:
Describe the solution you'd like
It would be great if we could update all of the terminology and naming to be consistent. At the time of writing, we should probably be referring to Dataverse where we've got XRM/CDS/CRM. We're starting to see a suite of tools emerging for Power Apps with similar branding on our NuGet organisation -
It's debatable whether a more appropriate namespace/package title for this would be Capgemini.PowerApps.DataMigration (to align with what we've already got) or Capgemini.Dataverse.DataMigration.
Additional context
The repository name would probably also need to be updated as part of this.
Describe the bug
There appears to be a limit of 100,000 records that can be extracted. I'm not sure if this is a Dynamics limit, an SDK limit, or this tool's limit, but the Fast Record Counter XrmToolBox tool can count past it.
To Reproduce
Export all records in a table with more than 100,000 records
Expected behavior
All records are to be extracted.
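For context, avoiding a hard cap usually means retrieving page after page until the source reports no more records, rather than issuing one query with a fixed top count. A minimal paging loop in plain Python (illustrative only; `fetch_page` is a stand-in for a paged Dataverse query, not this tool's API):

```python
def fetch_page(records, page, page_size):
    """Stand-in for a paged query: returns one page of records."""
    start = (page - 1) * page_size
    return records[start:start + page_size]

def fetch_all(records, page_size=5000):
    """Keep requesting pages until a short (or empty) page signals the end."""
    result, page = [], 1
    while True:
        batch = fetch_page(records, page, page_size)
        result.extend(batch)
        if len(batch) < page_size:  # a real client would check a more-records flag
            return result
        page += 1

data = list(range(12_345))
assert len(fetch_all(data)) == 12_345  # nothing truncated at one page
```

A real implementation would use the paging cookie returned by Dataverse instead of a page counter, but the control flow is the same.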
I was trying to import JSON files using the Dynamics 365: Data Importer release task. My file names started with the prefix "ExtractedData". It neither imported the files nor gave an error message or warning. Below is the log of the import task.
When I tried to import the same file with the ExtractedData prefix using the CDS Data Migrator import XrmToolBox tool, it worked. Later, I changed the filename prefix to "ExportedData" and it worked from the "Dynamics 365: Data Importer" release task. We should handle this and give an appropriate error message or warning, such as "No files found".
Remove the unrequired Microsoft.CSharp NuGet dependency.
Describe the bug
My export config has two lookup mappings defined which target string fields. When I extract as CSV (via the XrmToolBox), these fields are not wrapped with quotes ("...") like the other string fields. I'm not sure if this is because the lookup field is not defined in the schema file?
The result of this is a broken CSV row if any mapped field value contains a comma (,).
To Reproduce
Within the Lookup Mappings, specify a lookup to a text field.
{
"LookupMapping": {
"teamroles": {
"roleid": [
"name",
"businessunitid"
],
"teamid": [
"name",
"businessunitid"
]
}
}
}
Expected behavior
Mapped text values to also be wrapped with quotes.
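The failure mode can be demonstrated with Python's csv module (illustrative only; the tool itself is .NET, and the field values here are made up):

```python
import csv
import io

# A mapped value containing a comma must be quoted, or the row gains an
# extra column when read back.
row = ["System Administrator", "Contoso, Ltd."]  # second value contains a comma

# Writing with quoting preserves the two columns.
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL).writerow(row)
parsed = next(csv.reader(io.StringIO(buf.getvalue())))
assert parsed == row

# A naive unquoted join breaks the row: the embedded comma splits the value.
broken = next(csv.reader(io.StringIO(",".join(row))))
assert len(broken) == 3  # three fields instead of two
```

Quoting only the fields that contain delimiters (RFC 4180 style) would also be sufficient; the point is that mapped text values need the same treatment as other string fields.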
Describe the bug
I have a schema with the field <field displayname="Image" name="new_image" type="image" primaryKey="false" customfield="false" />, which fails with the error Error:Not supported attribute type System.Byte[] only for CSV exports (JSON worked fine).
To Reproduce
Use a table with an image field (e.g. entityimage) and include that field in the schema like above.
Expected behavior
This error should not be thrown; instead, the base64 should be exported, or the field type should be safely ignored.
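As context for the expected behavior, base64 is one safe way to represent binary image data (a System.Byte[] value) in a text format like CSV. A sketch with made-up bytes (whether the tool should adopt this encoding is exactly what the issue requests):

```python
import base64

# Raw binary content, as it would come from an image attribute.
image_bytes = b"\x89PNG\r\n\x1a\n" + bytes(range(16))

# Encode to a plain ASCII string that is safe to place in a CSV cell.
encoded = base64.b64encode(image_bytes).decode("ascii")

# The encoding is lossless: decoding recovers the original bytes.
assert base64.b64decode(encoded) == image_bytes
```

The JSON exporter presumably already does something equivalent, since JSON exports of the same field work fine.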
Describe the bug
When the export config has field mappings defined but the referenced field is excluded from export, the original GUID will not be included in the exported data. This will cause an error and fail during import.
To Reproduce
Create an export config file, configure a lookup mapping, and add the mapped field to the excluded fields:
"ExcludedFields": [
"systemuserid",
"roleid",
"systemuserrolesid",
"businessunitid"
],
"LookupMapping": {
"systemuser": {
"systemuserid": [
"domainname"
],
"businessunitid": [
"name"
]
},
"systemuserroles": {
"systemuserid": [
"domainname"
],
"roleid": [
"name",
"businessunitid"
]
}
}
Expected behavior
In this situation, if the lookup value can be found in the target system then it should be used; if not, it should be left empty.
Additional context
Fatal error during import
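The expected fallback described above can be sketched in plain Python (illustrative only; `target_index` stands in for querying the target system by the mapped fields, and the domain name and GUID below are made up):

```python
# Resolve a mapped lookup value against the target system; return None to
# leave the lookup empty rather than failing the import.
def resolve_lookup(mapped_value, target_index):
    """Return the target GUID for a mapped value, or None if not found."""
    return target_index.get(mapped_value)

# Hypothetical index: domainname -> systemuser GUID in the target system.
target_index = {"contoso\\jdoe": "00000000-0000-0000-0000-000000000001"}

assert resolve_lookup("contoso\\jdoe", target_index) == \
    "00000000-0000-0000-0000-000000000001"
assert resolve_lookup("contoso\\missing", target_index) is None  # left empty
```

The key design point is that a missing match degrades to an empty lookup instead of a fatal import error.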
'ProcessZeroEntitiesFirst' calls 'All' but does not use the value the method returns. Linq methods are known to not have side effects. Use the result in a conditional statement, assign the result to a variable, or pass it as an argument to another method.
Describe the bug
When a filter is set up in the export config for a many-to-many relationship, it is not used and all records are exported.
To Reproduce
...
Expected behaviour
Filter config to be considered when exporting many-to-many relationships.
Additional context
Migrated from internal Azure DevOps.
Describe the bug
When importing a lot of data (> 1000 records), using lookup maps and an OAuth connection string, I am asked to log in multiple times. Sometimes the login fails after too many correct attempts.
I'm not really sure why this is; are we creating new CrmServices?
To Reproduce
I'm not sure...
Expected behavior
Non-interactive login to work. (I am using the CLI)
Remove unrequired NuGet dependencies and ensure the same versions are used by all projects, to prevent DLL version mapping issues in the consuming client.
Raised by Sonar Cloud:
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1208&severities=MAJOR&id=xrm-datamigration&open=AYHIVG6NIQuHv600XTv6
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1208&severities=MAJOR&id=xrm-datamigration&open=AYHIVG6NIQuHv600XTv7
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1500&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYrA
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1500&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYrB
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1516&severities=MAJOR&id=xrm-datamigration&open=AYHImH0sknSGSKrfpo6s
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1516&severities=MAJOR&id=xrm-datamigration&open=AYHImH0sknSGSKrfpo6t
Today I tried to run the full test suite locally before submitting a PR. This included the integrations tests.
Capgemini.Xrm.DataMigration.Core.IntegrationTests worked perfectly after supplying some connection strings in the app.config.
Capgemini.Xrm.DataMigration.Tests.Integration did not work, again after supplying connection strings in the app.config.
I think we need to: update CONTRIBUTING.md with steps on configuring the integration tests, even if it's simple.
Is your feature request related to a problem? Please describe.
Requests are made per record. This can exhaust the API and is extremely slow for large data sets.
Describe the solution you'd like
Use an in-memory map and make fewer requests (maybe batch) to increase performance.
Additional context
Migrated from internal Azure DevOps.
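The in-memory map suggested above amounts to resolving each distinct lookup value once and reusing the result. A minimal cache sketch in plain Python (illustrative; `fetch` stands in for an API call, and the values below are made up):

```python
# Cache lookup resolutions so each distinct value costs one request,
# instead of one request per record.
class LookupCache:
    def __init__(self, fetch):
        self.fetch = fetch    # fetch(value) -> GUID; stand-in for an API call
        self.cache = {}
        self.requests = 0     # counts how many real requests were issued

    def resolve(self, value):
        if value not in self.cache:
            self.requests += 1
            self.cache[value] = self.fetch(value)
        return self.cache[value]

cache = LookupCache(lambda v: f"guid-for-{v}")
for record_value in ["bu-a", "bu-b", "bu-a", "bu-a"]:  # 4 records, 2 distinct
    cache.resolve(record_value)

assert cache.requests == 2  # one request per distinct value, not per record
```

Batching the initial fetch (e.g. one query per entity with an `in` filter) would reduce the request count further.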
Please add a new demo scenario for ownerid using different object types like System User, Team or Role.
Currently only the CSV implementation contains "Csv" in the class name; the other is for JSON, but that is not obvious and the naming is not consistent.
Include source symbols in NuGet package so that consumer can debug the engine.
This could be particularly useful for the XrmToolBox plugin.
https://docs.microsoft.com/en-us/nuget/create-packages/symbol-packages-snupkg
Support for the Web API flavour of deep insert, and source-controlled JSON files from the Web API.
Is your feature request related to a problem? Please describe.
Microsoft have documented functionality that allows you to bypass custom logic whilst performing API requests: https://docs.microsoft.com/en-us/powerapps/developer/data-platform/bypass-custom-business-logic. Being able to apply this during data import would be fantastic in data migration scenarios, which often require plug-ins and workflows to be deactivated manually.
Describe the solution you'd like
Supply an array of entity names for which the Upsert requests will have this setting enabled, allowing this to be controlled at quite a granular level.
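The proposed per-entity gating could look like the following sketch (plain Python, not the tool's API; `BypassCustomPluginExecution` is the request parameter name documented by Microsoft, while the config list and helper are hypothetical):

```python
# Hypothetical config: entity names whose Upsert requests bypass custom logic.
bypass_entities = {"account", "contact"}

def build_upsert_parameters(entity_name):
    """Return extra request parameters for an Upsert of the given entity."""
    params = {}
    if entity_name in bypass_entities:
        # Documented Dataverse optional parameter to skip custom plug-ins.
        params["BypassCustomPluginExecution"] = True
    return params

assert build_upsert_parameters("account") == {"BypassCustomPluginExecution": True}
assert build_upsert_parameters("msdyn_octag") == {}  # not in the list: untouched
```

This keeps the default behavior unchanged for entities not listed in the config.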
Is your feature request related to a problem? Please describe.
I've noticed a couple of issues with the README that make it appear a little clumsy -
Describe the solution you'd like
We would like to have more structured, clean, reusable tests. We should be using an approach that keeps tests consistent.
https://github.com/markcunninghamuk/FluidTest
https://www.nuget.org/packages/FluidTest
Is your feature request related to a problem? Please describe.
The Office 365 authentication type is deprecated and insecure by modern standards.
Describe the solution you'd like
Support OAuth by updating Microsoft.CrmSdk.CoreTools to at least version 9.1.0.13. If using https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell, that will also require updating.
Make the solution more generic and not CDS (Dataverse) specific.
Raised by sonar-cloud:
https://sonarcloud.io/project/issues?resolved=false&rules=csharpsquid%3AS3267&id=xrm-datamigration&open=AXyTdEqr85ZD8_gqLGtv
https://sonarcloud.io/project/issues?resolved=false&rules=csharpsquid%3AS3267&id=xrm-datamigration&open=AXyTdEqa85ZD8_gqLGtu
https://sonarcloud.io/project/issues?resolved=false&rules=csharpsquid%3AS3267&id=xrm-datamigration&open=AXyTdEqG85ZD8_gqLGts
https://sonarcloud.io/project/issues?resolved=false&rules=csharpsquid%3AS3267&id=xrm-datamigration&open=AXyTdEqG85ZD8_gqLGtt
https://sonarcloud.io/project/issues?resolved=false&rules=csharpsquid%3AS3267&id=xrm-datamigration&open=AXyTdErP85ZD8_gqLGtw