kontent-ai-bot / backup-manager-js

This utility enables backup & restore of Kontent.ai projects.
Home Page: https://kontent.ai
License: MIT License
It is possible that exporting projects with a high number of languages omits some language variants. Test exporting data from a sample project with many languages to narrow down where the issue occurs.
Kontent.ai currently doesn't support a deep copy feature and only allows a shallow copy of a single level. There are cases where a deep copy is needed, so this tool could be extended to let developers deep copy a given subsection of their content.
There should also be a way to control the "depth" of the deep copy.
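A depth-limited copy could start from a traversal like the sketch below. This is only an illustration of the idea, not the tool's implementation; `linkedItemCodenames` is a hypothetical field standing in for however an item's linked-items elements are resolved from the backup data.

```javascript
// Sketch: collect the codenames of items reachable from a root item,
// stopping at maxDepth. linkedItemCodenames is a hypothetical field.
function collectItemsToCopy(rootCodename, itemsByCodename, maxDepth) {
  const toCopy = new Set();
  const visit = (codename, depth) => {
    if (depth > maxDepth || toCopy.has(codename)) return;
    const item = itemsByCodename[codename];
    if (!item) return;
    toCopy.add(codename);
    for (const linked of item.linkedItemCodenames || []) {
      visit(linked, depth + 1);
    }
  };
  visit(rootCodename, 0);
  return [...toCopy];
}
```

With `maxDepth = 0` this degenerates to today's shallow copy of a single item, so the depth option would be backwards compatible.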
When I look at the contents of languageVariants.json, I am missing language variants for content items that appear in contentItems.json. I have 82 content items of a given content type, but only 58 language variants are exported. When I query the Management API for a missing language variant, it appears in the response as expected.
Steps to reproduce:

1. kbm action=backup
2. kbm action=restore into an empty project — fails

Expected behavior: all language variants are exported and the restore loads them all.
// Diagnostic script (requires the jsonpath package)
const jp = require('jsonpath')

// const backupDirectory = '../../data/kontent-backup-7-12-2020-13-5'
const backupDirectory = '../../data/kontent-backup-11-12-2020-9-53'
const contentItems = require(backupDirectory + '/contentItems.json')
const languageVariants = require(backupDirectory + '/languageVariants.json')
const contentTypes = require(backupDirectory + '/contentTypes.json')

const cmpContentTypeId = jp.query(contentTypes, '$[?(@.codename=="cmp")].id')[0]
console.log('CMP Content Type id:', cmpContentTypeId)

const cmpIdElementId = jp.query(contentTypes, '$[?(@.codename=="cmp")].elements[?(@.codename=="wgv_cmp_id")].id')[0]
console.log('wgv_cmp_id element id:', cmpIdElementId)

const contentItemsCmp = jp.query(contentItems, `$[?(@.type.id=='${cmpContentTypeId}')].id`)
console.log('Number of CMP content_items:', contentItemsCmp.length)

const ci2Cmp = new Map()
console.log('Language variant count:', languageVariants.length)
contentItemsCmp.forEach(ci => {
  const cmpId = jp.query(languageVariants, `$[?(@.item.id=='${ci}')].elements[?(@.element.id=='${cmpIdElementId}')].value`)[0]
  ci2Cmp.set(ci, cmpId)
})
console.table(ci2Cmp)
There is a long discussion on Intercom with Daniel. The action=restore fails as shown below. Is it two issues or the same one?
Imported: arts___culture (default) | languageVariant Attempt 1: retrying in 2126ms Attempt 2: retrying in 2367ms Attempt 3: retrying in 5470ms ContentManagementBaseKontentError { validationErrors: [ ValidationError { message: "The requested modular content item 'd1b68bc3-0cb4-477f-b88e-8bd625c8b1bc' for the element 'tags' was not found." } ], message: "The provided request body is invalid. See 'validation_errors' for more information and specify a valid JSON object.", requestId: '50e370dea98d6e45', errorCode: 5, originalError: Error: Request failed with status code 400 at createError (/Users/owain/dev/wg-content-load/node_modules/@kentico/kontent-backup-manager/node_modules/axios/lib/core/createError.js:16:15) at settle (/Users/owain/dev/wg-content-load/node_modules/@kentico/kontent-backup-manager/node_modules/axios/lib/core/settle.js:17:12) at IncomingMessage.handleStreamEnd (/Users/owain/dev/wg-content-load/node_modules/@kentico/kontent-backup-manager/node_modules/axios/lib/adapters/http.js:236:11) at IncomingMessage.emit (events.js:327:22) at endReadableNT (_stream_readable.js:1224:12) at processTicksAndRejections (internal/process/task_queues.js:84:21) { config: { url: 'https://manage.kontent.ai/v2/projects/b198ddf3-8449-01e5-ab71-46c587c525c0/items/codename/cancer_research_uknnnnnnn/variants/codename/default', method: 'put', data: '{"elements":[{"element":{"codename":"manual"},"value":[{"codename":"true"}]},{"element":{"codename":"full_name"},"value":"Cancer Research UK"},{"components":[],"element":{"codename":"description"},"value":"<p>Cancer Research UK is a cancer research and awareness charity in the United Kingdom and Isle of Man, formed on 4 February 2002 by the merger of The Cancer Research Campaign and the Imperial Cancer Research Fund. 
Its aim is to reduce the number of deaths from cancer.</p>"},{"element":{"codename":"address"},"value":"2 Redman Place, London"},{"element":{"codename":"postcode"},"value":"E20 1JQ"},{"element":{"codename":"charity_registration_id"},"value":"CCEW-1089464"},{"element":{"codename":"gift_aid_eligible"},"value":[{"codename":"true"}]},{"element":{"codename":"payroll_giving_eligible"},"value":[{"codename":"true"}]},{"element":{"codename":"logo"},"value":[{"id":"c41a30a8-9003-4cc3-825a-3025f4ebccbd"}]},{"element":{"codename":"hero_image"},"value":[{"id":"1a269cf2-f6b8-4221-afe8-de2b49cba8e9"}]},{"element":{"codename":"withdrawn_on"},"value":null},{"element":{"codename":"website"},"value":""},{"element":{"codename":"youtube"},"value":""},{"element":{"codename":"twitter"},"value":""},{"element":{"codename":"facebook"},"value":""},{"element":{"codename":"parent_cmp"},"value":[]},{"element":{"codename":"related_cmps"},"value":[]},{"element":{"codename":"tags"},"value":[{"id":"d1b68bc3-0cb4-477f-b88e-8bd625c8b1bc"},{"codename":"health"}]},{"element":{"codename":"wgv_cmp_id"},"value":"CMP#GB1#WGV#84f69713-77ff-46d1-8fe7-44f0ab77a61a"},{"element":{"codename":"default_asset_folder_id"},"value":"97554d98-aa0f-46e6-ad41-84767dc7c11d"},{"element":{"codename":"ctid"},"value":"1089464"},{"element":{"codename":"ccew"},"value":""},{"element":{"codename":"ccsc"},"value":""},{"element":{"codename":"ccni"},"value":""},{"element":{"codename":"ct"},"value":""},{"element":{"codename":"md"},"value":""},{"element":{"codename":"jgos"},"value":""},{"element":{"codename":"jggc"},"value":""}]}', headers: [Object], transformRequest: [Array], transformResponse: [Array], timeout: 0, adapter: [Function: httpAdapter], xsrfCookieName: 'XSRF-TOKEN', xsrfHeaderName: 'X-XSRF-TOKEN', maxContentLength: 'Infinity', validateStatus: [Function: validateStatus] }, request: ClientRequest { _events: [Object: null prototype], _eventsCount: 6, _maxListeners: undefined, outputData: [], outputSize: 0, writable: true, 
destroyed: false, _last: true, chunkedEncoding: false, shouldKeepAlive: false, useChunkedEncodingByDefault: true, sendDate: false, _removedConnection: false, _removedContLen: false, _removedTE: false, _contentLength: null, _hasBody: true, _trailer: '', finished: true, _headerSent: true, socket: [TLSSocket], _header: 'PUT /v2/projects/b198ddf3-8449-01e5-ab71-46c587c525c0/items/codename/cancer_research_uknnnnnnn/variants/codename/default HTTP/1.1\r\n' + 'Accept: application/json, text/plain, */*\r\n' + 'Content-Type: application/json\r\n' + 'X-KC-SDKID: npmjs.com;@kentico/kontent-management;0.4.2\r\n' + 'authorization: bearer ew0KICAiYWxnIjogIkhTMjU2IiwNCiAgInR5cCI6ICJKV1QiDQp9.ew0KICAianRpIjogImQ3MmVmZDYwYjAxMTQzNTlhNTAyZGZhNjkwMTA4NDFkIiwNCiAgImlhdCI6ICIxNjA3Njg4Njc3IiwNCiAgImV4cCI6ICIxOTUzMjg4Njc3IiwNCiAgInByb2plY3RfaWQiOiAiYjE5OGRkZjM4NDQ5MDFlNWFiNzE0NmM1ODdjNTI1YzAiLA0KICAidmVyIjogIjIuMS4wIiwNCiAgInVpZCI6ICI1ZDcyMjViM2FlNzAwMjBkZjZiNmQyYmEiLA0KICAiYXVkIjogIm1hbmFnZS5rZW50aWNvY2xvdWQuY29tIg0KfQ.gbUeRJOi7hueIr17VlUdqYBoJEWaAd6LnKTzePWvaSQ\r\n' + 'User-Agent: axios/0.19.2\r\n' + 'Content-Length: 2007\r\n' + 'Host: manage.kontent.ai\r\n' + 'Connection: close\r\n' + '\r\n', _onPendingData: [Function: noopPendingOutput], agent: [Agent], socketPath: undefined, method: 'PUT', maxHeaderSize: undefined, insecureHTTPParser: undefined, path: '/v2/projects/b198ddf3-8449-01e5-ab71-46c587c525c0/items/codename/cancer_research_uknnnnnnn/variants/codename/default', _ended: true, res: [IncomingMessage], aborted: false, timeoutCb: null, upgradeOrConnect: false, parser: null, maxHeadersCount: null, reusedSocket: false, _redirectable: [Writable], [Symbol(kCapture)]: false, [Symbol(kNeedDrain)]: false, [Symbol(corked)]: 0, [Symbol(kOutHeaders)]: [Object: null prototype] }, response: { status: 400, statusText: 'Bad Request', headers: [Object], config: [Object], request: [ClientRequest], data: [Object] }, isAxiosError: true, toJSON: [Function (anonymous)] } } Management API error 
occured: The provided request body is invalid. See 'validation_errors' for more information and specify a valid JSON object. The requested modular content item 'd1b68bc3-0cb4-477f-b88e-8bd625c8b1bc' for the element 'tags' was not found.
Add links to screenshots, if possible.
contentItems.json
{ "id": "a4edde85-aa34-483c-89cd-776be144aa00", "name": "XXXXX", "codename": "XXXX", "type": { "id": "5239fbf7-4cb6-4100-bc4b-1452c172ee10" }, "collection": { "id": "00000000-0000-0000-0000-000000000000" }, "sitemap_locations": [], "last_modified": "2020-08-26T15:30:39.9274371Z" }
Why is this feature required? What problems does it solve?
As there can be multiple workflow steps associated with a project, and the localisation flow depends on the workflow step of an item, we ask you to add workflow steps to the backup and restore functionality. Additionally, if content items are archived and no longer used, carrying these redundant content items over can prove problematic.
An ideal solution for the above problems.
Ideally, during backup the workflow step of each item should also be preserved in the zip file that is created, and restored from that file.
Add any other context, screenshots, or reference links about the feature request here.
Why is a content item not imported into other projects when the referenced images are present?
The referenced media items should also be imported.
Example steps:
I notice that when exporting a Kontent project, content of a deactivated language is excluded from the export (e.g. not found in languageVariants.json). However, the language itself is still included in languages.json and must be manually removed before importing so that it is excluded completely from the target project.
One example problem: a customer imports the backup into a project under the Developer subscription, but now exceeds the language limit with no way to delete the extra languages from the project. A deactivated language appears to count towards the language limit, unlike a deactivated user, which does not count towards user limits.
On export, a deactivated language should not be included in the backup. Including it is moot, since no content of that language variant was exported anyway; in the event someone wanted to re-activate the language, no content would be present.
Larger projects tend to time out or throw network exceptions, possibly due to overloading the server. It might be beneficial to add a configuration option that artificially delays requests.
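The proposed option could be as simple as a fixed pause between sequential requests. A minimal sketch, assuming a hypothetical `delayMs` option (not an existing kbm flag):

```javascript
// Minimal sketch: run request tasks sequentially with a configurable
// delay between them. delayMs is an illustrative option name.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function runWithDelay(tasks, delayMs) {
  const results = [];
  for (const task of tasks) {
    results.push(await task());
    if (delayMs > 0) await sleep(delayMs);
  }
  return results;
}
```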
Add proxy support to Export/Import Service so Backup Manager can be used on a corporate network behind a proxy server.
Expose an optional proxy config via IExportConfig / IImportConfig?
See support case with andrewh from 29/07/2020
Why is this feature required? What problems does it solve?
I am writing a script intended for end users that uses loadFileAsync, and it currently depends on the directory the script is executed from. If I have a structure like src/scripts/data.zip and src/scripts/script.js, which executes const zipFile = await fileService.loadFileAsync("data"), and I start the script from the src folder, I get an error.
An ideal solution for the above problems.
Make an overload / new function that takes an absolute path to the file.
Add any other context, screenshots, or reference links about the feature request here.
Similarly to import, there should be a config option that indicates whether a given object is exported or not.
I have exported a package that has all items published: https://github.com/Kentico/kontent-starter-corporate-next-js/blob/main/kontent-backup.zip
When I import it using kbm --action=restore --apiKey=xxx --projectId=xxx --zipFilename=backupFile
Items are correctly imported, but not published.
Expected behavior: all items should be published.
I have checked the languageVariants.json file content, and all workflow_step values point to the "Published" step.
When downloading large sets of data (e.g. variants), the results are logged to the console only after all data has been downloaded. The app should log an update for each page fetched from the API.
Please update the "Clean in code" part of the README here: https://github.com/Kentico/kontent-backup-manager-js#clean-in-code
It references ImportService instead of CleanService.
What went wrong?
What is the correct behavior?
Add any other context about the problem here.
Add links to screenshots, if possible.
The kbm export command should accept --environmentId, but currently accepts --projectId instead. IExportConfig has the correct attribute on GitHub, but the published package itself still uses projectId:
https://www.npmjs.com/package/@kontent-ai/backup-manager/v/4.1.0?activeTab=code
I think this may be caused by a failed build or something like that, but I'm not sure.
Steps to reproduce:

1. Install the latest version of kbm
2. Run the kbm command with --environmentId

Expected behavior: successful export.
Actual behavior: the export fails with the following error: Error: Project id was not provided
Our content editors upload content (adding new collections) to the Production environment in advance of our design team working on the layout/styling. The design team works off the Development environment so they can work out the layouts and tweak the styling.
Currently, we would clone the Production environment (with the newly added collections) to create a new Development environment so the design team can work off the latest content. The cloning process can take about an hour, and it will take longer as we add more content.
It would be useful to have a feature where a collection can be copied from one environment to another environment.
I did an export from one project, then tried to import it into another project, and it failed with a 404 during the POST of one of the assets, which has "file_name": "Workflows+widget.png".
I repeated the import several times, but it didn't help.
Then I deleted this object manually from assets.json, cleaned the project, and started the import again. It failed again for another asset, "file_name": "high-res+Loren.png", with the same HTTP status.
Then I removed all 7 assets with a "+" character in their names, and all assets were imported successfully (565 of them).
So it seems quite likely the problem is caused by that specific character in the file name.
I'm using CLI on Windows 10, node v16.13.2: npx kbm --config=restore-staging-config.json
I also reproduced the same problem in https://kentico.github.io/kontent-template-manager. While doing so, I copied the failed request and tried to execute it a few times; the problem reproduced each time.
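One plausible mechanism (an assumption, not confirmed by the report): `+` survives `encodeURI`-style escaping and is then decoded as a space by the server, whereas `encodeURIComponent` escapes it as `%2B`. A small diagnostic sketch for spotting affected assets in a backup's assets.json:

```javascript
// Diagnostic sketch: list asset file names containing '+', which may be
// mishandled when embedded in a URL path during import.
function findSuspiciousAssetNames(assets) {
  return assets
    .filter((asset) => /[+]/.test(asset.file_name))
    .map((asset) => asset.file_name);
}

// Note: encodeURIComponent('+') === '%2B', but encodeURI leaves '+'
// unescaped, where a server may later decode it as a space.
```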
I am trying to restore a backup from https://github.com/makma/gatsby-starter-kontent-lumen with the --importLanguages parameter, but it doesn't import the languages, nor does it change the codename of the default language.

Expected behavior: with --importLanguages, the restore imports the languages and proceeds OK.
It would be very straightforward to pass this in from the CLI.
While exporting a large number of assets to binary files, I have had to perform a number of re-runs because node throws the axios exception ECONNRESET when retrieving from the CDN.
It would be pertinent to have axios perform some sort of retry logic before throwing an exception. Perhaps this could be made configurable.
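A generic retry wrapper around the request function would cover this without committing to a specific axios plugin. A sketch with exponential backoff (the option names are illustrative; in practice a library such as axios-retry or a custom interceptor would be candidates):

```javascript
// Hedged sketch: retry a request function with exponential backoff
// before surfacing the final error.
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;
      // Backoff doubles each attempt: baseDelayMs, 2x, 4x, ...
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt)
      );
    }
  }
}
```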
At the "Create zip file" stage following error occurs:
There was an error processing your request: RangeError [ERR_INVALID_ARG_VALUE]: The argument 'size' is invalid. Received 4620737532
at Function.allocUnsafe (node:buffer:374:3)
at Function.concat (node:buffer:553:25)
at concat (/usr/lib/node_modules/@kentico/kontent-backup-manager/node_modules/jszip/lib/stream/StreamHelper.js:62:27)
at StreamHelper.<anonymous> (/usr/lib/node_modules/@kentico/kontent-backup-manager/node_modules/jszip/lib/stream/StreamHelper.js:96:61)
at Immediate._onImmediate (/usr/lib/node_modules/@kentico/kontent-backup-manager/node_modules/jszip/lib/utils.js:381:18)
at processImmediate (node:internal/timers:464:21) {
code: 'ERR_INVALID_ARG_VALUE'
}
This seems to be JSZip related: the requested size (4,620,737,532 bytes, roughly 4.6 GB) likely exceeds Node's maximum Buffer length, which Buffer.allocUnsafe rejects when JSZip concatenates the whole archive in memory.
Versions used:
Ubuntu 20.04 (In Docker)
Node: v16.1.0
npm: 7.11.2
jszip: 3.6.0
Customers might want to export the workflows configuration of the project and then import it into another one.
Add import and export for workflows.
Add export for roles.
Export for roles needs to be added too, as workflows reference them. However, we do not have an endpoint for importing roles yet, so that one will not be included in the backup manager either.
Hey Mike, I wanted to give it a go today, but how can I test it? There is no information about setting up the content model, nor an export from the backup manager. I could check the code, but I'd like to see it in action first.
I request that you add a max-http-header-size option to the ExportService class, so that I do not get a "Header overflow" exception when I try to back up projects with a large amount of content.

Expected behavior: the ExportService class accepts a max-http-header-size option.
The --restore command is skipping some assets, leading to a validation error due to a missing asset.

Steps to reproduce:

1. Run the --clean command to create an empty target environment
2. Run the --restore command with the project backup folder (@JiriLojda I can provide you with the data)

Expected behavior: the manager should add all assets from the backup folder.
node v18.16.0
npm v9.6.4
@kontent-ai/[email protected]
@kontent-ai/[email protected]
Likely related issue: #40
I have run into an environment that is too large to export. The export fails with: "This environment has more than 35000 variants. At this moment we do not support validation of environments of this size."
Expand the export filter to include specific Item Types or create a new filter that takes Item Type codenames and only exports items and variants of those types.
exportFilter: {
  dataTypes: ["languageVariant"],
  contentTypeFilter: ["type_code_name_1", "type_code_name_2"]
}
My first thought was to target languages, but I assume this can't be done via the Management API, as filtering is limited. However, MAPI does have an endpoint for types that could be leveraged to export large environments, or to give more granular control over what a user exports.
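The proposed contentTypeFilter could be applied client-side after fetching, by mapping the allowed type codenames to type ids and keeping only matching items. A sketch of that semantics (the object shapes follow the contentItems.json / contentTypes.json excerpts in this document; the function name is illustrative):

```javascript
// Sketch of contentTypeFilter semantics: keep only items whose type
// codename is in the allow-list, by resolving codenames to type ids.
function filterByContentType(items, types, allowedTypeCodenames) {
  const allowedTypeIds = new Set(
    types
      .filter((type) => allowedTypeCodenames.includes(type.codename))
      .map((type) => type.id)
  );
  return items.filter((item) => allowedTypeIds.has(item.type.id));
}
```

Variants would then be exported only for the surviving items, keeping the export under the 35000-variant ceiling.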
I was exporting the backup using CLI on Windows 10, node v16.13.2: npx kbm --config=backup-config.json
Suddenly, in the middle of that, the process crashed with this error:
<...>
Exported: ***** | languageVariant
Exported: ***** | languageVariant
There was an error processing your request: AxiosError: connect ETIMEDOUT 151.101.86.217:443
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1161:16) {
port: 443,
address: '151.101.86.217',
syscall: 'connect',
code: 'ETIMEDOUT',
errno: -4039,
config: {
transitional: {
silentJSONParsing: true,
forcedJSONParsing: true,
clarifyTimeoutError: false
},
adapter: [Function: httpAdapter],
transformRequest: [ [Function: transformRequest] ],
transformResponse: [ [Function: transformResponse] ],
timeout: 0,
xsrfCookieName: 'XSRF-TOKEN',
xsrfHeaderName: 'X-XSRF-TOKEN',
maxContentLength: -1,
maxBodyLength: -1,
env: { FormData: [Function] },
validateStatus: [Function: validateStatus],
headers: {
Accept: 'application/json, text/plain, */*',
'X-KC-SDKID': 'npmjs.com;@kontent-ai/management-sdk;3.0.0',
authorization: 'bearer *****',
'User-Agent': 'axios/0.27.2'
},
method: 'get',
url: 'https://manage.kontent.ai/v2/projects/*****/items/*****/variants',
data: undefined
},
request: <ref *1> Writable {
_writableState: WritableState {
objectMode: false,
highWaterMark: 16384,
finalCalled: false,
needDrain: false,
ending: false,
ended: false,
finished: false,
destroyed: false,
decodeStrings: true,
defaultEncoding: 'utf8',
length: 0,
writing: false,
corked: 0,
sync: true,
bufferProcessing: false,
onwrite: [Function: bound onwrite],
writecb: null,
writelen: 0,
afterWriteTickInfo: null,
buffered: [],
bufferedIndex: 0,
allBuffers: true,
allNoop: true,
pendingcb: 0,
constructed: true,
prefinished: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
errored: null,
closed: false,
closeEmitted: false,
[Symbol(kOnFinished)]: []
},
_events: [Object: null prototype] {
response: [Function: handleResponse],
error: [Function: handleRequestError],
socket: [Function: handleRequestSocket]
},
_eventsCount: 3,
_maxListeners: undefined,
_options: {
maxRedirects: 21,
maxBodyLength: 10485760,
protocol: 'https:',
path: '/v2/projects/*****/items/*****/variants',
method: 'GET',
headers: [Object],
agent: undefined,
agents: [Object],
auth: undefined,
hostname: 'manage.kontent.ai',
port: null,
nativeProtocols: [Object],
pathname: '/v2/projects/*****/items/*****/variants'
},
_ended: true,
_ending: true,
_redirectCount: 0,
_redirects: [],
_requestBodyLength: 0,
_requestBodyBuffers: [],
_onNativeResponse: [Function (anonymous)],
_currentRequest: ClientRequest {
_events: [Object: null prototype],
_eventsCount: 7,
_maxListeners: undefined,
outputData: [],
outputSize: 0,
writable: true,
destroyed: false,
_last: true,
chunkedEncoding: false,
shouldKeepAlive: false,
maxRequestsOnConnectionReached: false,
_defaultKeepAlive: true,
useChunkedEncodingByDefault: false,
sendDate: false,
_removedConnection: false,
_removedContLen: false,
_removedTE: false,
_contentLength: 0,
_hasBody: true,
_trailer: '',
finished: true,
_headerSent: true,
_closed: false,
socket: [TLSSocket],
_header: 'GET /v2/projects/*****/items/*****/variants HTTP/1.1\r\n' +
'Accept: application/json, text/plain, */*\r\n' +
'X-KC-SDKID: npmjs.com;@kontent-ai/management-sdk;3.0.0\r\n' +
'authorization: bearer *****' +
'User-Agent: axios/0.27.2\r\n' +
'Host: manage.kontent.ai\r\n' +
'Connection: close\r\n' +
'\r\n',
_keepAliveTimeout: 0,
_onPendingData: [Function: nop],
agent: [Agent],
socketPath: undefined,
method: 'GET',
maxHeaderSize: undefined,
insecureHTTPParser: undefined,
path: '/v2/projects/*****/items/*****/variants',
_ended: false,
res: null,
aborted: false,
timeoutCb: null,
upgradeOrConnect: false,
parser: null,
maxHeadersCount: null,
reusedSocket: false,
host: 'manage.kontent.ai',
protocol: 'https:',
_redirectable: [Circular *1],
[Symbol(kCapture)]: false,
[Symbol(kNeedDrain)]: false,
[Symbol(corked)]: 0,
[Symbol(kOutHeaders)]: [Object: null prototype]
},
_currentUrl: 'https://manage.kontent.ai/v2/projects/*****/items/*****/variants',
[Symbol(kCapture)]: false
}
}
I repeated the same operation multiple times, and each time it failed, but for a different object. There was no long wait before the error was thrown (compared to the usual interval between successful log lines). After some time, repeating the same operation succeeded without any changes on my side.
This SO answer suggests adding keep-alive and a timeout to axios, but I haven't had a chance to check it.
The import filter for canImport -> contentType is not working as expected.
What went wrong?
I want the contentTypes import to work for a single given content type only. However, the import flow ignores all contentTypes completely.
canImport: {
  contentType: (item) => {
    if (item.codename === 'codename_of_one_content_type_in_your_project') {
      // this content type will be imported only if its codename matches
      return true
    }
    // all other types will be excluded from the import
    return false
  },
  ...
}
What is the correct behavior?
The import service should import the content type specified in the config.
I tested this via GitHub Actions by creating a custom action using TypeScript.
Add any other context about the problem here.