notion-guardian's Introduction

Notion Guardian

A tool that automatically backs up your Notion workspace and commits changes to another repository.

Notion Guardian offers a quick way to set up a secure backup of your data in a private repository, allowing you to track how your notes change over time and to know that your data is safe.

The tool separates the logic for running the export and the actual workspace data into two repositories. This way your backups are not cluttered with other scripts. If you prefer a one-repo solution or want to back up specific blocks of your workspace, check out the notion-backup fork by upleveled.

How to set up

  1. Create a separate private repository for your backups to live in (e.g. "my-notion-backup"). Make sure you create a main branch, for example by clicking "Add a README file" when creating the repo.
  2. Use this repository ("notion-guardian") as a template to create a copy (click the green "Use this template" button).
  3. Create a Personal Access Token (docs) with the "repo" scope and store it as REPO_PERSONAL_ACCESS_TOKEN in the secrets of the copied repo.
  4. Store your GitHub username in the REPO_USERNAME secret.
  5. Store the name of your newly created private repo in the REPO_NAME secret (in this case "my-notion-backup").
  6. Store the email that should be used to commit changes (usually your GitHub account email) in the REPO_EMAIL secret.
  7. Obtain your Notion space-id and token as described in this Medium post. Store them in the NOTION_SPACE_ID and NOTION_TOKEN secrets.
  8. You will also need to obtain your notion_user_id the same way and store it in a NOTION_USER_ID secret (the full list of required values appears in the check sketched after these steps).
  9. Wait until the action runs for the first time or push a commit to the repo to trigger the first backup.
  10. Check your private repo to see that an automatic commit with your Notion workspace data has been made. Done 🙌
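
The script reads these values inside the GitHub workflow; judging by the NOTION_SPACE_ID mapping quoted in the issues further down, each secret is exposed as an environment variable of the same name. A minimal pre-flight check, as a sketch under that assumption (not part of the current script):

const requiredSecrets = [
  `NOTION_SPACE_ID`,
  `NOTION_TOKEN`,
  `NOTION_USER_ID`,
  `REPO_PERSONAL_ACCESS_TOKEN`,
  `REPO_USERNAME`,
  `REPO_NAME`,
  `REPO_EMAIL`,
];

// Fail fast if any secret from the setup steps above is missing from the environment.
const missing = requiredSecrets.filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing secrets/environment variables: ${missing.join(", ")}`);
  process.exit(1);
}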

How it works

This repo contains a GitHub workflow that runs every day and on every push to this repo. The workflow executes the script, which makes an export request to Notion, waits for the export to finish, and downloads the workspace content to a temporary directory. The workflow then commits this directory to the repository configured in the repo secrets.
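
For orientation, here is a minimal sketch of that flow in Node; it is not the repository's actual index.js. The enqueueTask endpoint, the token_v2 cookie, and the export options mirror the request logs quoted in the issues below, while the `exportSpace` event name and the `getTasks` polling call are assumptions about Notion's undocumented internal API.

const axios = require(`axios`);
const fs = require(`fs`);

const client = axios.create({
  baseURL: `https://www.notion.so/api/v3`,
  headers: { Cookie: `token_v2=${process.env.NOTION_TOKEN}` },
});

const exportWorkspace = async () => {
  // Ask Notion to start an export task for the whole workspace.
  // "exportSpace" is an assumption; the fork's logs below show "exportBlock".
  const { data: { taskId } } = await client.post(`enqueueTask`, {
    task: {
      eventName: `exportSpace`,
      request: {
        spaceId: process.env.NOTION_SPACE_ID,
        exportOptions: { exportType: `markdown`, locale: `en`, timeZone: `UTC` },
      },
    },
  });

  // Poll until the task reports a download URL ("getTasks" is assumed).
  let exportURL;
  while (!exportURL) {
    await new Promise((resolve) => setTimeout(resolve, 5000));
    const { data: { results } } = await client.post(`getTasks`, { taskIds: [taskId] });
    exportURL = results && results[0] && results[0].status && results[0].status.exportURL;
  }

  // Stream the finished zip to disk.
  const response = await client({ method: `GET`, url: exportURL, responseType: `stream` });
  response.data.pipe(fs.createWriteStream(`workspace.zip`));
};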

notion-guardian's People

Contributors

josehower, karlhorky, richartkeil

notion-guardian's Issues

Unexpected Symbol

(screenshot of the error)

Not sure what this error means. I am trying it for the first time. I have Notion Space ID as NOTION_SPACE_ID : ${{ secrets.25................. }}.

Am I supposed to enter the ID differently?

Export not unzipped?

Did I miss anything? It's the ZIP file that gets committed, and I have to do the following to unzip it. 🙏🏽

          # Find the zip file using the wildcard
          zip_file="$(ls *.zip | head -n1)"
          # Unzip the file
          unzip "$zip_file"
          # Get the top-level folder inside the archive
          folder=$(unzip -Z1 "$zip_file" | awk -F/ '{print $1}' | sort -u)
          # Move the contents of the top-level folder to the current directory
          # (rsync needs -a/-r to copy subdirectories)
          rsync -a "${folder%/}"/ .
          # Remove the now-empty top-level folder
          rm -r "${folder%/}"
          # Delete the zip file
          rm "$zip_file"

AxiosError: Request failed with status code 400

Hi @richartkeil

I have been using a fork of notion-guardian (notion-backup), but since yesterday the backup has been failing with this error:

AxiosError: Request failed with status code 400

Details (sanitized) - I'm not sure if this is what you need or not:

headers: AxiosHeaders {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      Cookie: 'token_v2=***',
      'User-Agent': 'axios/1.3.5',
      'Content-Length': '197',
      'Accept-Encoding': 'gzip, compress, deflate, br'
    },
    baseURL: 'https://www.notion.so/api/v3',
    method: 'post',
    url: 'enqueueTask',
    data: '{"task":{"eventName":"exportBlock","request":{"blockId":"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx","exportOptions":{"exportType":"markdown","locale":"en","timeZone":"Europe/Vienna"},"recursive":true}}}'
  },

And response details:

 response: {
     status: 400,
     statusText: 'Bad Request',

Apologies, this is my first time posting an issue, so please let me know if you need more info. Let me know if you have any ideas or if there is anything not too complicated I could try :)

Improvements

Hey @richartkeil, just wanted to ping you that I've finished my improvements to the script:

  1. Export from blocks instead of entire workspace (with recursive option)
  2. Switch to ESM, top-level await
  3. Change forking workflow to template workflow, simplify instructions
  4. Commit to same repo instead of an external one

https://github.com/upleveled/notion-backup

Feel free to take anything that is useful (and close this issue whenever you like)! Happy to answer any questions too.

Credited you at the bottom too, because your script proved to be a great basis for writing this.

The workflow cancelled itself

Hi, thank you for your work! I am an ordinary user, I don't code, so I don't know how to fix this. I followed all your steps and triggered the workflow three times by modifying the README, and each run stopped here after repeating "exported 302 pages" for 352 lines. Is there any solution? Thanks! I didn't do anything during the backup workflow run.

(screenshot of the cancelled workflow run)

Error when export zip file > 500 MB

It seems that the Notion export zip file contains sub zip archives when the export is larger than 500 MB. I started having this issue recently as the archive file size has been slowly increasing.

GitHub Action log:

[main 124f791] Automated Notion workspace backup
 1 file changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 Export-***-Part-1.zip
remote: error: Trace: ***        
remote: error: See http://git.io/iEPt8g for more information.        
remote: error: File Export-***-Part-1.zip is 498.13 MB; this exceeds GitHub's file size limit of 100.00 MB        
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com./        
To https://github.com/***/***
 ! [remote rejected] main -> main (pre-receive hook declined)
error: failed to push some refs to 'https://github.com/***/***'
Error: Process completed with exit code 1.

It might be something about the extract function at

await extract(workspaceZip, { dir: workspaceDir });

which is imported from
const extract = require(`extract-zip`);

But I have no knowledge of JS at all, so please fix it.
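
Not the project's current code, but a sketch of one possible direction: after the main extract call, any nested part archives could be unpacked and then removed so that raw zip files never reach the commit. The Part-*.zip naming comes from the log above; the workspaceDir variable and the extract-zip dependency come from the snippets quoted in this issue.

const path = require(`path`);
const fs = require(`fs/promises`);
const extract = require(`extract-zip`);

const extractNestedParts = async (workspaceDir) => {
  const entries = await fs.readdir(workspaceDir);
  for (const entry of entries) {
    // Exports over ~500 MB appear to contain archives named like "Export-...-Part-1.zip".
    if (/Part-\d+\.zip$/.test(entry)) {
      const partZip = path.join(workspaceDir, entry);
      // extract-zip requires an absolute target directory.
      await extract(partZip, { dir: path.resolve(workspaceDir) });
      await fs.unlink(partZip);
    }
  }
};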

Axios Request Failed with HTTP Status Code 403

Hi @richartkeil 👋 Hope you are well.

Recently, our daily Notion backups (using our notion-backup fork of notion-guardian) started failing with an HTTP 403 status code:

AxiosError: Request failed with status code 403

We have had no code changes in the last 5 days, and the daily backups succeeded until today (e.g. yesterday's backup succeeded).

I also tried updating the token (which I originally thought was the problem), but this did not change the error.

Relevant request details (sanitized):

request: ClientRequest {
  method: 'GET',
  headers: {
    Accept: 'application/json, text/plain, */*',
    Cookie: 'token_v2=***',
    'User-Agent': 'axios/1.3.4',
    'Accept-Encoding': 'gzip, compress, deflate, br'
  },
  protocol: 'https:',
  hostname: 'file.notion.so',
  path: '/f/t/xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/Export-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx.zip?id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx&table=user_export&expirationTimestamp=1680167194467&signature=xxxxxxxxxx_xxxxxxxx-xxxxxxxxxxxxxxxxxx&download=true&downloadName=xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx%2FExport-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx.zip',
  baseURL: 'https://www.notion.so/api/v3',
}

And response details:

   response: {
     status: 403,
     statusText: 'Forbidden',

And axios config:

  config: {
    baseURL: 'https://www.notion.so/api/v3',
    method: 'get',
    url: 'https://file.notion.so//f/t/xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/Export-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx.zip?id=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx&table=user_export&expirationTimestamp=1680167194467&signature=xxxxxxxxxx_xxxxxxxx-xxxxxxxxxxxxxxxxxx&download=true&downloadName=xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx%2FExport-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx.zip',
    responseType: 'stream',
    data: undefined
  },

Judging by the responseType: 'stream' in the error message, I'm guessing it has to do with this code here:

notion-guardian/index.js

Lines 72 to 76 in 80135ac

const response = await client({
  method: `GET`,
  url: exportURL,
  responseType: `stream`,
});


Maybe Notion changed their internal API?
