
gobackup's Issues

Config path

Need the ability to set the config path manually, e.g. gobackup --config path_to_config/gobackup.yml

Problem running a backup

Hello. I have the following problem. I run this command to start a backup (CentOS 7):

/usr/local/bin/gobackup perform >> ~/.gobackup/gobackup.log

The backup doesn't run and the log file stays empty.

Do you know where I can find debug output or similar?

Docker image support?

Hi, do you plan to provide a Docker image? For example, adding a Dockerfile and publishing it to hub.docker.com?

Segfault when the config file has no archive section

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x8 pc=0x76378a]

gobackup/archive/archive.go:16
includes := model.Archive.GetStringSlice("includes")
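The crash can be reproduced and guarded with a sketch like the following; `Section` and its accessor are illustrative stand-ins for the config type, not gobackup's actual API. In Go, a method with a pointer receiver may run on a nil receiver, so the missing-section case can be absorbed inside the accessor:

```go
package main

import "fmt"

// Section stands in for a parsed config subsection; when the YAML has no
// "archive" block, the corresponding pointer stays nil.
type Section struct{ values map[string][]string }

// GetStringSlice tolerates a nil receiver, turning a missing config
// section into an empty result instead of a SIGSEGV.
func (s *Section) GetStringSlice(key string) []string {
	if s == nil {
		return nil
	}
	return s.values[key]
}

func main() {
	var archive *Section // config file had no archive: section
	// Without the nil guard, this call would be the reported panic.
	fmt.Println(len(archive.GetStringSlice("includes")))
}
```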

tmp dir not cleaned when run with root permissions, and one job cannot store to multiple destinations

Amazon Linux AMI latest
gobackup -v 0.6.1

This is used to automatically back up Docker volumes; the directories mounted with docker -v are owned by root,
so perform is executed with sudo.
Afterwards, the temporary folder and tar archive under /tmp/gobackup are not cleaned up and have to be removed manually.

Also:
when one model is defined, only the last store entry seems to be recognized.
I have a local backup job and would like to upload to several different destinations at once, e.g. Aliyun OSS and AWS S3.
Splitting it into two models packs everything twice, producing two archives and two temp directories.
Combined with the missing cleanup under root above, this consumes a lot of disk space.

Is there a workaround? Thanks!

scp out-of-memory error

If the backup file is very large, scp fails with an out-of-memory error.

2017/10/11 09:14:56 -> Connecting...
2017/10/11 09:14:57 -> scp /Raid/psdbackup/nexus/2017.10.11.16.52.28.tar.gz
fatal error: runtime: out of memory

runtime stack:
runtime.throw(0xadea51, 0x16)
        /usr/local/go/src/runtime/panic.go:605 +0x95
runtime.sysMap(0xc820400000, 0x400000000, 0xc420001c00, 0xed1b18)
        /usr/local/go/src/runtime/mem_linux.go:216 +0x1d0
runtime.(*mheap).sysAlloc(0xeb88a0, 0x400000000, 0xc420037e20)
        /usr/local/go/src/runtime/malloc.go:470 +0xd7
runtime.(*mheap).grow(0xeb88a0, 0x200000, 0x0)
        /usr/local/go/src/runtime/mheap.go:887 +0x60
runtime.(*mheap).allocSpanLocked(0xeb88a0, 0x200000, 0xed1b28, 0x0)
        /usr/local/go/src/runtime/mheap.go:800 +0x334
runtime.(*mheap).alloc_m(0xeb88a0, 0x200000, 0xffffffffffff0101, 0xc420037ef0)
        /usr/local/go/src/runtime/mheap.go:666 +0x118
runtime.(*mheap).alloc.func1()

Feature Request: ExpandEnv

Background

To use gobackup with Kubernetes or other tools, we need to pass credentials dynamically via environment variables.

Currently, the yaml config file is static. We could leverage os.ExpandEnv to support environment variables.

gobackup perform succeeds when run manually, but the cron job reports that mysqldump cannot be found

[root@iZwz9ccgakmnlprso2kzxlZ ~]# crontab -l
0 0 * * * /usr/local/bin/gobackup perform >> ~/.gobackup/gobackup.log
[root@iZwz9ccgakmnlprso2kzxlZ ~]# cat ~/.gobackup/gobackup.log
2019/07/10 00:00:01 ======== mysql ========
2019/07/10 00:00:01 WorkDir: /tmp/gobackup/1562688001676054782/mysql

2019/07/10 00:00:01 ------------- Databases -------------
2019/07/10 00:00:01 => database | mysql : master
2019/07/10 00:00:01 -> Dumping MySQL...
2019/07/10 00:00:01 -> Dump error: mysqldump cannot be found
2019/07/10 00:00:01 Cleanup temp dir:/tmp/gobackup...

2019/07/10 00:00:01 ======= End mysql =======

[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysqldump
Usage: mysqldump [OPTIONS] database [tables]
OR mysqldump [OPTIONS] --databases [OPTIONS] DB1 [DB2 DB3...]
OR mysqldump [OPTIONS] --all-databases [OPTIONS]
For more options, use mysqldump --help
[root@iZwz9ccgakmnlprso2kzxlZ ~]#
[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysql --version
mysql Ver 15.1 Distrib 10.1.21-MariaDB, for Linux (x86_64) using readline 5.1
[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysqldump --version
mysqldump Ver 10.16 Distrib 10.1.21-MariaDB, for Linux (x86_64)
[root@iZwz9ccgakmnlprso2kzxlZ ~]#
[root@iZwz9ccgakmnlprso2kzxlZ ~]# gobackup --version
gobackup version 0.8.0
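One likely explanation (a guess, since the versions above look fine): cron runs jobs with a minimal PATH, often just /usr/bin:/bin, while an interactive shell picks up extra directories from its profile. If mysqldump lives outside cron's PATH, gobackup's lookup fails only under cron. Setting PATH at the top of the crontab is a common workaround; the directories below are examples:

```sh
# crontab -e
PATH=/usr/local/bin:/usr/local/mysql/bin:/usr/bin:/bin
0 0 * * * /usr/local/bin/gobackup perform >> ~/.gobackup/gobackup.log
```

Running `which mysqldump` in the interactive shell shows which directory needs to be on cron's PATH.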

/bin/tar: file changed as we read it

deploy@localhost:~/.gobackup$ gobackup perform
2017/10/12 15:16:45 ======== yizaoyiwan ========
2017/10/12 15:16:45 WorkDir: /tmp/gobackup/1507821405731845154/yizaoyiwan

2017/10/12 15:16:45 ------------- Databases -------------
2017/10/12 15:16:45 => database | PostgreSQL: yizaoyiwan
2017/10/12 15:16:45 -> Dumping PostgreSQL...
2017/10/12 15:16:46 dump path: /tmp/gobackup/1507821405731845154/yizaoyiwan/postgresql/yizaoyiwan/yizaoyiwan.sql
2017/10/12 15:16:46
2017/10/12 15:16:46 ------------- Databases -------------

2017/10/12 15:16:46 ------------- Archives -------------
2017/10/12 15:16:46 => includes 1 rules
2017/10/12 15:16:46 ------------- Archives -------------

2017/10/12 15:16:46 ------------ Compressor -------------
2017/10/12 15:16:46 => Compress | tgz
2017/10/12 15:16:46 [debug] /bin/tar zcf /tmp/gobackup/1507821405731845154/yizaoyiwan/2017.10.12.23.16.46.tar.gz yizaoyiwan
2017/10/12 15:16:46 /bin/tar: yizaoyiwan: file changed as we read it

2017/10/12 15:16:46 Cleanup temp dir...

2017/10/12 15:16:46 ======= End yizaoyiwan =======

2017/10/12 15:16:46 /bin/tar: yizaoyiwan: file changed as we read it

This error is shown and no backup file is created in the target folder.


  • gobackup version 0.5.0
  • config file same as in #7

mysql source fails after adding the additional_options parameter

Symptom

With additional_options added to the mysql source:

additional_options: "--single-transaction --quick"

the console prints: mysqldump: unknown variable 'single-transaction --quick'

Initial analysis

Reading the code, additional_options is appended to exec.Command as a single argument. Go's exec.Command performs no shell-style word splitting, so exec.Command("ls", "-l -a") must be written as exec.Command("ls", "-l", "-a").

Suggestion

Apply the spaceRegexp.Split(command, -1) logic to every arg, along the lines of: args = args.flat_map{|arg| spaceRegexp.Split(arg)}
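In Go terms, the suggestion amounts to flattening every configured argument through a whitespace split before building the exec.Command; strings.Fields does the splitting here (the regexp already in the codebase would work the same way):

```go
package main

import (
	"fmt"
	"strings"
)

// flattenArgs splits each configured argument on whitespace, so a YAML
// value like "--single-transaction --quick" becomes two argv entries.
func flattenArgs(args []string) []string {
	var out []string
	for _, a := range args {
		out = append(out, strings.Fields(a)...)
	}
	return out
}

func main() {
	args := []string{"-uroot", "--single-transaction --quick", "mydb"}
	// exec.Command("mysqldump", flattenArgs(args)...) would now receive
	// each option as its own argument.
	fmt.Println(flattenArgs(args))
}
```

Note that this naive split would break an option whose value legitimately contains spaces, so a real fix might want quote-aware splitting.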

Please consider using pigz (or gzip with multiple cores) and splitting the backup file

Thanks for the wonderful tool.
During compression, gobackup runs gzip on a single core only. Could this be made a configurable parameter, or could gzip be replaced with pigz to use multiple cores?

Also, after compression the backup is one huge tgz; could an option be added to split the backup file into chunks?
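For reference, the requested behavior can be approximated today outside gobackup with pigz and split (assuming both are installed; the path, core count, and chunk size are examples):

```sh
tar -cf - /data/nexus | pigz -p 8 > backup.tar.gz   # compress on 8 cores
split -b 1G backup.tar.gz backup.tar.gz.part-       # 1 GB chunks
# restore: cat backup.tar.gz.part-* | unpigz | tar -xf -
```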

Database socket connection

Please allow specifying a connection via a local Unix socket for the MySQL and Redis databases.
The user could then provide a 'socket' configuration option, used instead of 'host' and 'port' to connect.
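A hypothetical shape for such a config (the socket key does not exist yet; the path is an example):

```yaml
databases:
  my_app:
    type: mysql
    database: my_app
    # hypothetical option: when socket is set, host/port would be ignored
    socket: /var/run/mysqld/mysqld.sock
    username: root
```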

Temporary directories are not created recursively, so packaging fails

The temporary directory is not created recursively, which makes packaging fail. Details:

gobackup version 0.4.2
Config file:

models:
  nexus:
    store_with:
      type: scp
      host: 17.17.8.1
      path: /Raid/xx/nexus
      private_key: /home/xx/.ssh/id_rsa
      username: xx
    archive:
      includes:
        - /data/nexus

Output:

[root@psd008 gobackup]# gobackup perform
2017/10/10 11:59:23 ======== nexus ========
2017/10/10 11:59:23 WorkDir: /tmp/gobackup/1507636763978943088/nexus

2017/10/10 11:59:23 ------------- Databases -------------
2017/10/10 11:59:23 ------------- Databases -------------

2017/10/10 11:59:23 ------------- Archives -------------
2017/10/10 11:59:23 => includes 1 rules
2017/10/10 11:59:23 [debug] /bin/tar -cPf /tmp/gobackup/1507636763978943088/nexus/archive.tar /data/nexus
2017/10/10 11:59:23 ------------- Archives -------------

2017/10/10 11:59:23 ------------ Compressor -------------
2017/10/10 11:59:23 => Compress with Tgz...
2017/10/10 11:59:23 [debug] /bin/tar zcf /tmp/gobackup/2017.10.10.19.59.23.tar.gz nexus
2017/10/10 11:59:23 /bin/tar: nexus: Cannot stat: No such file or directory
/bin/tar: Exiting with failure status due to previous errors

2017/10/10 11:59:23 Cleanup temp dir...

2017/10/10 11:59:23 ======= End nexus =======

Unmarshal cycler.json failed:unexpected end of JSON input

When performing "gobackup perform" for the first time ever:

2018/05/19 23:15:15 ------------- Storage --------------
2018/05/19 23:15:15 => Storage | local
2018/05/19 23:15:15 Store successed /root/backups
2018/05/19 23:15:15 Unmarshal cycler.json failed:unexpected end of JSON input
2018/05/19 23:15:15 ------------- Storage --------------

The second time I run "gobackup perform", the error no longer appears, so I guess this is a low-priority bug.
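The first-run symptom suggests the state file exists but is zero bytes when it is first read. A tolerant loader that treats an empty cycler file as an empty collection would silence the error; the types here are illustrative, not gobackup's actual ones:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// loadPacks decodes the cycler state. A zero-byte file (the first-run
// case) is treated as "no packs recorded yet" rather than a JSON error.
func loadPacks(data []byte) ([]string, error) {
	if len(data) == 0 {
		return nil, nil
	}
	var packs []string
	err := json.Unmarshal(data, &packs)
	return packs, err
}

func main() {
	packs, err := loadPacks(nil) // simulates the empty first-run file
	fmt.Println(len(packs), err)

	packs, err = loadPacks([]byte(`["2018.05.19.tar.gz"]`))
	fmt.Println(len(packs), err)
}
```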

compile error on arm

Trying to compile on an ARM server, I got this error:

root@y2t-web:~/go/src/github.com/huacnlee/gobackup# go build

github.com/bramvdbogaerde/go-scp/auth

../../bramvdbogaerde/go-scp/auth/key.go:10: undefined: ssh.HostKeyCallback

Can gobackup exclude subdirectories?

Thanks to the gobackup author. Can gobackup exclude subdirectories? For example, include /home/wwwroot/test.cn but exclude /home/wwwroot/test.cn/upload. My configuration file follows:

 ## source code backup
  sourceBackup:
    # package the backup
    compress_with:
      # compression type
      type: tgz
    # storage
    store_with:
      # store to OSS
      ........
    # file backup
    archive:
      # files and directories to back up
      includes:
        - /home/wwwroot/test.cn
      excludes:
        - /home/wwwroot/test.cn/upload

ssh: must specify HostKeyCallback

This situation occurs when attempting to use scp storage, with user/password authentication (no private key):

2018/05/19 23:51:38 ------------- Storage --------------
2018/05/19 23:51:38 => Storage | scp
2018/05/19 23:51:38 PrivateKey /root/.ssh/id_rsa
2018/05/19 23:51:38 ssh: must specify HostKeyCallback

ssh: must specify HostKeyCallback
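This error comes from golang.org/x/crypto/ssh, which since a 2017 API change refuses a ClientConfig without HostKeyCallback; the password-auth path in the scp storage presumably never sets it. A fragment showing the required field (not runnable on its own; user and password are placeholders):

```go
config := &ssh.ClientConfig{
	User: "deploy",
	Auth: []ssh.AuthMethod{ssh.Password("secret")},
	// Mandatory since the x/crypto/ssh change. For production, prefer
	// knownhosts.New(...) over skipping host-key verification.
	HostKeyCallback: ssh.InsecureIgnoreHostKey(),
}
```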

unknown function: pg_catalog.set_config()

pg_dump: [archiver (db)] query failed: ERROR: unknown function: pg_catalog.set_config()
pg_dump: [archiver (db)] query was: SELECT pg_catalog.set_config('search_path', '', false)

This error occurred when I tried to back up a CockroachDB (PostgreSQL-compatible) database.


Here's my yml file:

models:
  localhost:
    compress_with:
      type: tgz
    store_with:
      type: local
      keep: 20
      path: /home/ernanie/backup/cockroach
    databases:
      bank:
        database: bank
        type: postgresql
        host: localhost
        port: 26257
        username: root

Automatic cleanup of locally cached backup archives, or rolling cleanup with a maximum file count

gobackup.yml config:

models:
  example:
    compress_with:
      type: tgz
    store_with:
      type: oss
      max_retries: 5
      timeout: 300
      threads: 1 (1 .. 100)
    archive:
      includes:
        - /home/git/.ssh/
        - /etc/nginx/nginx.conf
        - /etc/nginx/conf.d
      excludes:
        - /home/ubuntu/.ssh/known_hosts
        - /etc/logrotate.d/syslog

Symptom:

With a daily cron backup, gobackup writes a compressed copy of each day's backup under /tmp/gobackup. I have a retention policy on the object store, but the local archives keep accumulating and must be cleaned up by hand.

Request:

  1. Add an option to delete the local file once it has been uploaded to object storage.
  2. Support a maximum number of locally cached files, with rolling cleanup.

Aliyun OSS: a path with slashes creates an empty directory

If path contains slashes, an empty "/" directory is created in Aliyun OSS that cannot be deleted. I'm not sure which side the bug is on; the Aliyun support engineer was puzzled too.

After some experimenting: setting path: /xxx/ triggers the problem, while path: xxx works fine.

On the Aliyun side, opening the console at
https://oss.console.aliyun.com/bucket/oss-cn-hangzhou/aaa-backup/
shows a slash directory; clicking it loops endlessly, and deleting it fails.

After editing the URL to:
https://oss.console.aliyun.com/bucket/oss-cn-hangzhou/aaa-backup/object?path=/
the object becomes visible, but it still cannot be deleted or downloaded.

gobackup not working with read-only home directory

I'm trying to run gobackup on a webhost with limited permissions. Unfortunately, I don't have write permissions for the whole home directory (~), only for a sub-directory.

gobackup tries to write to a file called ~/.gobackup/cycler/.json, but fails and exits. Here is the log:

2018/11/23 19:21:19 ------------- Databases -------------
2018/11/23 19:21:19 => database | mysql : wordpress
2018/11/23 19:21:19 -> Dumping MySQL...
2018/11/23 19:21:24 dump path: /tmp/gobackup/1542997279119109937/wordpress/mysql/wordpress
2018/11/23 19:21:24
2018/11/23 19:21:24 ------------- Databases -------------

2018/11/23 19:21:24 ------------- Archives -------------
2018/11/23 19:21:24 => includes 1 rules
2018/11/23 19:21:24 ------------- Archives -------------

2018/11/23 19:21:24 ------------ Compressor -------------
2018/11/23 19:21:24 => Compress | tgz
2018/11/23 19:21:27 -> /tmp/gobackup/2018.11.23.19.21.24.tar.gz
2018/11/23 19:21:27 ------------ Compressor -------------

2018/11/23 19:21:27 ------------- Storage --------------
2018/11/23 19:21:27 => Storage | scp
2018/11/23 19:21:27 PrivateKey /home/www/p455475/.ssh/id_rsa
2018/11/23 19:21:27 -> scp /2018.11.23.19.21.24.tar.gz
2018/11/23 19:21:27 Store successed
2018/11/23 19:21:27 Load cycler.json failed:open /home/www/p455475/.gobackup/cycler/wordpress.json: no such file or directory
2018/11/23 19:21:27 Skip save cycler.json because it not loaded
2018/11/23 19:21:27 ------------- Storage --------------

2018/11/23 19:21:27 Cleanup temp dir:/tmp/gobackup...

2018/11/23 19:21:27 ======= End wordpress =======

Also, it says storing the file via SCP succeeded, but that isn't actually true: no file is created.

Support restoring

Backup is useless if you can't conveniently restore from the archives. I know it's doable by invoking some shell commands to un-tar the archive and import the database dumps, but that is not very intuitive, let alone that there's no documentation for it at all.

I'd suggest adding a restore subcommand to allow users to selectively restore parts of the data.
Another option would be to generate restore scripts (generic shell scripts) when perform finishes.

Aliyun: the same access_id and secret work with carrierwave-aliyun but fail with gobackup

2019/01/01 16:18:52 oss: service returned error: StatusCode=400, ErrorCode=InvalidArgument, ErrorMessage=Authorization header is invalid., RequestId=5C0B226C2FF16CF75093E762

gobackup.yml:

# gobackup config example
# -----------------------
# Put this file in follow place:
# ~/.gobackup/gobackup.yml or /etc/gobackup/gobackup.yml
models:
  yfxs:
    compress_with:
      type: tgz
    store_with:
      type: oss
      keep: 10
      endpoint: oss-cn-hangzhou-internal.aliyuncs.com
      bucket: aaa
      accessKeyID: aaa
      accessKeySecret: aaa
      path: /aaa
    databases:
      aaa_production:
        type: postgresql
        host: localhost
        port: 5432
        database: aaa
        username: aaa
        password: aaa
    archive:
      includes:
        - /application/app/aaa/shared/etc/nginx.conf
        - /etc/nginx/nginx.conf
        - /etc/redis/redis.conf

Could you add some build instructions?

Hi, I'm not very familiar with Go. I downloaded the source and tried to build it directly, but got the following error. Could you provide some build instructions? Thanks.

Gobackup not deleting temporary files

Looks like gobackup saves files at e.g. /tmp/gobackup/2020.09.30.18.01.06.tar.gz, but only deletes /tmp/gobackup/1601481601969223442

As a workaround, I've changed my cronjob to /usr/local/bin/gobackup perform >> /var/log/gobackup.log && rm -rf /tmp/gobackup (note the rm command at the end.)

/tmp/gobackup/ ls -la
total 236273360
drwxr-xr-x  2 root root        4096 Sep 30 18:31 .
drwxrwxrwt 12 root root        4096 Sep 30 06:31 ..
-rw-r--r--  1 root root 19797375857 Sep 27 20:13 2020.09.27.19.53.24.tar.gz
-rw-r--r--  1 root root 19832908870 Sep 28 00:20 2020.09.28.00.01.01.tar.gz
-rw-r--r--  1 root root 19872942935 Sep 28 06:20 2020.09.28.06.00.57.tar.gz
-rw-r--r--  1 root root 19961177560 Sep 28 12:22 2020.09.28.12.01.14.tar.gz
-rw-r--r--  1 root root 20056435330 Sep 28 18:21 2020.09.28.18.01.08.tar.gz
-rw-r--r--  1 root root 20233341709 Sep 29 00:21 2020.09.29.00.01.03.tar.gz
-rw-r--r--  1 root root 20292021283 Sep 29 06:22 2020.09.29.06.01.02.tar.gz
-rw-r--r--  1 root root 20317139957 Sep 29 12:21 2020.09.29.12.01.07.tar.gz
-rw-r--r--  1 root root 20685865291 Sep 30 00:21 2020.09.30.00.01.10.tar.gz
-rw-r--r--  1 root root 20742438814 Sep 30 06:25 2020.09.30.06.03.45.tar.gz
-rw-r--r--  1 root root 20947585039 Sep 30 12:21 2020.09.30.12.01.11.tar.gz
-rw-r--r--  1 root root 19204609375 Sep 30 18:20 2020.09.30.18.01.06.tar.gz
2020/09/30 18:00:01 ======== main ========
2020/09/30 18:00:01 WorkDir: /tmp/gobackup/1601481601969223442/main

2020/09/30 18:00:01 ------------- Databases -------------
2020/09/30 18:00:01 => database | redis : redis
2020/09/30 18:00:01 -> Invoke save...
2020/09/30 18:00:01 Copying redis dump to /tmp/gobackup/1601481601969223442/main/redis/redis
2020/09/30 18:01:06
2020/09/30 18:01:06 ------------- Databases -------------

2020/09/30 18:01:06 ------------ Compressor -------------
2020/09/30 18:01:06 => Compress | tgz
2020/09/30 18:20:52 -> /tmp/gobackup/2020.09.30.18.01.06.tar.gz
2020/09/30 18:20:52 ------------ Compressor -------------

2020/09/30 18:20:52 ------------- Storage --------------
2020/09/30 18:20:52 => Storage | s3
2020/09/30 18:20:52 -> S3 Uploading...
2020/09/30 18:23:46 => https://censored.s3.eu-central-1.amazonaws.com/backup%2F2020.09.30.18.01.06.tar.gz
2020/09/30 18:23:46 ------------- Storage --------------

2020/09/30 18:23:46 Cleanup temp dir:/tmp/gobackup/1601481601969223442...
models:
  main:
    databases:
      redis:
        type: redis
        mode: copy
        rdb_path: /var/lib/keydb/dump.rdb
        invoke_save: false
    compress_with:
      type: tgz
    store_with:
      type: s3
      keep: 20
      bucket: censored
      region: eu-central-1
      path: backup
      access_key_id: censored
      secret_access_key: censored

Upload speed limit

For small websites, people often use a VPS with low network bandwidth (~2–5 Mbps).

If gobackup doesn't limit the upload speed, the backup saturates the network and the website becomes unreachable while it runs.
