gobackup / gobackup
🗄 CLI tool for backing up your databases and files to cloud storage on a schedule.
Home Page: https://gobackup.github.io
License: MIT License
Need the ability to set the config path manually, like gobackup --config path_to_config/gobackup.yml
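A minimal sketch of such a flag using Go's standard flag package — gobackup's actual CLI wiring may differ, and the lookup order in defaultConfigPath is an assumption:

```go
package main

import (
	"flag"
	"os"
)

// defaultConfigPath mirrors the current behavior of looking in the
// user's home directory; the exact search order here is an assumption.
func defaultConfigPath() string {
	home, err := os.UserHomeDir()
	if err != nil {
		return "/etc/gobackup/gobackup.yml"
	}
	return home + "/.gobackup/gobackup.yml"
}

// parseConfigFlag reads an optional --config flag from args and
// falls back to the default path when the flag is absent.
func parseConfigFlag(args []string) (string, error) {
	fs := flag.NewFlagSet("gobackup", flag.ContinueOnError)
	config := fs.String("config", defaultConfigPath(), "path to gobackup.yml")
	if err := fs.Parse(args); err != nil {
		return "", err
	}
	return *config, nil
}
```

The parsed path would then be handed to the existing config loader instead of the hard-coded default.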
Hi, do you plan to support Docker image? For example, adding a Dockerfile
and publish it to hub.docker.com?
mongodump prompts for a password on the CLI and fails whenever the password is not provided.
Files larger than 5 GB fail to upload. Is S3 multipart upload not supported?
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x8 pc=0x76378a]
gobackup/archive/archive.go:16
includes := model.Archive.GetStringSlice("includes")
Amazon Linux AMI latest
gobackup -v 0.6.1
I use gobackup to automatically back up Docker storage; the directories mounted with docker -v are owned by root,
so perform is run with sudo.
After it finishes, the temp folder and the tar package under /tmp/gobackup are not cleaned up and have to be removed manually.
Also:
when a model is given several stores, only the last store seems to be recognized.
I have a local backup task and would like to upload to several different destinations at once, e.g. Aliyun OSS and AWS S3.
If I split it into two models, everything is packed twice, producing two archives and two temp directories.
Combined with the missing cleanup under root described above, this consumes a lot of disk space.
Is there any way around this? Thanks!
good job!
We can run it on Windows 10 and Windows 7.
Tested:
1. MySQL backup and file backup to OSS
2. OpenSSL aes-256-cbc encryption
Notes:
1. Windows 7 needs tar installed (http://gnuwin32.sourceforge.net/packages/libarchive.htm); Windows 10 does not.
2. Both Windows 7 and Windows 10 need OpenSSL (http://gnuwin32.sourceforge.net/packages/openssl.htm) if you want aes-256-cbc encryption.
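The encryption step can be exercised directly with the openssl CLI. The exact flags gobackup passes are not shown in the report, so this is only a round-trip sketch of aes-256-cbc with passphrase-based key derivation:

```shell
# Create a small sample archive to encrypt (stand-in for a real backup).
printf 'sample data' > sample.txt
tar -czf backup.tar.gz sample.txt

# Encrypt with aes-256-cbc; the -pbkdf2 key-derivation flag is our
# choice here, not necessarily what gobackup uses.
openssl enc -aes-256-cbc -pbkdf2 -pass pass:secret \
  -in backup.tar.gz -out backup.tar.gz.enc

# Decrypt again with the same passphrase and flags.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:secret \
  -in backup.tar.gz.enc -out restored.tar.gz
```

On Windows this requires the GnuWin32 OpenSSL build mentioned above to be on PATH.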
If the backup file is very large, scp fails with an out-of-memory error.
2017/10/11 09:14:56 -> Connecting...
2017/10/11 09:14:57 -> scp /Raid/psdbackup/nexus/2017.10.11.16.52.28.tar.gz
fatal error: runtime: out of memory
runtime stack:
runtime.throw(0xadea51, 0x16)
/usr/local/go/src/runtime/panic.go:605 +0x95
runtime.sysMap(0xc820400000, 0x400000000, 0xc420001c00, 0xed1b18)
/usr/local/go/src/runtime/mem_linux.go:216 +0x1d0
runtime.(*mheap).sysAlloc(0xeb88a0, 0x400000000, 0xc420037e20)
/usr/local/go/src/runtime/malloc.go:470 +0xd7
runtime.(*mheap).grow(0xeb88a0, 0x200000, 0x0)
/usr/local/go/src/runtime/mheap.go:887 +0x60
runtime.(*mheap).allocSpanLocked(0xeb88a0, 0x200000, 0xed1b28, 0x0)
/usr/local/go/src/runtime/mheap.go:800 +0x334
runtime.(*mheap).alloc_m(0xeb88a0, 0x200000, 0xffffffffffff0101, 0xc420037ef0)
/usr/local/go/src/runtime/mheap.go:666 +0x118
runtime.(*mheap).alloc.func1()
To use gobackup with Kubernetes or other tools, we need to pass credentials dynamically via environment variables. Currently the YAML config file is static. We could leverage os.ExpandEnv to add support for environment variables.
The printed log timestamps default to UTC (zone 0).
The current system is in UTC+8: a run at 03:00 local time shows up in the log as 19:00.
[root@iZwz9ccgakmnlprso2kzxlZ ~]# crontab -l
0 0 * * * /usr/local/bin/gobackup perform >> ~/.gobackup/gobackup.log
[root@iZwz9ccgakmnlprso2kzxlZ ~]# cat ~/.gobackup/gobackup.log
2019/07/10 00:00:01 ======== mysql ========
2019/07/10 00:00:01 WorkDir: /tmp/gobackup/1562688001676054782/mysql
2019/07/10 00:00:01 ------------- Databases -------------
2019/07/10 00:00:01 => database | mysql : master
2019/07/10 00:00:01 -> Dumping MySQL...
2019/07/10 00:00:01 -> Dump error: mysqldump cannot be found
2019/07/10 00:00:01 Cleanup temp dir:/tmp/gobackup...
2019/07/10 00:00:01 ======= End mysql =======
[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysqldump
Usage: mysqldump [OPTIONS] database [tables]
OR mysqldump [OPTIONS] --databases [OPTIONS] DB1 [DB2 DB3...]
OR mysqldump [OPTIONS] --all-databases [OPTIONS]
For more options, use mysqldump --help
[root@iZwz9ccgakmnlprso2kzxlZ ~]#
[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysql --version
mysql Ver 15.1 Distrib 10.1.21-MariaDB, for Linux (x86_64) using readline 5.1
[root@iZwz9ccgakmnlprso2kzxlZ ~]# mysqldump --version
mysqldump Ver 10.16 Distrib 10.1.21-MariaDB, for Linux (x86_64)
[root@iZwz9ccgakmnlprso2kzxlZ ~]#
[root@iZwz9ccgakmnlprso2kzxlZ ~]# gobackup --version
gobackup version 0.8.0
deploy@localhost:~/.gobackup$ gobackup perform
2017/10/12 15:16:45 ======== yizaoyiwan ========
2017/10/12 15:16:45 WorkDir: /tmp/gobackup/1507821405731845154/yizaoyiwan
2017/10/12 15:16:45 ------------- Databases -------------
2017/10/12 15:16:45 => database | PostgreSQL: yizaoyiwan
2017/10/12 15:16:45 -> Dumping PostgreSQL...
2017/10/12 15:16:46 dump path: /tmp/gobackup/1507821405731845154/yizaoyiwan/postgresql/yizaoyiwan/yizaoyiwan.sql
2017/10/12 15:16:46
2017/10/12 15:16:46 ------------- Databases -------------
2017/10/12 15:16:46 ------------- Archives -------------
2017/10/12 15:16:46 => includes 1 rules
2017/10/12 15:16:46 ------------- Archives -------------
2017/10/12 15:16:46 ------------ Compressor -------------
2017/10/12 15:16:46 => Compress | tgz
2017/10/12 15:16:46 [debug] /bin/tar zcf /tmp/gobackup/1507821405731845154/yizaoyiwan/2017.10.12.23.16.46.tar.gz yizaoyiwan
2017/10/12 15:16:46 /bin/tar: yizaoyiwan: file changed as we read it
2017/10/12 15:16:46 Cleanup temp dir...
2017/10/12 15:16:46 ======= End yizaoyiwan =======
2017/10/12 15:16:46 /bin/tar: yizaoyiwan: file changed as we read it
Show this error and no backup file created in target folder.
With mysql configured with additional_options:
additional_options: "--single-transaction --quick"
the console prints: mysqldump: unknown variable 'single-transaction --quick'
Looking at the code, additional_options is added to exec.Command as one single arg.
Go's exec.Command does not split arguments containing spaces: exec.Command("ls", "-l -a")
must be written as exec.Command("ls", "-l", "-a").
The spaceRegexp.Split(command, -1) logic should be applied to every arg,
roughly: args = args.flat_map{|arg| spaceRegexp.Split(arg)}
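The arg-splitting fix described above can be sketched in Go as follows (flattenArgs is a hypothetical helper name, not gobackup's actual function):

```go
package main

import "regexp"

var spaceRegexp = regexp.MustCompile(`\s+`)

// flattenArgs splits every argument on whitespace, so a single config
// value like "--single-transaction --quick" becomes two separate args,
// as exec.Command requires.
func flattenArgs(args []string) []string {
	var out []string
	for _, arg := range args {
		for _, part := range spaceRegexp.Split(arg, -1) {
			if part != "" {
				out = append(out, part)
			}
		}
	}
	return out
}
```

The dump step would then call exec.Command("mysqldump", flattenArgs(opts)...) so every flag reaches mysqldump separately.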
Thanks for the wonderful tool.
During compression, gobackup only uses a single core to run gzip. Is it possible to make this a configurable parameter in the configuration file,
or to replace gzip with pigz so that multiple cores are used to compress the backup file?
Also, after compression the backup is one huge tgz; is it possible to add an option to split the backup file?
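Both ideas can be tried by hand today with GNU tar and split; a sketch where pigz is assumed to be installed, with gzip as a drop-in single-core fallback:

```shell
# Pick a compressor: pigz uses all cores; gzip is the fallback.
COMPRESSOR=$(command -v pigz || command -v gzip)

# Create a sample directory to stand in for the backup payload.
mkdir -p payload && printf 'data' > payload/file.txt

# Compress with the chosen program, then split into fixed-size chunks
# (1M here for the demo; something like 2G would suit real backups).
tar --use-compress-program="$COMPRESSOR" -cf backup.tar.gz payload
split -b 1M backup.tar.gz backup.tar.gz.part-

# Reassembly is plain concatenation of the parts.
cat backup.tar.gz.part-* > reassembled.tar.gz
```

Since pigz output is gzip-compatible, the reassembled archive extracts with ordinary tar -xzf.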
Please allow specifying a connection to a local Unix socket for the MySQL and Redis databases,
so the user can provide a 'socket' configuration option that is used instead of 'host' and 'port' to connect to the database.
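A hedged sketch of what that could look like in the config — the socket key does not exist yet, and the paths are only examples:

```yaml
databases:
  mydb:
    type: mysql
    database: mydb
    # Proposed: connect via a local socket instead of host/port.
    socket: /var/run/mysqld/mysqld.sock
  cache:
    type: redis
    socket: /var/run/redis/redis.sock
```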
solved
# ./gobackup
ash: ./gobackup: not found
The temp directory is not created recursively, which makes packaging fail. Details:
gobackup version 0.4.2
Config file:
models:
  nexus:
    store_with:
      type: scp
      host: 17.17.8.1
      path: /Raid/xx/nexus
      private_key: /home/xx/.ssh/id_rsa
      username: xx
    archive:
      includes:
        - /data/nexus
Execution result:
[root@psd008 gobackup]# gobackup perform
2017/10/10 11:59:23 ======== nexus ========
2017/10/10 11:59:23 WorkDir: /tmp/gobackup/1507636763978943088/nexus
2017/10/10 11:59:23 ------------- Databases -------------
2017/10/10 11:59:23 ------------- Databases -------------
2017/10/10 11:59:23 ------------- Archives -------------
2017/10/10 11:59:23 => includes 1 rules
2017/10/10 11:59:23 [debug] /bin/tar -cPf /tmp/gobackup/1507636763978943088/nexus/archive.tar /data/nexus
2017/10/10 11:59:23 ------------- Archives -------------
2017/10/10 11:59:23 ------------ Compressor -------------
2017/10/10 11:59:23 => Compress with Tgz...
2017/10/10 11:59:23 [debug] /bin/tar zcf /tmp/gobackup/2017.10.10.19.59.23.tar.gz nexus
2017/10/10 11:59:23 /bin/tar: nexus: Cannot stat: No such file or directory
/bin/tar: Exiting with failure status due to previous errors
2017/10/10 11:59:23 Cleanup temp dir...
2017/10/10 11:59:23 ======= End nexus =======
When performing "gobackup perform" for the first time ever:
2018/05/19 23:15:15 ------------- Storage --------------
2018/05/19 23:15:15 => Storage | local
2018/05/19 23:15:15 Store successed /root/backups
2018/05/19 23:15:15 Unmarshal cycler.json failed:unexpected end of JSON input
2018/05/19 23:15:15 ------------- Storage --------------
The second time I run "gobackup perform" it no longer throws this error, so I guess this is a low-priority bug.
Trying to compile on an ARM server, I got this error:
root@y2t-web:~/go/src/github.com/huacnlee/gobackup# go build
../../bramvdbogaerde/go-scp/auth/key.go:10: undefined: ssh.HostKeyCallback
Update install script from '0.7.1' to '0.7.2'
Thanks to the gobackup author. Can gobackup exclude subdirectories? For example, include the path /home/wwwroot/test.cn
but exclude the path /home/wwwroot/test.cn/upload.
The following is my configuration file:
## Source code backup
sourceBackup:
  # Pack the backup
  compress_with:
    # Compression type
    type: tgz
  # Storage
  store_with:
    # Store to OSS
    ........
  # File backup
  archive:
    # Files and folders to back up
    includes:
      - /home/wwwroot/test.cn
    excludes:
      - /home/wwwroot/test.cn/upload
This situation occurs when attempting to use scp storage, with user/password authentication (no private key):
2018/05/19 23:51:38 ------------- Storage --------------
2018/05/19 23:51:38 => Storage | scp
2018/05/19 23:51:38 PrivateKey /root/.ssh/id_rsa
2018/05/19 23:51:38 ssh: must specify HostKeyCallback
ssh: must specify HostKeyCallback
When running gobackup perform --model model1; gobackup perform --model model2 on the same host, whichever task finishes first deletes the /tmp/gobackup directory that the other task is still using, so the other backup fails. This happens, for example, when crontab schedules several jobs at the same time.
Will Backblaze storage be supported in the future? It is very cheap.
pg_dump: [archiver (db)] query failed: ERROR: unknown function: pg_catalog.set_config()
pg_dump: [archiver (db)] query was: SELECT pg_catalog.set_config('search_path', '', false)
An error occurred when I tried to back up a CockroachDB (postgresql) database.
Here's my yml file:
models:
  localhost:
    compress_with:
      type: tgz
    store_with:
      type: local
      keep: 20
      path: /home/ernanie/backup/cockroach
    databases:
      bank:
        database: bank
        type: postgresql
        host: localhost
        port: 26257
        username: root
models:
  example:
    compress_with:
      type: tgz
    store_with:
      type: oss
      max_retries: 5
      timeout: 300
      threads: 1  # (1 .. 100)
    archive:
      includes:
        - /home/git/.ssh/
        - /etc/nginx/nginx.conf
        - /etc/nginx/conf.d
      excludes:
        - /home/ubuntu/.ssh/known_hosts
        - /etc/logrotate.d/syslog
With a cronjob set up for daily backups, gobackup leaves the compressed file for each day's backup in /tmp/gobackup. I configured a retention policy on the object storage, but the local compressed files keep accumulating and have to be cleaned up manually.
Can't upload files to FTP with TLS, any solution?
If path has a slash, an empty slash-named directory is created on Aliyun OSS that cannot be deleted. I don't know which side the problem is on; the Aliyun ticket engineers were baffled too.
After some digging I found that setting the Aliyun storage path: /xxx/
causes the problem, while setting path: xxx
works fine.
On the Aliyun side:
opening the console at
https://oss.console.aliyun.com/bucket/oss-cn-hangzhou/aaa-backup/
shows
a slash directory; clicking it loops forever, and it cannot be deleted....
Tweaking the URL as follows:
https://oss.console.aliyun.com/bucket/oss-cn-hangzhou/aaa-backup/object?path=/
makes it visible, but it still cannot be deleted or downloaded.
Some OSS cloud services support the S3 protocol with ForcePathStyle, e.g. Qiniu.
I'm trying to run gobackup on a webhost with limited permissions. Unfortunately, I don't have write permissions for the whole home directory (~), only for a sub-directory.
gobackup tries to write to a file called ~/.gobackup/cycler/.json, fails, and exits. Here is the log:
2018/11/23 19:21:19 ------------- Databases -------------
2018/11/23 19:21:19 => database | mysql : wordpress
2018/11/23 19:21:19 -> Dumping MySQL...
2018/11/23 19:21:24 dump path: /tmp/gobackup/1542997279119109937/wordpress/mysql/wordpress
2018/11/23 19:21:24
2018/11/23 19:21:24 ------------- Databases -------------
2018/11/23 19:21:24 ------------- Archives -------------
2018/11/23 19:21:24 => includes 1 rules
2018/11/23 19:21:24 ------------- Archives -------------
2018/11/23 19:21:24 ------------ Compressor -------------
2018/11/23 19:21:24 => Compress | tgz
2018/11/23 19:21:27 -> /tmp/gobackup/2018.11.23.19.21.24.tar.gz
2018/11/23 19:21:27 ------------ Compressor -------------
2018/11/23 19:21:27 ------------- Storage --------------
2018/11/23 19:21:27 => Storage | scp
2018/11/23 19:21:27 PrivateKey /home/www/p455475/.ssh/id_rsa
2018/11/23 19:21:27 -> scp /2018.11.23.19.21.24.tar.gz
2018/11/23 19:21:27 Store successed
2018/11/23 19:21:27 Load cycler.json failed:open /home/www/p455475/.gobackup/cycler/wordpress.json: no such file or directory
2018/11/23 19:21:27 Skip save cycler.json because it not loaded
2018/11/23 19:21:27 ------------- Storage --------------
2018/11/23 19:21:27 Cleanup temp dir:/tmp/gobackup...
2018/11/23 19:21:27 ======= End wordpress =======
Also, it says storing the file via SCP succeeded, but that's not actually true; no file is created.
A backup is useless if you can't conveniently restore from the archives. I know it's doable by invoking some shell commands to un-tar the archive and import the database dumps, but that is not very intuitive, let alone documented.
I'd suggest adding a restore subcommand to allow users to selectively restore parts of the data.
Another solution would be to generate restore scripts (generic shell scripts) when perform finishes.
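The manual route looks roughly like this; the archive layout, paths, and database name below are illustrative only, and gobackup does not generate such a script today:

```shell
# Build a stand-in backup archive shaped like gobackup's output:
# a database dump plus archived files inside one tgz.
mkdir -p model/mysql && printf -- '-- dump' > model/mysql/app.sql
tar -czf 2020.01.01.tar.gz model

# Restoring is the reverse: extract, then feed the dump to the server.
mkdir -p restore && tar -xzf 2020.01.01.tar.gz -C restore
# mysql app < restore/model/mysql/app.sql   # run against a live server
```

A generated restore script would mostly be these two steps with the real paths filled in from the model's config.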
This is my configuration file.
models:
  yizaoyiwan:
    compress_with:
      type: tgz
    store_with:
      type: local
      path: /home/deploy/.gobackup/backups
    databases:
      yizaoyiwan:
        type: postgresql
        database: yizaoyiwan
        host: localhost
        username: postgres
        password: xxxx
    archive:
      includes:
        - /home/deploy/.bashrc
If I manually set PGPASSWORD in my ~/.bashrc, it works and the backup proceeds.
https://github.com/huacnlee/gobackup/blob/master/database/postgresql.go#L77
2019/01/01 16:18:52 oss: service returned error: StatusCode=400, ErrorCode=InvalidArgument, ErrorMessage=Authorization header is invalid., RequestId=5C0B226C2FF16CF75093E762
gobackup.yml:
# gobackup config example
# -----------------------
# Put this file in one of the following places:
# ~/.gobackup/gobackup.yml or /etc/gobackup/gobackup.yml
models:
  yfxs:
    compress_with:
      type: tgz
    store_with:
      type: oss
      keep: 10
      endpoint: oss-cn-hangzhou-internal.aliyuncs.com
      bucket: aaa
      accessKeyID: aaa
      accessKeySecret: aaa
      path: /aaa
    databases:
      aaa_production:
        type: postgresql
        host: localhost
        port: 5432
        database: aaa
        username: aaa
        password: aaa
    archive:
      includes:
        - /application/app/aaa/shared/etc/nginx.conf
        - /etc/nginx/nginx.conf
        - /etc/redis/redis.conf
I want to back up MySQL, and the yml config only allows one store_with.
How can I dump once and save to both local and s3?
I don't want to define two models in the config file.
Thank you.
Hello, I'm not very familiar with Go. I downloaded the source and compiled it directly, and got the following error. Could you provide some build instructions? Thanks.
Looks like gobackup saves files at e.g. /tmp/gobackup/2020.09.30.18.01.06.tar.gz, but only deletes /tmp/gobackup/1601481601969223442.
As a workaround, I've changed my cronjob to /usr/local/bin/gobackup perform >> /var/log/gobackup.log && rm -rf /tmp/gobackup (note the rm command at the end).
ls -la /tmp/gobackup/
total 236273360
drwxr-xr-x 2 root root 4096 Sep 30 18:31 .
drwxrwxrwt 12 root root 4096 Sep 30 06:31 ..
-rw-r--r-- 1 root root 19797375857 Sep 27 20:13 2020.09.27.19.53.24.tar.gz
-rw-r--r-- 1 root root 19832908870 Sep 28 00:20 2020.09.28.00.01.01.tar.gz
-rw-r--r-- 1 root root 19872942935 Sep 28 06:20 2020.09.28.06.00.57.tar.gz
-rw-r--r-- 1 root root 19961177560 Sep 28 12:22 2020.09.28.12.01.14.tar.gz
-rw-r--r-- 1 root root 20056435330 Sep 28 18:21 2020.09.28.18.01.08.tar.gz
-rw-r--r-- 1 root root 20233341709 Sep 29 00:21 2020.09.29.00.01.03.tar.gz
-rw-r--r-- 1 root root 20292021283 Sep 29 06:22 2020.09.29.06.01.02.tar.gz
-rw-r--r-- 1 root root 20317139957 Sep 29 12:21 2020.09.29.12.01.07.tar.gz
-rw-r--r-- 1 root root 20685865291 Sep 30 00:21 2020.09.30.00.01.10.tar.gz
-rw-r--r-- 1 root root 20742438814 Sep 30 06:25 2020.09.30.06.03.45.tar.gz
-rw-r--r-- 1 root root 20947585039 Sep 30 12:21 2020.09.30.12.01.11.tar.gz
-rw-r--r-- 1 root root 19204609375 Sep 30 18:20 2020.09.30.18.01.06.tar.gz
2020/09/30 18:00:01 ======== main ========
2020/09/30 18:00:01 WorkDir: /tmp/gobackup/1601481601969223442/main
2020/09/30 18:00:01 ------------- Databases -------------
2020/09/30 18:00:01 => database | redis : redis
2020/09/30 18:00:01 -> Invoke save...
2020/09/30 18:00:01 Copying redis dump to /tmp/gobackup/1601481601969223442/main/redis/redis
2020/09/30 18:01:06
2020/09/30 18:01:06 ------------- Databases -------------
2020/09/30 18:01:06 ------------ Compressor -------------
2020/09/30 18:01:06 => Compress | tgz
2020/09/30 18:20:52 -> /tmp/gobackup/2020.09.30.18.01.06.tar.gz
2020/09/30 18:20:52 ------------ Compressor -------------
2020/09/30 18:20:52 ------------- Storage --------------
2020/09/30 18:20:52 => Storage | s3
2020/09/30 18:20:52 -> S3 Uploading...
2020/09/30 18:23:46 => https://censored.s3.eu-central-1.amazonaws.com/backup%2F2020.09.30.18.01.06.tar.gz
2020/09/30 18:23:46 ------------- Storage --------------
2020/09/30 18:23:46 Cleanup temp dir:/tmp/gobackup/1601481601969223442...
models:
  main:
    databases:
      redis:
        type: redis
        mode: copy
        rdb_path: /var/lib/keydb/dump.rdb
        invoke_save: false
    compress_with:
      type: tgz
    store_with:
      type: s3
      keep: 20
      bucket: censored
      region: eu-central-1
      path: backup
      access_key_id: censored
      secret_access_key: censored
For small-website cases, users may be on a VPS with low network bandwidth (~2 - 5 Mbps).
If gobackup doesn't limit the upload speed, the backup will saturate the network and the website will stop responding.