
lrbkup's Introduction

Adobe Lightroom Backup

Warning: This project is still very much in development. It is not intended for public use in its current state, nor is that a goal. It is public in the hope that someone might find the majority of the code useful and tailor it to their own needs. For a full list of known issues, bugs, or incomplete features, please check the issues page.

This is a script that helps me back up my Lightroom catalogs and photos to an external hard drive and to AWS S3, plus more. It originally started because of a data loss from a hard drive, combined with my laziness about running aws and/or rsync commands and double-checking everything myself. With this script the goal is to keep no fewer than 2 copies of the data and minimize how much I store on my local hard drive. That way, if there is a data loss on any one device, there is another backup somewhere. It also helps with the problem of filling up any one device, which eventually happens. Mostly I just automated my typical actions around Adobe LR.

Why not write an Adobe LR plug-in?

This is a great question and it still may happen in the future. I actually started a small sample project for this (no repo for it yet). I stopped when I ran into one issue that I'm still trying to figure out: where to put the plug-in and how the user would interact with it. Should it be a (what Adobe calls) Module at the top? Something that goes under the "Backup" button? Just a little window in the side toolbar under a specific view? If you have any thoughts or suggestions on where to put this, that would be super helpful. I welcome advice and feedback to improve this project.

It's not perfect

Yes, I know. There are steps you have to do manually within Lightroom itself, which is why I am still trying to figure out how to do this as a LR plug-in. In the meantime, I have put in plenty of alerts reminding you to take those actions until I find a better solution. Mostly this helps me get up and going quickly and ensures that I do not suffer a data loss again.

Things that work

  • Backup local hard drive to external hard drive
  • Backup external hard drive to AWS S3
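
For reference, these two working paths boil down to roughly the following. This is a minimal sketch, not the script itself; the drive name, folder layout, and bucket are placeholders, not lrbkup's real values.

# Hypothetical locations -- adjust to your own setup.
LOCAL_DIR="$HOME/Pictures/Lightroom"
EXTERNAL_DIR="/Volumes/MyBackupDrive/Lightroom"
S3_BUCKET="s3://my-lightroom-backup"

# 1) Local hard drive -> external hard drive
rsync -avh --progress "$LOCAL_DIR/" "$EXTERNAL_DIR/"

# 2) External hard drive -> AWS S3
aws s3 sync "$EXTERNAL_DIR/" "$S3_BUCKET/"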

Things that still need work

This is just a short list of the main features that I'm still working on. Please check the issues page for a full list of features and other bugs I'm working through.

  • Freeing up local hard drive space
  • Pulling S3 images down to the external hard drive
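
For the S3-to-external direction, the eventual shape will likely be the reverse of the upload sync above. A rough sketch; the bucket, date prefix, and paths are placeholders:

# Pull one day's worth of photos from S3 back onto the external drive.
aws s3 sync "s3://my-lightroom-backup/2019-06-01/" \
  "/Volumes/MyBackupDrive/Lightroom/2019-06-01/"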

Requirements

  • Mac OS X (sorry, I do not have a Windows machine to work on, and Adobe does not make a Linux version of their software)
  • Bash 4 (it supports associative arrays, which this script uses)
  • A folder watch set up on your /Volumes/ directory
  • The aws command line tool installed and set up, plus an AWS account
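
A quick way to sanity-check these requirements before running the script (a sketch, not part of lrbkup itself):

# Bash 4+ is needed for associative arrays.
if (( BASH_VERSINFO[0] < 4 )); then
  echo "Bash 4 or newer is required" >&2
  exit 1
fi

# The aws CLI must be installed and have working credentials.
command -v aws >/dev/null 2>&1 || { echo "aws CLI not found" >&2; exit 1; }
aws sts get-caller-identity >/dev/null 2>&1 || { echo "aws credentials not configured" >&2; exit 1; }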

Setup/install

  • Pull the project
  • Make sure you have the above requirements
  • Make sure the bash script is executable and on your PATH
  • Add the AppleScript to /Library/Scripts/Folder\ Action\ Scripts/
  • Attach the folder action (the AppleScript) to /Volumes/
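
Roughly, those steps translate into commands like these. The script and AppleScript file names below are assumptions; use whatever the repo actually contains. The final step, attaching the folder action to /Volumes/, is done through macOS's Folder Actions Setup.

# From inside the project directory after pulling it:
chmod +x lrbkup                        # script name assumed
cp lrbkup /usr/local/bin/              # or any other directory on your PATH

# Install the folder action script (file name assumed; may need sudo).
sudo cp lrbkup-watch.applescript "/Library/Scripts/Folder Action Scripts/"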

Usage

Basic idea/workflow.

$ ./lrbkup
> Please pick which backup you want to do
> 1) backup to external
> 2) backup to S3
>

That's basically it. Enjoy.


lrbkup's Issues

Write sort and order reusable function

I have a use case where a generic sort-and-order function could be used in two scenarios. It is just a small optimization, so it is lower priority.

Case A) Pushing out multiple days' worth of photos to S3. It would be nice to send the smallest days first. That way you get at least some complete days uploaded, as opposed to zero complete days if the first one happens to be huge.

Case B) Pulling a large number of images spanning many days. It would be nice to get the smallest days first and the largest last, so that you have some whole days to start working with instead of waiting a long time for the first whole day to finish.

The function would take an associative array mapping dates to sizes in MB, and return the dates ordered from smallest size to largest.
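
A minimal sketch of what that helper could look like. Since Bash associative arrays are unordered, this version prints sorted date/size pairs instead of returning an array (the nameref requires Bash 4.3+); the example data is made up:

# Print "date size" pairs ordered from smallest size (MB) to largest.
sort_by_size() {
  local -n _sizes=$1
  local date
  for date in "${!_sizes[@]}"; do
    printf '%s %s\n' "$date" "${_sizes[$date]}"
  done | sort -k2,2n
}

declare -A sizes=( [2019-06-01]=512 [2019-06-02]=128 [2019-06-03]=2048 )
sort_by_size sizes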

Clean up the clean drive unused tasks

There are a lot of unused tasks that need to be removed once the new ones are tested. As I developed the script, I stubbed out some functions that I thought would need to be separate; instead I was able to consolidate a few into one. Now some are just lingering. I need to make sure the new ones work, then clean up.

Clear local hard drive still assumes a predefined external hard drive

On a fresh run of the script, when using the clear-local feature, it never prompts you for which external hard drive to compare against. I'm assuming this is because the drive is predetermined or hard coded somewhere. Instead it should prompt you to select which hard drive you want to compare against.
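
One possible fix is to prompt with the currently mounted volumes instead of a hard-coded name. A sketch; it assumes every directory under /Volumes/ is a candidate:

# Let the user pick which external drive to compare against.
select drive in /Volumes/*/; do
  [[ -n "$drive" ]] && break
done
echo "Comparing against: $drive"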

Hunt down and fix hard-coded paths

There are still some lingering hard-coded paths and other strings that should be converted to variables.

In particular, I noticed the AWS Lightroom path in the aws command inside the s3_diff_check function for the bucket variable. I'm sure there are more.

Fix how the file-in-directory count gets populated

In the step that counts how many files are in a directory for comparison purposes, I am counting 1 more than I should, so the comparison is off by one. I think the problem is in s3_diff_check somewhere.
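
One common cause of this kind of off-by-one is piping ls -l into wc -l, which also counts the "total" header line; counting with find avoids it. A sketch, not necessarily the actual bug, and $dir is a placeholder:

# Count only regular files directly inside $dir, nothing else.
file_count=$(find "$dir" -maxdepth 1 -type f | wc -l)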

Disk usage report

I would also like to see a small disk usage report at the end of some tasks, showing how full the disk was before and after the task, and I would like to capture this report in the log files as well. Ideas: how much space was used and free before and after the task, along with the percentage the task used up or freed up, depending on which task was run.
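
Something along these lines would cover the before/after part. A sketch; the measured path and log file are placeholders:

# Record used space (KB) before and after a task and log the change.
before=$(df -k "$HOME" | awk 'NR==2 {print $3}')
# ... run the backup or cleanup task here ...
after=$(df -k "$HOME" | awk 'NR==2 {print $3}')
printf 'Used before: %s KB, after: %s KB, change: %s KB\n' \
  "$before" "$after" "$(( after - before ))" | tee -a "$HOME/.lrbkup.log"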

Some things can't handle spaces correctly

I need to leave this note in the README until this is fixed. I have run across some parts of the script that do not handle spaces well, even when the spaces are escaped with a backslash. This is a placeholder for when I identify them.
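
Until those spots are tracked down, the general fix is to quote every expansion rather than relying on escaped spaces, for example:

src="/Volumes/My Backup Drive/Lightroom"    # example paths with spaces
dest="$HOME/Pictures/Lightroom Restore"

# Unquoted, $src/ splits into several words and the command breaks.
# Quoted, each path stays a single argument:
rsync -avh "$src/" "$dest/"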

Clean up old comments and clarify documentation/notes in script

There has been a build-up of cruft from rapid development across lots of short bursts of coding, where I just needed to leave notes for myself for the next time I stepped back into the project. These notes need to be cleaned up; some should be kept, but clarified and better formatted.

Spinner

This is a nice-to-have. I think it would be good user feedback to have a bash spinner showing that the script is working. I have started this, but have yet to get it working. I just need to complete it.
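
A common pattern is to run the long task in the background and spin while it is still alive. A sketch:

spinner() {
  local pid=$1 frames='|/-\' i=0
  while kill -0 "$pid" 2>/dev/null; do
    printf '\r%s' "${frames:i++%${#frames}:1}"
    sleep 0.1
  done
  printf '\r'
}

sleep 3 &          # stand-in for the real rsync/aws call
spinner $!
wait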

Fix improper bash syntax

There is a lot of improper or non-recommended bash syntax that needs to be changed for best practices and consistency. Mostly it is using $VAR where it should be using ${VAR}.
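
For example, braces make the variable boundary explicit when other text follows right after it:

dir="2019-06-01"
echo "$dir_backup"      # expands the (unset) variable dir_backup
echo "${dir}_backup"    # expands dir, then appends _backup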
