
backbone.dualstorage's Introduction

Backbone dualStorage Adapter v1.4.2

A dualStorage adapter for Backbone. It's a drop-in replacement for Backbone.sync() that saves to a localStorage database as a cache for the remote models.

Usage

Include Backbone.dualStorage after having included Backbone.js:

<script type="text/javascript" src="backbone.js"></script>
<script type="text/javascript" src="backbone.dualstorage.js"></script>

Create your models and collections in the usual way. Feel free to use Backbone as you usually would; this is a drop-in replacement.

Keep in mind that Backbone.dualStorage really loves your models. By default it will cache everything that passes through Backbone.sync. You can override this behaviour with the booleans remote or local on models and collections:

SomeCollection = Backbone.Collection.extend({
    remote: true, // never cached, dualStorage is bypassed entirely
    local: true,  // always fetched and saved only locally, never saves on remote
    local: function() { return trueOrFalse; } // local and remote can also be dynamic
});

You can also disable remote syncing for individual requests when you want to sync with the server later:

SomeCollection.create({name: "someone"}, {remote: false});

Data synchronization

When the client goes offline, dualStorage allows you to keep changing and destroying records. All changes will be sent when the client goes online again.

// server online. Go!
people.fetch();       // load people models and save them into localstorage

// server offline!
people.create({name: "Turing"});   // you can still create new people...
people.models[0].save({age: 41});  // update existing ones...
people.models[1].destroy();        // and destroy as well

// collections track what is dirty and destroyed
people.dirtyModels()               // => Array of dirty models
people.destroyedModelIds()         // => Array of destroyed model ids

// server online again!
people.syncDirtyAndDestroyed();    // all changes are sent to the server and localStorage is updated

Keep in mind that if you fetch() a collection that has dirty data, only the data currently in localStorage will be loaded. collection.syncDirtyAndDestroyed() needs to be executed before trying to download new data from the server.
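A minimal sketch of that ordering, reusing the people collection from the example above:

// Push pending local changes to the server first, then pull fresh data.
// (In a real app you would wait for the sync requests to finish before fetching.)
people.syncDirtyAndDestroyed();
people.fetch();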

It is possible to tell whether the operation succeeded remotely or locally by examining options.dirty in the success callback:

model.save({
	name: "Turing"
}, {
	success: function(model, response, options) {
		if (options.dirty) {
			// saved locally
		} else {
			// saved remotely
		}
	}
});

Offline state detection

dualStorage always treats an Ajax status code of 0 as an indication it is working offline. Additional status codes can be added by setting offlineStatusCodes to either an array:

Backbone.DualStorage.offlineStatusCodes = [408];

or a function that accepts the response object and returns an array:

Backbone.DualStorage.offlineStatusCodes = function(xhr) {
    var codes = [];

    if (...) {
        codes.push(xhr.status);
    }

    return codes;
}
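For example, a minimal sketch that also treats request timeouts and gateway errors as offline. The specific status codes below are an assumption for illustration, not dualStorage defaults:

Backbone.DualStorage.offlineStatusCodes = function(xhr) {
    var codes = [];

    // Treat timeouts and unreachable gateways the same as being offline.
    if (xhr.status === 408 || xhr.status === 502 || xhr.status === 503 || xhr.status === 504) {
        codes.push(xhr.status);
    }

    return codes;
};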

Data parsing

Sometimes you may want to customize how data from the remote server is parsed before it's saved to localStorage. Typically your model's parse method takes care of this. Since dualStorage provides two backend layers, a second parse method is needed. For example, if your remote API returns data in a format that the default parse method would interpret as a single record, use parseBeforeLocalSave to break the data up into an array of records, just as you would with parse.

  • The model's parse method still parses data read from localStorage.
  • The model's parseBeforeLocalSave method parses data read from the remote before it is saved to localStorage on read.
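As a sketch, assuming a remote API that wraps its records in a results envelope; the results key is hypothetical, not something dualStorage requires:

People = Backbone.Collection.extend({
    url: '/people',

    // Unwrap the assumed {results: [...]} envelope before records are cached locally.
    parseBeforeLocalSave: function(response) {
        return response.results || [];
    },

    // parse still runs on data read back from localStorage, which is already unwrapped.
    parse: function(response) {
        return response;
    }
});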

Local data storage

dualStorage stores the local cache in localStorage. Each collection's (or model's) url property is used as the storage namespace to separate different collections of data. This can be overridden by defining a storeName property on your model or collection. Defining storeName can be useful when your url is dynamic or when your models do not have the collection set but should be treated as part of that collection in the local cache.
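For example, a minimal sketch where a model with a dynamic url shares one cache namespace with its collection; currentUserId and all other names here are illustrative:

Note = Backbone.Model.extend({
    // Dynamic url, so it cannot serve as a stable storage namespace.
    urlRoot: function() { return '/users/' + currentUserId + '/notes'; },
    storeName: 'notes' // cache everything under one stable namespace instead
});

Notes = Backbone.Collection.extend({
    model: Note,
    url: function() { return '/users/' + currentUserId + '/notes'; },
    storeName: 'notes'
});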

Install

Clone the repository as usual, or install via npm: npm install nilbus/backbone.dualstorage --save

Contributing

See CONTRIBUTING.md

Authors

Thanks to Edward Anderson for the test suite and continued maintenance. Thanks to Lucian Mihaila for creating Backbone.dualStorage. Thanks to Jerome Gravel-Niquet for Backbone.localStorage.


backbone.dualstorage's Issues

Example for use with Phonegap

Hi,

I'd like to use this with phonegap. I've got a working app without Backbone - which I'm new to, so please have patience with me.

I wonder if there is an example you could point me to?

I dropped dualStorage into my project and load via requirejs:

dualstorage: 'libs/dualstorage',
dualstorage: {
            deps: ['underscore', 'backbone']
        }
...

var dualstorage = require(['dualstorage']),

I assume that's it.

And my models still load as long as I have a server connection, but as soon as the server is gone, my models won't load any more. I thought the default is to cache everything - so I'd expect the models to be served from local storage once I'm offline - no?

I also saw this issue:
#7

It's still marked open - do I need to expect additional challenges if I want to load models from a cross-origin server?
I can already load models with plain Backbone by setting the url property in my model. Anything I need beyond that?

Many thanks for your help

Uncaught TypeError: Object #<Object> has no method 'set'

Any ideas on why this is failing? I'm new to Backbone and using MongoDB as the backend, which uses _id.

This is the line that is raising the exception in backbone.dualstorage: model.set(model.idAttribute, model.id);

Here is my model:

window.Role = Backbone.Model.extend({
idAttribute: '_id',
urlRoot: "/roles",
});

window.RoleCollection = Backbone.Collection.extend({
model: Role,
url: "/roles",

});

Thank you for any help you can offer!

Not re-using Backbone.Sync?

By not reusing the original Backbone.Sync, it becomes harder to integrate with other Backbone "plugins" that also replace Backbone.Sync.

In our specific case, we're interested in using Backbone.io (https://github.com/scttnlsn/backbone.io) to allow Backbone to sync over Socket.io channels, instead of Ajax. Backbone.io works by replacing Backbone.Sync.

I thought that, by careful loading order, I could integrate the two, but this isn't going to work, since you don't re-use the existing Backbone.Sync for "remote".

From a quick glance, it looks like you CoffeeScript'ed Backbone.Sync and pulled emulateHTTP and emulateJSON, but it is otherwise identical.

So, wondering if you'd consider re-using any existing Backbone.Sync that might exist as the onlineSync function?

Thanks!

Add support for custom .parse() functions

May be tough to figure out an across-the-board solution, since you're doing all of the work before we hit .parse(). For now, I have renamed my custom parse() function to remoteParse() and am calling it like this: parsedResp = model.remoteParse(resp); from line 164 (in the .coffee file) to parse the incoming JSON. I pass the parsedResp on through the rest of that function declaration.

Updating a model/collection that has never been sync'd

This seems kind of obscure, but hang with me.

I am currently working on a project that will be using Backbone.dualStorage. I have encountered an issue with syncing a model that contains a collection nested within it.

The issue stems from the fact that the model is originally populated from a single JSON data source that contains every element in the nested collection. Therefore, Backbone.sync is never called on the nested collection. Now, if I want to update the collection within the object, it blows up because the collection was never sync(read)'d, only created (so it has no id).

It appears that this issue is also present in the Backbone.localStorage logic, but I can't really test it right now.

This might not be the way Backbone was intended to be used, but it is a problem for me. Does anyone else consider this to be an issue? If so, I'll patch up my fork and make a pull request.

Custom Model/Collection storeName

I am using Backbone.dualStorage as part of the model layer in a library shared across multiple projects. It is, therefore, required to assign each application its own cache namespace. The solution I am using is very simple:

@@ -296,7 +296,7 @@
 
 dualsync = function(method, model, options) {
   var error, local, originalModel, success;
-  options.storeName = result(model.collection, 'url') || result(model, 'url');
+  options.storeName = result(model, 'storeName') || result(model.collection, 'url') || result(model, 'url');
   options.success = callbackTranslator.forDualstorageCaller(options.success, model, options);
   options.error = callbackTranslator.forDualstorageCaller(options.error, model, options);
   if (result(model, 'remote') || result(model.collection, 'remote')) {

It would be great to see this as part of the default implementation.

error on SyncDirtyAndDestroyed #2

I have a second error with syncDirtyAndDestroyed.

The reported error is:
Uncaught TypeError: Cannot call method 'save' of undefined backbone.dualstorage.js:25
Backbone.Collection.syncDirty backbone.dualstorage.js:25
Backbone.Collection.syncDirtyAndDestroyed backbone.dualstorage.js:48
API.fetchWeeks pickdrop.js:209

The events leading up to this error were:
create a model, save it, update it, then on a second update an error occurred on the save, leaving the model on the _dirty list.
The application was reloaded and the model appears in localStorage but does not have a dirty attribute, although it still appears on the _dirty list.

The model attributes were not saved correctly, but the id was not removed from the _dirty list, so each time the application is loaded and I attempt a syncDirtyAndDestroyed the error repeats.

Permanent error responses treated as temporary sync failures

This is more a braindump / request for comments than an actual issue. I apologize for its length.

I am seeing the following behavior:

Trying to save a new record fails on the server (the server returns 422 Unprocessable Entity). But the success callback is executed since the local storage works.

This is inconsistent behavior depending on whether I use the save callbacks or the ajax callbacks. Here is an example using a 404 response instead:

new Backbone.Model({}, {url: 'http://google.com/not-found'}).
  save({}, {success: function() { console.log("Got success callback from save") } }).
  fail(function() { console.log("Got fail callback from jqxhr") })

What I am wondering is… what should be the behaviour of dualStorage? I'm inclined to think that it should not persist locally something that the server has refused for permanent reasons (i.e. for some set of http status codes).

Perhaps more importantly: should dualStorage hide those server errors and disguise them as success?

Additionally, models are not marked dirty, so the client's idea of which models exist (in local storage) and the server's (in whatever storage it uses) get out of sync.

There is still the question of what the behavior should be when creating a model while offline, since save can only succeed locally in that situation. It is syncDirtyAndDestroyed() that will see the errors.

Is there any recommended current approach? Is there any other "future approach" that could be developed? Or am I just using dualStorage "the wrong way"?

How to tell if a collection has dirty data?

Hello,

Please, is there a method to check whether a collection has dirty data before calling Collection.syncDirtyAndDestroyed(), as this method fails when there's no such data?

Looking forward to your reply. Thanks
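One way to guard the call, sketched with the dirtyModels() and destroyedModelIds() helpers documented in the README above:

// Only attempt a sync when something is actually dirty or destroyed.
if (collection.dirtyModels().length || collection.destroyedModelIds().length) {
    collection.syncDirtyAndDestroyed();
}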

Option for performing local sync first

I'm finding this project really useful, thank you very much for it!

I'm building a mobile app for places with low connectivity. One issue I find is that it's better to be completely offline than on a slow EDGE/3G connection, because when it knows it's online it will wait even for the slowest queries. I'd like to be able to tell dualStorage to assume it is offline, and perform the sync in the background or manually when I tell it to.

Is there any way we can do so right now? If not, I'm happy to receive general guidance and try it in a fork!

Possibly related: #4

Thank you very much in advance.

multi copies of data

I am trying out dualStorage and I am confused by the data in my localStorage. When I fetch my Backbone collection, which has 6 records, localStorage shows 6 entries, each of which is an array containing all 6 API records.

syncDirty() issues with new records. Calling .set() on server responses' JSON

When doing a syncDirty(), my new models in localStorage are correctly posted to the API via the Ajax request; however, the returned record from the POST request (a JSON object) gets sent to localsync() with a 'create' request.

  case 'update':
    if (_.isString(model.id) && model.id.length === 36) {
      originalModel = model.clone();
      options.success = function(resp, status, xhr) {
        localsync('delete', originalModel, options);
        localsync('create', resp, options);
        return success(resp, status, xhr);
      };
      options.error = function(resp) {
        options.dirty = true;
        return success(localsync(method, originalModel, options));
      };
      model.set(model.idAttribute, null);
      return onlineSync('create', model, options);
    } else {
      // Etc......

So far so good; this makes sense to me, as we now have a new record with a real ID (as opposed to the generated one) and we need to create a new record representing that.

The problem however is the fact that localsync() with a Create method does this:

Store.prototype.create = function(model) {
  if (!_.isObject(model)) {
    return model;
  }
  if (!model.id) {
    model.id = this.generateId();
    model.set(model.idAttribute, model.id);
  }
  localStorage.setItem(this.name + this.sep + model.id, JSON.stringify(model));
  this.records.push(model.id.toString());
  this.save();
  return model;
};

It is trying to use the .set() method of a Backbone model on a simple Json hash returned by the server. This seems like a bug where perhaps a model should be created in this function from the attribute hash sent by the server.

Am I missing something here? Any help would be greatly appreciated.

Use localStorage immediately before remoteSync

This is a great piece of code, but I'm curious what you think about the following behavior change (which I've hacked together as a proof of concept, and could clean up for a pull request).

When dualSync('read') is called, could we immediately call localSync(), which might already have data ready? Then it could call remoteSync() for updated data, which might arrive more slowly.

I've actually built this to check a callback on the model shouldRemoteSync() which has some custom brains on whether remoteSync() needs to be called.

Maybe this falls beyond the scope of what you plan to do here, and maybe there's a Better Way -- just wanted to start the conversation.

Data synchronization race conditions?

I haven't yet had a chance to closely inspect the code or do my own testing but was wondering what happens if the client is offline and makes some changes to a model whilst at the same time changes are made to the same model on the server?

  1. When the client comes back online (tested by doing a http ping to the server) and syncDirtyAndDestroyed() is called by an onlineStatusChecker function does the client or the server "win"?
  2. What happens if fetch is called on the model's collection before syncDirtyAndDestroyed() is called?
  3. What happens if syncDirtyAndDestroyed() is called before fetch but the ajax requests haven't yet completed?

Tests

Backbone.localStorage has a test suite, but most tests fail if the suite is run twice. Also, dualstorage has network requests to make as well, so the test suite might need a server-side component (or an extensive XHR mock).

SyncDirty fails for models with non Integer id

For example when using MongoDB ObjectIds (e.g. 51e2b9190cf239e4b579c50f), syncDirty will fail with:

Uncaught TypeError: Cannot call method 'save' of undefined backbone.dualstorage.js:26

because

parseInt("51e2b9190cf239e4b579c50f") === 51

which of course is not the right model id.

Since "the id is an arbitrary string (integer id or UUID)", shouldn't line 17 of backbone.dualstorage.coffee be

model = if id.length == 36 then @where(id: id)[0] else @get(id)

instead of:

model = if id.length == 36 then @where(id: id)[0] else @get(parseInt(id))

?

Version declared in bower.json does not match current release

Current version is 1.1.0, but the bower.json file still states 1.0.2

Bower complains when installing:

bower backbone.dualstorage#~1.1.0       not-cached git://github.com/lucian1900/Backbone.dualStorage.git#~1.1.0
bower backbone.dualstorage#~1.1.0          resolve git://github.com/lucian1900/Backbone.dualStorage.git#~1.1.0
bower backbone.dualstorage#~1.1.0         download https://github.com/lucian1900/Backbone.dualStorage/archive/v1.1.0.tar.gz
bower backbone.dualstorage#~1.1.0          extract archive.tar.gz
bower backbone.dualstorage#~1.1.0         mismatch Version declared in the json (1.0.2) is different than the resolved one (1.1.0)

Collection._byId not being updated by Collection.create

I'm having an issue when using Backbone.dualStorage in localstorage mode. I am using Backbone 0.9.2 and dualStorage 1.0. I can't tell if this is an issue with Backbone or your dualStorage adapter (or perhaps I am doing something wrong). Models are getting added to the collection, but the Collection._byId property is not being updated, and isn't until the page is refreshed. This is causing the Collection.get(id) method to fail on valid model IDs.

I have found a somewhat dirty workaround to this issue, but perhaps someone can find a cleaner fix for me.

model=app.planets.create({name: mName, description: mDescription}, {remote: false});

/*
  bug workaround:
  _byId is a key/value store that backbone maintains for a collection
  _byId was not getting updated after this model was added to the collection
  this was causing planets.get(id) to return undefined
  this next statement adds the model if it does not exist
*/
if(app.planets._byId[model.id] == undefined){
  app.planets._byId[model.id] = model;
}

Thanks for your work on this adapter! It's really nifty.

PATCH support

Local attributes for models that are not yet synced can be lost after save(attrs, {patch: true})

A model will have a temporary id when it is recently created and has not yet been successfully synced.

When an un-synced model is updated with patch: true and the server responds with a partial or empty set of attributes, attributes not in the server response will be missing in the local copy. This happens because the unsynced version is discarded, and a new model is created locally. See backbone.dualstorage.coffee:271.

It instead should update the existing local model's attributes with the server response, just as it does for a synced model.
#51 may change the way attributes are updated, so it may be best to wait to fix this until that is resolved.

Checking localStorage first before fetching from server

I might be missing something, but having trouble getting collections to load from localstorage. When I implement dualstorage and run fetch on a collection - the data is fetched from the remote server and saved into localstorage - as expected. But when I refresh the page, the API request is made again - instead of the data getting retrieved from localstorage. When I turn off wifi and refresh, the data is retrieved from localstorage correctly, even though it still makes a failed API request.

Is this the correct behavior for dualstorage?

Local fetch() has no complete() callback.

With the following example code, with the local attribute set to true:

var Favourites = Backbone.Model.extend({
    local: true, //Only sync against LocalStorage                                                                           
    url: 'favourites',
    initialize: function(){
        var self = this;
        this.fetch().complete(function(){
            console.debug(self);
        });
    },

this.fetch().complete() is not defined. This deferred call is a function from the jQuery ajax() function. This usage was recommended to me in this Backbone bug: jashkenas/backbone#423

syncDirtyAndDestroyed for many collections - Throttle issues

As far as I can tell by reading the current code, Throttle doesn't support queueing more than one task with the same name. So if you try to syncDirtyAndDestroyed for 3 or more collections at the same time, I believe only the first collection will sync, and the last one will get queued.

Also, when the queued task is run, it looks like the task function is called without any parameters, so completionCallback is never fired for the queued task.

Maybe if syncDirtyAndDestroyed accepted a callback function, I could throttle the calls myself, but that doesn't seem possible either.

Another issue I found is that completionCallback is never fired if the syncing collection doesn't have any dirty or destroyed models.

And if there are many dirty or destroyed models in the same collection, each save() or destroy() calls the completionCallback, but I think this should get called after all models finished syncing.

Maybe the entire Throttle thing is a work in progress and these issues are already known? Anyway, just thought I'd bring it up.

Triggering callbacks.

Hi. I have a backbone view that's setup like this:

# view.js
# persist is called when clicking the commit button in the form.
persist: ->
    @model.save {},
      succes: (resp, status, xhr)->
        debug.log("Success")
      error: (resp, status, xhr)->
        debug.log @model.errors

Then, when I use dualStorage, the success and error callbacks aren't triggered any longer. I see logging in the console, and the sync works fine, but it ignores my callbacks.

Am I doing something wrong, should I listen to some other event, or how should I go about doing custom things based on the result of dualStorage's communication with the backend?

Best regards

Emil

SyncDirty fails on urls without trailing forward slash

Hi,

If my url doesn't end with a trailing forward slash, like this:
var Item = Backbone.Model.extend({ urlRoot: 'items' });
then local:true is ignored.

This is also the format suggested in the Backbone documentation:
var Notes = Backbone.Collection.extend({ url: '/notes' });

I discovered this because I had changed the url format and removed the trailing slash from the GET url after some dirty sync entries had already been written.

After removing the trailing slash from the urls, the dirty entries would never be removed from localStorage, because on line 25 the url is parsed without checking whether a trailing forward slash is present:
url = result(this, 'url');
while entries in the localStorage are always written with the trailing slash + '_dirty'.

Thanks

Recurse when save model

Versions:
Backbone dualStorage Adapter v1.0.1
Backbone.js 1.0.0
jQuery v1.9.1
Underscore.js 1.4.4

When the app is offline and we try to save a model many times, we get:
'too much recursion' in underscore.js at line 770
(the _.extend function).
Sample code to reproduce:

var Model = Backbone.Model.extend({
    url: '/test',
    remote: false,
    local: true,
    parse: function(response, options) {
        console.log(response);
        return response;
    }
});

var model = new Model({name: 'test'});
model.save();
model.save({name: 'asd'});
model.save({name: 'asd3'});
model.save({name: '234234'});
console.log(model.attributes);

I fixed this by monkey-patching the 'dirty' and 'clean' functions to return model.attributes instead of model.
If these functions return the model, then the model itself is set as the .attributes property and we get recursion (just trace these functions in the sample):

  dirty: (model) ->
    dirtyRecords = @recordsOn @name + '_dirty'
    if not _.include(dirtyRecords, model.id.toString())
      dirtyRecords.push model.id
      localStorage.setItem @name + '_dirty', dirtyRecords.join(',')
    model.attributes

  clean: (model, from) ->
    store = "#{@name}_#{from}"
    dirtyRecords = @recordsOn store
    if _.include dirtyRecords, model.id.toString()
      localStorage.setItem store, _.without(dirtyRecords, model.id.toString()).join(',')
    model.attributes

Is this a clean solution?

RangeError: Maximum call stack size exceeded

Here is a gist showing how to reproduce the error.

Maybe I'm doing something wrong, not sure. The local == true thing is just to demonstrate the problem. In my actual setup, I'm using remote == false when I detect that the client is offline. But either way I get the same error.

Tested in Chrome 29.0.1547.57

Correct handling of fetch's merge and add options

Right now, localsync create does some things differently depending on the add and merge options. Should it? Localsync should probably emulate/mirror the server response, and not care what options are passed to fetch and subsequently to set.

Often when using add: true or merge: true with fetch, you're dealing with an API endpoint that does not return a full result set.

can't get Backbone.dualStorage working with requireJS

Hi there,

I'm trying desperately to set up dualStorage along backbone using requireJS with no results.

backbone vers 1.0.0
requireJS vers 2.1.2

requireJS shim config 

{
    "backbone": {
        "deps": ["underscore", "jquery", "init", "underscoreString"],
        "exports": "Backbone"
    },

    "dualStorage":{
        "deps":["backbone"],
        "exports": "dualStorage"
    }
}

and then the first module requires 

require([
    "jquery",
    "backbone",
    "routerMobile",
    "dualStorage",
    ...
], ...  

First I had this problem of an "undefined set method" on models, which I eventually solved by adding a condition like "if (!!model.set) ...", but now I'm facing all kinds of problems related to model IDs in my backbone code...

Am I missing something? Do you need more information?

dualSync returning arrays

It appears that dualSync is sometimes returning arrays of objects rather than the objects themselves.

Needs more investigation.

Connecting to not-origin

Previously, there was a hack with a global URLs object to support connecting to something other than the origin.

We need something else to do the same, since it's a necessary feature for PhoneGap.

usage of 'options.remote'

In the Javascript code the following can be found in the dualsync function:

if (!model.local && options.remote !== false)

should that not be:

if (!model.local && model.remote !== false)

Compiling

We should split the coffeescript components into different semantic files.

Then the Makefile should:

  • Combine the coffeescript source files into a single javascript source file
  • Build a minified/uglified javascript version

Remote fail on model fetch (read) from previously-stored collection results in malformed storeName string

Hi!

I have found that the combination of a certain series of events causes a predictable issue with the localStorage key generated by window.Store.find():

  1. Fetch a collection from remote successfully. dualStorage is successful with localSync in creating localStorage copies of each of the individual contained models in the collection. Yay!
  2. Attempt to fetch a model later from remote that is part of the previously successfully-fetched collection.
  3. Remote fail/error*. dualStorage then calls localSync, which attempts to retrieve that model from localStorage. (The remote error is an expected failure, as the models contained in the collection haven't been persisted yet remotely).
  4. find in window.Store always appends the model.id to the existing storeName (@name). But in this case, the model's ID is already on the storeName.
  5. Results in the model's id repeating twice in the resulting key it attempts to look up in localStorage, which of course fails.

So, in my case, it tries to retrieve key
http://myapi.com/api/v1/wines/840014200899001000000001840014200899001000000001

instead of the correct

http://myapi.com/api/v1/wines/840014200899001000000001

(My IDs are long, yep; they're not actually mine).

My workaround for now is a bit hackish and may not take into account subtleties. But I have a habit of not getting around to opening issues if I try to find a perfect solution :).

  find: (model) ->
    storeKey = @name
    if model and model.id
      re = new RegExp "\/#{model.id}$"
      if not re.exec @name
        storeKey += @sep + model.id
    JSON.parse localStorage.getItem(storeKey)

Issues with using clone

I've been experimenting with using this plugin with Backbone-Relational (http://backbonerelational.org/). It doesn't work out-of-the-box, but it looks like it would be pretty simple to make them play nice.

It seems that we'd just need to save a local copy of the sub-collection when a parent model is cached. Right now, when the parent is cached, it saves an array of IDs in place of the actual sub-collection's models/attributes. This is ideal, but we need to have a local version of those models as well. Then, when a parent model is loaded from the local cache, it would have to also load its sub-collections from localStorage as well.

Can you think of any reason this wouldn't work, or foresee any problems I might run into?

I haven't had time to fully comprehend how this plugin works yet, but I intend to use this plugin with Backbone-Relational even if I have to make these changes myself. If you have any help or other guidance to offer, I'd be grateful. I'm on a tight schedule and the sooner I can make this work, the better.

Shorter localStorage Keys

HTML5 storage is limited, and having to store a long guid for every object (or even a somewhat long service + endpoint for every object) will take up precious space. I think it would be enough to create one guid per browser/client. Let's say we call this guid 'localDbId'. Now, create one localStorage item per Collection.url. Then just use an incrementing number for the model keys:

  1. Locally, each model would have an int key, value = the model
  2. Each Collection would have a url key, value = delimited list of associated model IDs
  3. Each database would have a single localDbId key containing a guid

On the local client, the incrementing number is enough to identify each store uniquely. When you think about syncing multiple devices to a central, remote store, you'd simply add each device's single localDbId to the incrementing number:
url + '/' + localDbId + '-' + n++

Now you have a very cheap local unique identifier without losing your global identifier. This would be handy for developers who want to delegate id-creation to the client app, and would also allow tracking of which models were created by whom (more interesting for enterprise apps, maybe).
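A minimal sketch of the proposed scheme; every name below is illustrative and none of these helpers exist in dualStorage:

// One guid per browser/client, generated once and reused.
function localDbId() {
    var id = localStorage.getItem('localDbId');
    if (!id) {
        id = 'xxxxxxxxxxxxxxxx'.replace(/x/g, function() {
            return Math.floor(Math.random() * 16).toString(16);
        });
        localStorage.setItem('localDbId', id);
    }
    return id;
}

// One incrementing counter per Collection.url; the resulting key is cheap locally
// and stays globally unique once the client guid is appended.
function nextLocalKey(collectionUrl) {
    var counterKey = collectionUrl + '_counter';
    var n = parseInt(localStorage.getItem(counterKey) || '0', 10) + 1;
    localStorage.setItem(counterKey, String(n));
    return collectionUrl + '/' + localDbId() + '-' + n;
}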

parseBeforeLocalSync is applied to the server response inconsistently

The dualsync read method filters the server response through parseRemoteResponse if it is defined on the model or collection. This makes sense for when you get nonstandard server responses that you want to parse before passing the response to localsync.

Aside from dualsync read, create and update also receive server responses that update the model. These however are not parsed with parseRemoteResponse.

Cannot call method 'save' of undefined

Anyone else seen this? Here's how to reproduce (everything else is working great):

// I have a collection instance in the variable `theseThings`
// with a url hooked up (http://localhost:1337/things)
// The front-end is an HTML page served from a file URL, and the local server has CORS enabled, so everything works great.


// Get a few things from server..
theseThings.fetch();


//
// Kill server
//

// Now add a new thing
theseThings.add({});


//
// Start server again
//

// Now try to syncDirtyAndDestroyed()
theseThings.syncDirtyAndDestroyed();
Result:

TypeError: Cannot call method 'save' of undefined

Thanks!

Versions
  • Backbone.dualStorage version 1.1.0 (from bower)
  • Backbone 1.1.0 (from bower)
  • Underscore 1.5.2 (from bower)
  • jQuery 2.0.3 (from bower)

dualsync() should return a jQuery promise (or Deferred)

The default Backbone implementation states that it returns a jqXHR object. jQuery also documents that jqXHR objects implement the Promise interface, giving them all the properties, methods, and behavior of a Promise (see the Deferred object for more information).

So it would be very useful that dualsync also returns a Promise.

Since the API is synchronous, the returned promise would already be resolved.

If Online, best way to initiate sync

I have dualstorage successfully working and saving data from my rails app, but when a user/device is online, how can I continue to poll for updates or new data on the server? Currently, the sync only takes place when I completely close out the app on my device or refresh the browser window.

Any comments on best practice to make this happen?

Thank you,

S

error on SyncDirtyandDestroyed

I'm running the most recent commit and ran into an error:

Uncaught TypeError: Cannot call method 'save' of undefined backbone.dualstorage.js:25
Backbone.Collection.syncDirty backbone.dualstorage.js:25
Backbone.Collection.syncDirtyAndDestroyed backbone.dualstorage.js:48

When I look in localStorage, there are 20 ids listed under api/events_dirty, but there are only 19 actual items in localStorage that have a UUID. Looking at the list, it is number 10 in the list that does not exist.

The test scenario was: load the app, create a record, go offline, create more records, return online. When the app goes online I automatically trigger syncDirtyAndDestroyed({async:false}) on all collections simultaneously, followed by a fetch({async:false}).

I'm guessing that perhaps the ajax sync was in process when the app went offline and thus the code missed the reply... but that's just a guess.

It also appears that when the fetched records replace/update the local versions, a "sync" event is not fired, because I end up with duplicates in my backbone view that get added on the fetch() but not removed.

dualsync remote/local options confusing

For models where both the local and remote properties are specified, the case where local == true and remote == true causes the dualsync function to execute only the onlineSync method.

  dualsync = function(method, model, options) {
    var response, success;
    console.log('dualsync', method, model, options);
    options.storeName = result(model.collection, 'url') || result(model, 'url');
    if (result(model, 'remote') || result(model.collection, 'remote')) {
      return onlineSync(method, model, options); //////<----------Case where local==true is skipped since we're returning
    }
    if ((options.remote === false) || result(model, 'local') || result(model.collection, 'local')) {
      return localsync(method, model, options);
    }

My suggested replacement, which also allows you to pass options.local as a working parameter and makes it a little more readable, is:

  dualsync = function(method, model, options) {
    var response, success;
    var remote, local;
    console.log('dualsync', method, model, options);
    options.storeName = result(model.collection, 'url') || result(model, 'url');

    remote = result(model, 'remote') || result(model.collection, 'remote') || (options.remote === true);
    local =  result(model, 'local') || result(model.collection, 'local') || (options.local === true);

    if ( remote && !local ) {
      return onlineSync(method, model, options);
    }
    if ( local && !remote ) {
      return localsync(method, model, options);
    }
