
JsDuck tag for Deferred methods

I started working with the awesome JSDuck to document my projects (do you like the doc system on Sencha Touch?) and, since I’m a great fan of the Deferred object, I noticed the lack of support for this kind of pattern.
Setting the @return tag to something like @return {jQuery.Deferred} is not enough, since the returned value that matters is the one passed to the .resolve() method of the deferred object.

It would be nice to mark a method as “deferred” and use the @return tag to document the value passed to .resolve(), keeping in mind that the actual value returned by the function is just a promise.

Luckily, JsDuck supports custom tags:

require "jsduck/meta_tag"
module JsDuck::Tag
  # Implementation of @deferred tag
  class Deferred < JsDuck::MetaTag
    def initialize
      @name = "deferred"
      @key = :deferred
      @signature = {:long => "deferred", :short => "DEF"}
      @boolean = true
    end
  end
end

Put this somewhere, for example in ‘jsduck/deferred.rb’, and remember to call JsDuck with the parameter “--meta-tags=jsduck/deferred.rb”.
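Once the tag is registered, a doc comment can combine @deferred with @return to describe the resolved value rather than the promise itself. A hypothetical example (the method name and the resolved value are made up for illustration):

/**
 * Deletes a record on the backend.
 * @deferred
 * @param {Number} id Id of the record to delete
 * @return {Number} Id of the deleted record, passed to .resolve()
 */
function deleteRecord(id) {
  var deferred = $.Deferred();
  $.ajax({ method: 'delete', url: '/api/records/' + id })
    .done(function() { deferred.resolve(id); })
    .fail(function() { deferred.reject(); });
  return deferred.promise();
}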

October 12th, 2012

Deferred object as callback

Callbacks are a nice way of doing inversion of control, but they are even more powerful with deferred objects.
Basically, callbacks are a way to pass control back from a called function; the deferred/promise does just the opposite: it returns control to the called function.

Consider this scenario: a table widget, with the classic delete/modify buttons next to each row. Since the table widget is a generic component, we want to handle the delete operation on the model outside the widget, and we don’t even want to extend the generic object to do that, so we use a callback:

$.widget('my_table_widget', {
  onDelete: function(id_of_row) {
    var that = $(this);
    // call the backend and delete the record
    $.ajax({
      method: 'delete',
      url: '/api/records/' + id_of_row
    })
    .done(function() {
      // ok, record deleted, now I should update the table
      that.widget('remove_row', id_of_row);
    });
  }
});

In the example above, the callback first calls the backend to remove the record and then removes the row from the table (I mean from the DOM). In order to do this, it needs to know something about the table widget (in this case, the method “remove_row”).
But what if we could pass the ball back to the widget? Let’s rewrite it with a deferred object:

$.widget('my_table_widget', {
  onDelete: function(id_of_row) {
    var deferred = $.Deferred();
    if (confirm('Delete?')) {
      // call the backend and delete the record
      $.ajax({
        method: 'delete',
        url: '/api/records/' + id_of_row
      })
      .done(function() {
        // ok, record deleted, pass the control back to the widget
        deferred.resolve();
      })
      .fail(function() {
        deferred.reject();
      });
    }
    else {
      deferred.reject();
    }
    return deferred.promise();
  }
});

It’s more linear but, most importantly, the callback knows nothing about the remove_that_damn_row_from_the_dom method of the widget; it just passes control back, like saying “I’m done, it’s your turn”.
More separation, less documentation to read, easier to implement, fewer errors.

On the widget side, the callback should be treated this way:

// somewhere inside the table widget, here is where we execute the callback
var callback_result = options.onDelete.call(widget, id);
// if the callback answers with a deferred/promise, this will
// handle it once it's resolved/rejected
if (isPromise(callback_result)) {
  callback_result
    .done(function() {
      // ok, the async operation on the other side is over,
      // remove the row from the DOM
      widget.widget('remove_row', id);
    })
    .fail(function() {
      // do nothing
    });
}
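The isPromise() helper isn’t defined here; a minimal sketch, assuming we just want to detect anything that behaves like a jQuery deferred/promise (i.e. exposes a then() or promise() function), could be:

function isPromise(value) {
  // treat any object exposing then() or promise() as a deferred/promise
  return !!value && (typeof value.then === 'function' ||
                     typeof value.promise === 'function');
}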
October 10th, 2012

Lesson learned with package.json

I’ve been using NodeJS for a project of mine, and I used to set up my package.json this way:

...
  "some_external_lib": ">= 1.0.0",
  "another_external_lib": ">= 1.4.0"
...

I thought: the newer a library is, the better.
And I was wrong. This way there’s no control over what goes to the production/staging servers: if a new version of some_external_lib becomes available, let’s say 1.5.0-i-am-very-alpha, that unstable code will be deployed to production.

Since the local environment is no longer aligned with production/staging (even if we run an npm install locally), any bug introduced by unstable code will be very hard to spot locally (you’ll likely end up debugging on the production server).

Lesson learned: eradicate any “>=” from your package.json.
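One option is to pin the dependencies to exact versions (same placeholder names as above), so every npm install resolves to the same code locally and in production:

...
  "some_external_lib": "1.0.0",
  "another_external_lib": "1.4.0"
...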

October 2nd, 2012