Going "Web Scale" with JavaScript Polymorphism
The recent trend has been to push more and more work onto the client side as browsers become more and more capable. This means the complexity of your JavaScript can get out of hand very quickly. Being able to scale simple pieces up to handle complex and even specialized tasks is an important part of sound JavaScript application architecture.
One way to accomplish this is through simple polymorphism, or, as in many object-oriented languages, generic programming. The general pattern is to have a single lightweight class or module with a standard set of methods that does nothing more than call methods of the same name on any number of other objects ( classes, modules, etc. ).
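In rough terms ( with purely illustrative names ), the shape is something like this:

// a proxy that owns no logic of its own - it only knows
// which backend to hand the call to
var proxy = {
    backends: { /* name -> backend */ },
    doWork: function( type, payload ){
        // every backend implements the same doWork method,
        // so the proxy never needs to know what type really is
        return this.backends[ type ].doWork( payload );
    }
};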
For example, we could implement a parser that, given any kind of object, can figure out what it is and deconstruct or construct objects of the same type. We can start by defining a simple parser module. Sounds simple and harmless, but this is the kind of code that can spin out of control very quickly. It starts out innocently enough, just parsing strings. Then next week it needs to handle nested objects. No big deal - that's just one more if / else check in the master parser function. Great! Then your boss tells you they are bringing on an integration partner who will be sending custom classes and data objects. And they have 30 different types flying around.
A contrived example, sure. But nested if statements and literal checks for all of the things will make your parser a nightmare to maintain and very prone to bugs. What we really want is a way to load custom parsers for each of the different types of things we might encounter - without really knowing what that thing is. To pull this off we will need to create a simple proxy that knows how to delegate work off to parser backends. A module for that might look something like this:
// parser.js
define([
    "require"
    ,"module"
    ,"exports"
    ,"DummyParser"
    ,"util"
],
function( require, module, exports, dummy, util ){
    var parsers, lookup_parser;

    // parser cache, seeded with the generic fallback
    parsers = {
        "default": dummy
    };

    // internal function to get a parser by type,
    // falling back to the default when there is no match
    lookup_parser = function( type ){
        return parsers.hasOwnProperty( type ) ? parsers[ type ] : parsers[ "default" ];
    };

    /**
     * Converts an object into a string
     * @static
     * @function module:parser#serialize
     * @param {Mixed} item The item to serialize
     * @return {String} The resulting string
     **/
    exports.serialize = function( object ){
        var type = util.typeOf( object )
            ,parser = lookup_parser( type );
        // just return the result of whatever the parser is
        return parser.serialize.apply( null, arguments );
    };

    /**
     * Converts a serialized string back to an object
     * @static
     * @function module:parser#deserialize
     * @param {String} item The item to deserialize
     * @return {Object} The resulting object
     **/
    exports.deserialize = function( object ){
        var type = util.typeOf( object )
            ,parser = lookup_parser( type );
        // just return the result of whatever the parser is
        return parser.deserialize.apply( null, arguments );
    };

    /**
     * Registers a new parser under a name
     * @static
     * @function module:parser#register
     * @param {String} name The name of the parser type
     * @param {Object} parser An object implementing the parser interface
     **/
    exports.register = function( name, parser ){
        if( parsers.hasOwnProperty( name ) ){
            var e = new Error();
            e.message = "Parser with name " + name + " already exists";
            throw e;
        }
        parsers[ name ] = parser;
    };
}
);
There isn't much to our parser module. Our serialize / deserialize methods just figure out which parser they need to use and offload the work to the specific backend. In our situation, we would probably want to start off with a generic parser that deals with simple object literals and strings. OK, simple enough. It might look a little like this:
// parser/dummyparser.js
define({
    serialize: function( obj ){
        var data = [];
        // convert the object into a string we can pull apart later
        for( var key in obj ){
            if(
                obj.hasOwnProperty( key ) &&
                typeof obj[ key ] != 'function'
            ){
                data.push( key + "=" + String( obj[ key ] ) );
            }
        }
        return data.join( "&" );
    }
    ,deserialize: function( str ){
        var pairs = str.split( "&" )
            ,data = {}
            ,items;
        // rebuild the data object one key/value pair at a time
        for( var x = 0; x < pairs.length; x++ ){
            items = pairs[ x ].split( "=" );
            data[ items[0] ] = items[1];
        }
        // return the data object
        return data;
    }
});
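One thing we glossed over is the util.typeOf helper the parser module depends on. It isn't shown here, but a minimal sketch - assuming we only need the built-in types plus whatever convention our backends agree on - could look like this:

// util.js - one possible sketch of the typeOf helper;
// the real implementation could use any convention it likes
define({
    typeOf: function( thing ){
        // "[object String]" -> "string", "[object Array]" -> "array", etc.
        var type = Object.prototype.toString
            .call( thing )
            .slice( 8, -1 )
            .toLowerCase();
        // custom classes all report as "object", so let them
        // advertise themselves through an optional _type property
        return ( thing && thing._type ) ? thing._type : type;
    }
});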
OK, so some rough code for sure. But it is really there to make sure our parser can do something. More importantly, we have made a pretty clean and generic system. When we need it to do more, different, or new things, we don't actually modify the parser - we just add new backends or modify existing ones to handle the tasks. A sort of client-side horizontal scaling. Not only is it flexible, but you can use all kinds of buzzwords to impress your boss - a polymorphic, future-proof parsing system that performs at web scale.
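To see the payoff, imagine that integration partner starts sending us their custom widget type ( a hypothetical name, as is WidgetParser below ). Supporting it is just a matter of registering a new backend - parser.js never changes:

// parser/widgetparser.js - a backend for the hypothetical widget type
define({
    serialize: function( widget ){
        // whatever custom wire format widgets need
        return "widget:" + widget.id;
    }
    ,deserialize: function( str ){
        return { _type: "widget", id: str.split( ":" )[1] };
    }
});

// elsewhere in the application
require([ "parser", "WidgetParser" ], function( parser, widgetParser ){
    // teach the proxy about the new type
    parser.register( "widget", widgetParser );
    // this call now delegates to the widget backend behind the scenes
    parser.serialize({ _type: "widget", id: "42" }); // "widget:42"
});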