Monthly Archives: July 2013

Yes it’s “REST” but is it any good?

I’m not picking on the author of this discussion of different levels of REST APIs. As a matter of fact, I thought it was quite a good article. But the advice I see over and over again for how to build remote APIs seems focused on the URLs and how they are formed as the measure of whether an API is “good” or “bad”. So let me just say this… If your API forces what should be an atomic unit of work to be performed via multiple calls to the back end, your API is bad whether it conforms to all the REST requirements or not.

So, if you’re taking money out of one account via one call and then depositing it into another account via a second call, you’ve:

  • allowed business logic to leak into code outside the back end
  • created a situation that is almost guaranteed to result in a corrupted database, keystore, or whatever at some point
  • made anyone using your API work much harder (for example, if after adding a new user, they also need to add that user to a group, add an avatar picture, etc., all as separate operations)

One alternative I would recommend looking at (I’m not advocating this as some kind of REST replacement, just something you need to take ten minutes to learn about) is CQRS. The link is to a warehouse of info on it, but the basic idea is that an API will have queries, where you just get data for display, and commands (think Gang of Four Command pattern), for performing actions in the system. Thus moving money from one account to another might be one command and adding a user might be another. In each case, you’ll provide enough data with the command to make sure the back end can completely perform it in one atomic operation and the front end stays out of the business logic/sequencing business.
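
To make that concrete, here’s a rough sketch of what a single command might look like from an AngularJS client. The endpoint, module, and field names here are hypothetical, not taken from any real API; the point is simply that one call carries everything the back end needs to do the whole job atomically.

// A hypothetical CQRS-style "transfer funds" command. The client sends
// everything required, so the server can perform the transfer as a single
// atomic operation rather than being driven through several calls.
angular.module("bankApp", []).factory("accountCommands", function ($http) {
  return {
    transferFunds: function (fromAccountId, toAccountId, amount) {
      return $http.post("/api/commands/transfer-funds", {
        fromAccountId: fromAccountId,
        toAccountId: toAccountId,
        amount: amount
      });
    }
  };
});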

I’ve yet to read the grand treatise on how to create a great CQRS API which runs on REST. I’d love to read such a thing if you find one. Please leave a comment below if you do.

AngularJS Services and Promises

Promises, Deferred, Futures

Call it what you will, promises are another mechanism for dealing with the asynchronous nature of some of the things you’ll do within JavaScript: calling remote services, timers firing, animations completing, etc. In each case you could just have a callback, but what if you wanted to attach several callbacks to the completion or failure of a single asynchronous job? That’s difficult with callbacks, but promises make it easy.

Promises also make it easy to tie together several asynchronous actions and treat them as one, so you can react only when all of them have completed or as soon as one of them has a problem. This is a much better solution than the nesting of callbacks you often see in JavaScript code. Nested callbacks are often written in a way that forces requests to be made serially when they could be executed in parallel and perform better as a result.
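
Here’s a minimal sketch of that using AngularJS’s $q service, which I’ll talk about below (the URLs are placeholders, and $q and $http are assumed to have been injected):

// Start both requests in parallel and react once, when both have finished
// (or as soon as either one fails).
var userPromise = $http.get("/api/users/42");
var groupsPromise = $http.get("/api/users/42/groups");

$q.all([userPromise, groupsPromise]).then(function (responses) {
  // responses[0] and responses[1] hold the results, in the same order as the
  // array passed to $q.all().
  console.log(responses[0].data, responses[1].data);
}, function (error) {
  // Called if either of the requests fails.
  console.log("One of the calls failed", error);
});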

Promises can simplify code in another way because a promise which has already resolved and one which has yet to resolve are both used in exactly the same way.
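
For instance (again just a sketch, assuming $q and $http have been injected), a value wrapped with $q.when() is an already resolved promise, yet the code consuming it looks exactly like the code consuming a still pending $http promise, and you can attach as many handlers as you like to either one:

// An already resolved promise and one still in flight are consumed the same way.
var localValue = $q.when({ message: "Already here" }); // resolves immediately
var remoteValue = $http.get("/api/slow-thing");        // resolves eventually

localValue.then(function (value) { console.log(value.message); });
remoteValue.then(function (response) { console.log(response.data); });

// Nothing stops you from attaching a second handler later, even after the
// promise has resolved.
remoteValue.then(function (response) { console.log("Also notified"); });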

Promises in AngularJS

AngularJS offers promises via a service called $q. It is modeled after a promise library called (not surprisingly) Q. You’ll see promises returned from several of AngularJS’s services, including $http (for service calls), $timeout, and $route (though I admit I’m not sure how it’s being used in the latter case).

AngularJS even has support for promises being assigned directly to $scope variables. So you can take the promise you get back from a remote service call and assign it directly to a variable knowing that when the service call finally returns, the variable will be updated and any binding attached to the variable will redisplay. That’s significantly easier than other frameworks I’ve used in the past.

Service Calls

I thought I’d give three examples of calling a service with $http: one showing how the promise it returns may be used directly, and two showing how caching and aggregation can be made easier thanks to promises.

For these examples I’m just calling the OpenKeyval service to store and retrieve some JSON data. It’s simple, it’s free, and I know I can count on any code I give you being able to access it without any special API key.

Source code for all of the following and more is here: https://github.com/JohnMunsch/AngularJSExamples; in particular, look at app/scripts/controllers/promises.js and app/view/promises.html to see the code specific to these examples.

All of the following are also available as running examples on GitHub.

    • Example 1: Store and retrieve some values. Promises returned from $http are assigned directly to $scope variables and AngularJS dereferences them automatically.
    • Example 2: Make multiple calls to different services and merge the results of all the calls into a single returned value.
    • Example 3: A common pattern for services is to be able to satisfy requests from a local cache when appropriate. Promises can make that an easy upgrade for an existing service (a stripped-down sketch of the idea follows this list).
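
As a taste of Example 3, here’s a stripped-down sketch of the caching idea (not the actual code from the repository; the module name and URL are placeholders). The service hangs on to the promise from the first $http call and hands that same promise back to every later caller, whether it has resolved yet or not.

// A hypothetical service that caches the promise from the first call, so the
// remote service is only hit once no matter how many callers ask for the data.
angular.module("promisesApp", []).factory("cachedData", function ($http) {
  var cachedPromise = null;

  return {
    get: function () {
      if (!cachedPromise) {
        // First call: start the request and remember the promise.
        cachedPromise = $http.get("/api/some-data");
      }

      // Every caller, first or not, gets a promise it can call .then() on.
      return cachedPromise;
    }
  };
});

Because callers always get a promise back, they never need to know whether the data came from the cache or from a fresh call, which is exactly the uniformity described earlier.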

A Shortcoming

$q is not as full featured as Q is. That’s not surprising because AngularJS is only trying to provide as much functionality as it needs without requiring you to load up another library. AngularJS does the same kind of thing when it comes to jQuery. It has a minimal subset of functionality built in (called jQLite) that it uses if jQuery wasn’t loaded prior to loading AngularJS.

However, this is where the parallel breaks down. If you do load jQuery prior to loading AngularJS, then it won’t use its own built-in version, but will instead use the more full-featured jQuery implementation. Yet, as far as I know, loading up the Q library (or jQuery, for its Deferred objects) before you load up AngularJS doesn’t cause AngularJS to use Q promises or jQuery Deferreds instead of its own subset implementation. That seems to me like an oversight or shortcoming in the implementation.

Nevertheless, I hope I’ve shown with the examples above that you can still do some really interesting things even with the limited set of functionality $q provides.

Follow Up

After a few weeks I realized that something I didn’t really cover is how assigning a promise to a variable in $scope can complicate access to that same variable in the controller. For example, if you do something like this:

$scope.someValue = $http.get(someURL);

You’ve assigned a promise to the variable, and from the standpoint of code in your view, nothing really changed. You can still do {{someValue.someSubValue}} and act blissfully unaware that someValue is a promise rather than a plain JavaScript object. But try the same thing from the controller code and you’ll have all kinds of problems. $scope.someValue.someSubValue isn’t something you can access in the controller because $scope.someValue is actually a promise and will always stay a promise, even once it’s resolved. The only thing that changes once it’s resolved is that any .then() you call on it will immediately invoke the function you pass in with the value the promise now caches. Thus you will forevermore have to use it within the controller like so:

$scope.someValue.then(function (value) { $log.info(value.someSubValue); });

If you have some values you’re getting via AJAX calls, or via some other mechanism that returns a promise, and you need to access them as much or more from a controller than from view code, then assign only the resolved value into the scope, not the promise. For example:

$http.get(someURL).then(function (value) { $scope.someValue = value; });

In that case you’re waiting until the promise resolves and assigning only the final returned value into the scope. Now both the view code and the controller can access $scope.someValue.someSubValue directly.

Yeoman, Grunt, and Bower

Aside from being a top-end legal team, Yeoman, Grunt, and Bower are also the names of some front-end development tools I use when working on new projects these days, or whenever it’s going to be easier to throw together some code and test it out in a testbed rather than embedded in a larger project. First I’ll go over what role each of these has in the development of a modern website front-end. Then I’ll take a brief side trip into a tool (Node.js) which isn’t directly part of any of them, but which is required in order to install them. Then we’ll get into some specifics for all three as I demonstrate using them to get started on an AngularJS project.

Yeoman is the code generator. It fills a role similar to that of the rails command if you’ve ever used Ruby on Rails, because it can be used to generate whole projects or individual pieces of code depending upon the specific code generators it has installed. It gives you a quick shell from which to start a project.

Grunt is the builder and utility tool. As with Yeoman it’s modular in nature and as a result it can fulfill a huge variety of roles. Typical things you might call upon Grunt to perform would be concatenating multiple CSS or JavaScript files together for faster download, minifying CSS or JavaScript files (again for faster download), running a small local server to make developing your website easier, looking for errors in your JavaScript, running JavaScript unit tests, running compilation tools for CoffeeScript, LESS, or Sass, and the list goes on and on. If you have a Java background you might think of Ant as the closest analogy for Grunt.

Finally, Bower is your web component installer. If you find yourself needing to install a JavaScript library or a CSS/JavaScript framework, Bower can handle that and even make sure that any other components upon which it depends (for example, Backbone.js requires Underscore.js) are also automatically installed. Java tools like Maven and Ivy offer similar functionality so that each developer can get the libraries he/she needs installed without having to check all of them into version control systems with the code being developed.

That side trip I mentioned

As strange as it might seem to require an installation utility (npm) in order to install an installation utility (Bower), that’s exactly what I’m going to tell you to do. These days any JavaScript software worth its salt seems to be installed in much the same way. First you go get the latest version of Node.js (there’s a big green install button in the middle of the http://nodejs.org/ page). When you install that you’ll also pick up a nifty little utility known as the Node Package Manager, or npm for short. With npm installed you can then install all three of the aforementioned tools: Yeoman, Grunt, and Bower.

As I mentioned earlier, you use the npm tool you just installed to install the other three by running “npm install -g yo” (the details are covered on Yeoman’s website, http://yeoman.io/, but that’s actually all you have to do). You’ll see npm download, compile, and install tons of stuff, but you won’t actually be involved in the process. In that respect it’s much like other package managers such as Yum or RPM.

Finally you’ll need some kind of Yeoman generator installed in order to generate our starting shell: “npm install -g generator-angular”

Yeoman

With all the installation out of the way, let’s create a directory for a new project (“mkdir TestApp”), change directory into the newly created directory, and just type “yo” at the command line and see what we see.

Pick “Run the Angular generator” and hit enter, then answer the various yes or no questions it asks in order to generate the shell you want for your app. At the end it will also run npm to install the local copies of tools you may need and Bower to install the various libraries you need for your shell (for example, AngularJS and jQuery).

Yeoman has now fulfilled most of its mission, but I want to look at some files it generated and then come back to using it as a tool even after we have our shell. Yeoman generated many files worth noting, but three in particular are:

  • package.json – This file tells npm which Node.js packages need to be installed in this particular project. Most of what you’ll find in here are packages Grunt needs to do its job. Rather than installing these globally, they are instead installed on a project by project basis so you can have different versions of the tools used by different projects.
  • bower.json – This file tells Bower which components to install in the app/bower_components directory. These are JavaScript and CSS components which you’ll use within the website you’re building.
  • Gruntfile.js – This file contains the task instructions for Grunt so it knows how to perform a set of different tasks for a given project (a stripped-down sketch follows this list). If you look near the bottom of the file you’ll see that the Angular generator has created a file with the tasks “server”, “test”, and “build” which in turn call many other sub-tasks to perform their work.
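
To give a feel for the shape of that file, here’s a heavily stripped-down sketch (the real generated Gruntfile is much longer, and the exact plugins and options vary with the generator version):

// Gruntfile.js (abbreviated sketch, not the generated file verbatim)
module.exports = function (grunt) {
  grunt.initConfig({
    // Each plugin gets its own configuration block: jshint, karma, concat,
    // uglify, watch, and so on in the generated file.
    jshint: {
      all: ["app/scripts/{,*/}*.js"]
    }
  });

  grunt.loadNpmTasks("grunt-contrib-jshint");

  // Named tasks are just lists of sub-tasks to run in order; the generated
  // file defines "server", "test", and "build" this way.
  grunt.registerTask("test", ["jshint"]);
  grunt.registerTask("build", ["jshint"]);
  grunt.registerTask("default", ["test", "build"]);
};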

It’s also worth noting that generating the shell isn’t the only use Yeoman has in our development. Just as a Rails user can continue to use the Rails command to generate new models, migrations, etc. within an already constructed project, the Angular generator can be used to generate services, routes, directives, filters, etc. as detailed here: https://github.com/yeoman/generator-angular

Grunt

Running Grunt from the command line with “grunt” should perform a default sequence where the JavaScript code is checked for errors, unit tests are run, and finally a “distribution” version of the application is built into the “dist” directory. Grunt will have concatenated CSS files and JavaScript files into common chunks like 7d151330.main.css, bd6ce9e3.plugins.js, and c2ac0a01.scripts.js, and the references to the original names within the HTML will have been replaced with the new file names. When you update some of your scripts and rebuild, the file names will be different and there will be no need to worry about browsers continuing to use old cached versions of your scripts. You never have to perform this step; all of the files within the app directory may be deployed as-is whenever you’re ready. But Grunt performs a variety of optimization tasks which can make for a more performant site if you let it.

One of the handiest things Grunt offers via the generated Gruntfile.js is “grunt server”. Running that will compile any files which need compiling (for example, if you use CoffeeScript rather than JavaScript, or you’re using Sass), start up a server running the application, launch the index.html page in your default browser, and then watch for changes as you edit the CSS, HTML, and JavaScript, re-running compiles as needed and reloading the page within the browser as you work on the website. I just love this feature because of how easy it makes it for me to work on features and immediately see the results in my browser. Often I don’t even have to lift my fingers from the keyboard or switch apps to my browser, because I can see whether my changes accomplished what I wanted simply by looking at the automatically refreshed page.

At this point you would be forgiven for thinking that there was little point in installing Node.js except that it gave us the npm tool we needed to install all our other software, but actually Node.js is also being used behind the scenes by Grunt to run the local server I mentioned above.

Bower

Last but not least is Bower. It lets you search for packages you may need (for example, “bower search underscore” to find Underscore.js) and install them into your project (for example, “bower install --save underscore” to install the aforementioned package and add it to the list of dependencies in the bower.json file). Bower is also capable of understanding the difference between packages needed only for development (for example, unit testing tools) and those needed for both development and runtime.

Since the .gitignore for the project will normally list both the app/bower_components and the node_modules directory, you won’t be checking in those pieces with your code (assuming you use Git as your version control mechanism). Instead, whenever you check the project out onto a new machine, just run “npm install” and “bower install” within the project directory and the tools will use the package.json and bower.json files to make sure all the needed pieces are installed.

But that’s not all

Keeping these tools up-to-date might prove difficult if not for the fact that the Node Package Manager can do it automatically: “npm update -g” updates all your global software like Yeoman, Bower, etc., and “npm update” within the project directory updates the Grunt packages and other local dependencies. I’m not going to contend that it always goes flawlessly; much of this software is still beta and undergoing lots of changes, so there have been days when I needed to remove it all and reinstall. But that’s rare and usually takes only a few minutes.