Filtered by AngularJS

localStorage is not async, but it's FAST!

October 6, 2015
7 comments Web development, AngularJS, JavaScript

A long time ago I wrote an angular app that was pleasantly straightforward. It loads all records from the server in one big fat AJAX GET. The data is large, ~550Kb as a string of JSON, but that's OK because it's a fat-client app and it's extremely unlikely to grow by any multiples of this. Yes, it'll some day go up to 1Mb but even that is fine.

Once ALL records are loaded with AJAX from the server, you can filter the whole set, paginate, etc. It feels really nice and snappy. However, the app is slightly smarter than that. It has three cool additional features...

  1. Every 10 seconds it does an AJAX query to ask "Have any records been modified since {{insert latest modify date of all known records}}?" and if there's stuff, it updates. (A rough sketch of this follows the list.)

  2. All AJAX queries from the server are cached in the browser's local storage (note, I didn't write localStorage; "local storage" encompasses multiple techniques). The purpose of that is so that on the next full load of the app, we can at least display what we had last time whilst we wait for the server to return the latest and greatest via a slowish network request.

  3. Suppose we have a brand new browser with no local storage. Because the default sort order is always known, instead of doing a full AJAX GET of all records, it does a small one first: "Give me the top 20 records ordered by modify date" and once that's in, it does the big full AJAX request for all records. Thus bringing data to the eyes faster.
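
Point 1 boils down to something like the following rough sketch (my own illustration, not the app's actual code; the endpoint and helper functions are hypothetical):


app.controller('RecordsCtrl', function($scope, $http, $interval) {
  $scope.records = [];

  $interval(function() {
    // "Have any records been modified since the latest modify date we know of?"
    var latest = latestModifyDate($scope.records);  // hypothetical helper
    $http.get('/records/modified/', {params: {since: latest}})  // hypothetical endpoint
    .success(function(response) {
      if (response.records.length) {
        // merge the changed records into the already loaded set
        mergeRecords($scope.records, response.records);  // hypothetical helper
      }
    });
  }, 10 * 1000);
})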

All of these optimization tricks are accompanied by a flash message at the top that says: <img src="spinner.gif"> Currently using cached data. Loading all remaining records from server....

When I built this I decided to use localForage, which is a convenience wrapper over localStorage AND IndexedDB that does it all asynchronously and with proper promises. And to make it work in AngularJS I used angular-localForage so it would work with Angular's digest cycle without custom $scope.$apply() stuff. I thought the advantage of this was that, being async, the main event loop can continue doing important rendering stuff whilst the browser saves things to "disk" in the background.

Also, I was once told that localStorage, which is inherently blocking, has the risk that calling it for the first time in a while might cause the browser to take a major break to load data from actual disk into the browser's allocated memory. Turns out, that is extremely unlikely to be a problem (more about this in a future blog post). The warming up of fetching from disk and storing into the browser's memory happens when you start the browser the very first time. Chrome might be slightly different but I'm confident that this is how things work in Firefox and has been for many, many months.

What's very important to note is that, by default, localForage will use IndexedDB as the storage backend. It has the advantage that it's async to boot and it supports much larger data blobs.

So I timed it: how long does it take for localForage to SET and GET the ~500Kb JSON data? I did that like this, for example:


var t0 = performance.now();
$localForage.getItem('eventmanager')
.then(function(data) {
    var t1 = performance.now();
    console.log('GET took', t1 - t0, 'ms');
    // ...carry on with `data` as usual
});

The results are as follows:

Operation Iterations Average time
SET 4 341.0ms
GET 4 184.0ms

In all fairness, it doesn't actually matter how long it takes to save because my app doesn't depend on waiting for that promise to resolve. But it's an interesting number nevertheless.

So, here's what I did. I decided to drop all of that fancy localForage stuff and go back to basics. All I really need is these two operations:


// set stuff
localStorage.setItem('mykey', JSON.stringify(data))
// get stuff
var data = JSON.parse(localStorage.getItem('mykey') || '{}')
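
For comparison, here's a minimal sketch (mine, not from the post) of timing these synchronous calls the same way as the asynchronous ones above:


// assuming `data` holds the ~500Kb of records
var t0 = performance.now();
localStorage.setItem('eventmanager', JSON.stringify(data));
console.log('SET took', performance.now() - t0, 'ms');

var t1 = performance.now();
var cached = JSON.parse(localStorage.getItem('eventmanager') || '{}');
console.log('GET took', performance.now() - t1, 'ms');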

So, after refactoring my code, deleting (6.33Kb + 22.3Kb) of extra .js files and putting some performance measurements in:

Operation Iterations Average time
SET 4 5.9ms
GET 4 3.3ms

Just WOW!
That is so much faster. Sure, the write operation is now blocking, but it's only taking 6 milliseconds. And the fact that IndexedDB took almost half a second probably also means it has to do a lot more hard work, sweating CPU over it.

Sold? I am :)

Visual speed comparison of AngularJS and ReactJS

July 20, 2015
0 comments Web development, AngularJS, JavaScript

Last year I put together a little experiment called AJAX or Not? and blogged about it here. The basic idea was to display 1,000 rows in a table. There are several ways of doing it but I decided to compare the following three patterns:

  1. Rendering the whole table in Django server-side and return the whole HTML document.
  2. Rendering a skeleton page, then load the table content as a big chunk of HTML via AJAX.
  3. Rendering a skeleton page, and let AngularJS load all the content of the table from the server as JSON and let AngularJS render it into the DOM.

It was clear as day that the server-side rendering version was hands down the fastest. And the AngularJS rendering the slowest.

Note! AngularJS is amazing and super flexible and powerful because you don't really need to worry about how to re-render once the data changes. This is really useful when you do things like loading more data from a remote endpoint or doing some in-page filtering.

Enter ReactJS

The point of AJAX or Not was not to compare Javascript frameworks but I had some time and I thought I'd write an equivalent version of the AngularJS one with ReactJS (version 0.13.3).

Anyway, here's the code and it's using the GitHub fetch polyfill to do the AJAX query. The AngularJS code is here and here and as you can see it's using track by on the ng-repeat.
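
For context, loading the rows with the fetch polyfill boils down to something like this rough sketch (the endpoint is hypothetical, not the actual project code):


fetch('/ajaxornot/data')  // hypothetical endpoint
.then(function(response) {
  return response.json();
})
.then(function(data) {
  // hand the rows to the React component, e.g. this.setState({items: data.items})
  console.log('Loaded', data.items.length, 'rows');
})
.catch(function(error) {
  console.error('Failed to load rows', error);
});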

WebPageTest
To measure the difference I ran a comparison in WebPageTest, which I encourage you to open and study for a bit. You can watch the video or download it here.

Also, note that the Django rendered version loads jQuery. That's because the functionality dictates that clicking on a link should show a confirmation box before going to the link. I know, it's silly but it's very realistic that every page needs some Javascript functionality.

Executive summary...

  • Django server-side takes 0.8 seconds, ReactJS version takes 2.0 seconds and the AngularJS version takes 2.9 seconds.
  • The ReactJS version is the fastest to display something. It displays the header and the image first, though only 0.2 seconds before the Django server-side version.
  • The AngularJS version causes a lot more CPU utilization. This might really matter when you're on a low-end smartphone.
  • The ReactJS version causes twice as much CPU utilization as the server-side version. The AngularJS version causes twice as much CPU utilization as the ReactJS version.
  • AngularJS is slightly larger than ReactJS + fetch but I don't think this has a huge effect on the total load time.

Some other thoughts...

  • The ReactJS code is all in one place more or less. That's neat! But it's pretty darn big in terms of number of lines. AngularJS code is split half in the Javascript code and half in the HTML.
  • It's clear, if you want a fast loading page, avoid Javascript as much as you possibly can.
  • This experiment is very optimized in how it gets the data to be displayed. In fact, the server-side rendering time is close to 0 seconds because the whole HTML blob is stored in memcached. A more realistic thing is that extracting the data would take a lot longer if the query isn't so easy to cache. That would be a huge disadvantage for the fully server-side rendered version since if the data query takes a long time you'll sit and stare at a white screen longer. Doing the AJAX approach would definitely be a nicer experience.
  • The difference isn't that big. Both fancy Javascript frameworks have amazing features that leave jQuery in the dust but if you want your page to load crazy-fast, do as much server-side as you possibly can.

Premailer.io

July 8, 2015
3 comments Python, Web development, AngularJS, JavaScript

Premailer is a Python library for turning HTML + CSS into HTML with all the CSS embedded as inline style attributes. This is sadly very necessary to ensure that your fancy HTML emails look spiffy across all email clients and email webapps.

So, last week I put together a little site to test the library via a browser: Premailer.io

It's just a simple webapp with a form where you can enter HTML in three different ways: textarea, by URL and by file upload.

You can also override all the possible advanced options that premailer supports.

What's kinda cool is that you can get a preview of how the HTML document will look in an iframe that is dynamically loaded with the result from the conversion.

The webapp is of course open source and available on github.com/peterbe/premailer.io. The front-end is an AngularJS app and the build system is Lineman.js. The server is a Falcon server running on uWSGI via Nginx.

There's very little fancy here. There are no limitations or protections. I just hope it becomes handy for people to test premailer out.

The inspiration came from MailChimp's CSS Inliner Tool which is cute but very basic and doesn't allow you the same kinds of input.

If anybody with some AngularJS or highlight.js chops has time, I'd love some help fixing why the HTML is not syntax highlighted.

AngularJS $q notify and resolve as a local GET proxy

April 18, 2015
1 comment AngularJS, JavaScript

Something tells me there are already solutions like this out there that are written by much smarter people who have tests and package.json etc. Perhaps my Friday-brain failed at googling them up.

So, the issue I'm having is an angular app that uses ui-router to switch between controllers.

In almost every controller it looks something like this:


app.controller('Ctrl', function($scope, $http) {
  /* The form that needs this looks something like this:
      <input name="first_name" ng-model="stuff.first_name">
   */
  $scope.stuff = {};
  $http.get('/stuff/')
  .success(function(response) {
    $scope.stuff = response.stuff;
  })
  .error(function() {
    console.error.apply(console, arguments);
  });
})

(note: ideally you'd push this stuff into a service, but doing it here in the controller illustrates the point)

So far so good. But so far so slow.

Every time the controller is activated, the AJAX GET is fired and it might be slow because of network latency.
And I might switch to this controller repeatedly within a single load of the app.

So I wrote this:


app.service('localProxy',
    ['$q', '$http', '$timeout',
    function($q, $http, $timeout) {
        var service = {};
        var memory = {};  // in-memory cache, keyed by URL

        service.get = function(url, store, once) {
            var deferred = $q.defer();
            var already = memory[url] || null;
            if (already !== null) {
                // Found in the in-memory cache; hand it out on the next tick,
                // either as the final result (once) or as a progress notification.
                $timeout(function() {
                    if (once) {
                        deferred.resolve(already);
                    } else {
                        deferred.notify(already);
                    }
                });
            } else if (store) {
                // Not in memory; fall back to the browser's sessionStorage.
                already = sessionStorage.getItem(url);
                if (already !== null) {
                    already = JSON.parse(already);
                    $timeout(function() {
                        if (once) {
                            deferred.resolve(already);
                        } else {
                            deferred.notify(already);
                        }
                    });
                }
            }

            // Always ask the server too, and remember the response for next time.
            $http.get(url)
            .success(function(r) {
                memory[url] = r;
                deferred.resolve(r);
                if (store) {
                    sessionStorage.setItem(url, JSON.stringify(r));
                }
            })
            .error(function() {
                deferred.reject(arguments);
            });
            return deferred.promise;
        };

        // Manually update the cache, e.g. after the user has changed the data.
        service.remember = function(url, data, store) {
            memory[url] = data;
            if (store) {
                sessionStorage.setItem(url, JSON.stringify(data));
            }
        };

        return service;
    }]
)

And the way you use it is that the promise basically yields twice: first from the "cache", then from the network response.

So, after you've used it at least once, when you request data from it, you first get the cached stuff (from memory or from the browser's sessionStorage) then a little bit later you get the latest and greatest response from the server. For example:


app.controller('Ctrl', function($scope, $http, localProxy) {
  $scope.stuff = {};
  localProxy.get('/stuff/')
  .then(function(response) {
    // network response
    $scope.stuff = response.stuff;
  }, function() {
    // reject/error
    console.error.apply(console, arguments);
  }, function(response) {
    // update
    $scope.stuff = response.stuff;
  });
})

Note how it sets $scope.stuff = response.stuff twice. That means that the page can load first with the cached data and shortly after the latest and greatest from the server.
You get to look at something whilst waiting for the server but you don't have to worry too much about cache invalidation.

Sure, there is a risk. If your server response is multiple seconds slow, your user might, for example, start typing something into a form (once it's loaded from cache) and when the network request finally resolves, what they typed in gets overwritten or conflicts.

The solution to that problem is that you perhaps put the form in a read-only mode until the network request resolves. At least you get something to look at sooner rather than later.
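
Here's a minimal sketch of that idea (mine, not from the post): keep the form read-only until the real network response has arrived.


app.controller('Ctrl', function($scope, localProxy) {
  $scope.stuff = {};
  // e.g. <input name="first_name" ng-model="stuff.first_name" ng-readonly="locked">
  $scope.locked = true;
  localProxy.get('/stuff/')
  .then(function(response) {
    // network response; safe to let the user edit now
    $scope.stuff = response.stuff;
    $scope.locked = false;
  }, function() {
    console.error.apply(console, arguments);
  }, function(response) {
    // cached data; show it, but keep the form locked
    $scope.stuff = response.stuff;
  });
})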

The default implementation above doesn't store things in sessionStorage. It just stores it in memory as you're flipping between controllers. Alternatively, you might want to use a more persistent approach so then you instead use:


controller( // same as above
  localProxy.get('/stuff/', true)
  // same as above
)

Sometimes there's data that is very unlikely to change. Perhaps you just need the payload for a big drop-down widget or something. In that case, it's fine if it exists in the cache and you don't need a server response. Then set the third parameter to true, like this:


controller( // same as above
  localProxy.get('/stuff/', true, true)
  // same as above
)

This way, it won't fire twice. Just once.

Another interesting expansion on this is if you change the data after it comes back. A good example is if you request data to fill in a form that the user updates. After the user has changed some of it, you might want to pre-emptively cache that too. Here's an example:


app.controller('Ctrl', function($scope, $http, localProxy) {
  $scope.stuff = {};
  var url = '/stuff/';
  localProxy.get(url)
  .then(function(response) {
    // network response
    $scope.stuff = response.stuff;
  }, function() {
    // reject/error
    console.error.apply(console, arguments);
  }, function(response) {
    // update
    $scope.stuff = response.stuff;
  });

  $scope.save = function() {
      // update the cache
      localProxy.remember(url, $scope.stuff); 
      $http.post(url, $scope.stuff);
  };
})

What do you think? Is it useful? Is it "bonkers"?

I can think of one possible beautification, but I'm not entirely sure how to accomplish it.
Thing is, I like the API of $http.get in that it returns a promise with functions called success, error and finally. The ideal API would look something like this:


app.controller('Ctrl', function($scope, $http) {
  $scope.stuff = {};
  // angular's $http service expanded
  $http.getLocalProxy('/stuff/')
  .success(function(cached, response) {
    /* Imagine something like:
        <p class="warning" ng-if="from_cache">Results you see come from caching</p>
     */
    $scope.from_cache = cached;
    $scope.stuff = response.stuff;
  })
  .error(function() {
    console.error.apply(console, arguments);
  });
})

That API looks and feels just like the regular $http.get function but with an additional first argument to the success promise callback.
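
For what it's worth, here's a rough sketch of how one could get close to that API by wrapping the localProxy service from above. It's my own sketch, not a finished solution: it exposes a separate service rather than decorating $http itself, and it skips finally.


app.factory('localProxyHttp', ['localProxy', function(localProxy) {
  return {
    get: function(url, store, once) {
      var successFn = angular.noop;
      var errorFn = angular.noop;
      localProxy.get(url, store, once)
      .then(function(response) {
        successFn(false, response);   // fresh from the network
      }, function() {
        errorFn.apply(null, arguments);
      }, function(response) {
        successFn(true, response);    // served from the cache
      });
      var chainable = {
        success: function(fn) { successFn = fn; return chainable; },
        error: function(fn) { errorFn = fn; return chainable; }
      };
      return chainable;
    }
  };
}]);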

AJAX or not

December 22, 2014
1 comment Web development, AngularJS, JavaScript

From the It-Depends-on-What-You're-Building department.

As a web developer you have a job:

  1. Display a certain amount of database data on the screen
  2. Do it as fast as possible

The first point is these days easily taken care of with the likes of Django or Rails, which make it über easy to write queries that you then use in templates to generate the HTML and, voilà, you have a web page.

The second point is taken care of with a myriad of techniques. It's almost a paradox. The fastest way to render something on the screen is to generate everything on the server and send it whole. It means the browser can very quickly (and boosted by the GPU) render something on the screen. But if you have a lot of data that needs to be displayed it's often better to send just a little bit of HTML and then let some Javascript kick in and take care of extracting the rest of the information using AJAX.

Here I have prepared three different versions of ways to display a bunch of information on the screen:

https://www.peterbe.com/ajaxornot/

Visual comparison on WebPagetest
What you should note and take away from this little experimental playground:

  1. All server-side work is done in Django but it's served straight out of memcache so it should be fast server-side.

  2. The content is NOT important. It's just a list of blog posts and their categories and keywords.

  3. To make it somewhat realistic, each version needs to 1) display a JPG and 2) have a Javascript onclick event that throws a confirm() dialog box.

  4. The AngularJS version loads significantly slower but it's not because AngularJS is slow, but because it's able to do so much more later. Loading a Javascript framework is like an investment. Big cost upfront and small cost later when you need more magic to happen without having a complete server refresh.

  5. Views 1, 2 and 3 are all imperfect versions but they illustrate the three major approaches to solving the problem stated at the top of this blog post. The other views are attempts at optimization.

  6. Clearly the "visually fastest" version is optimization version 5, which is a fork of version 2 that loads, on the server-side, everything that is above the fold and then takes care of the content below the fold with AJAX. (A rough sketch of the idea follows this list.)
    See this visual comparison

  7. Optimization version 4 was a silly optimization. It depends on the fact that JSON is more "compact" than HTML. When you Gzip the content, the difference in size doesn't matter anymore. However, it's an interesting technique because it means you can do all business logic rendering stuff in one language without having to depend on AJAX.

  8. Open the various versions in your browser and try to "feel" how the pages load. Ask your gut which version you prefer; do you prefer a completely blank screen and a browser loading spinner, or do you prefer to see some skeleton structure first whilst waiting for the bulk of the content to come in?

  9. See this as a basis of thoughts and demonstration. Remember the very first sentence in this blog post.
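
The version 5 idea from point 6 boils down to something like this rough sketch (mine, not the experiment's actual code; the endpoint and markup are hypothetical): the server renders the rows above the fold, and the rest is fetched as a chunk of HTML and appended once the page has loaded.


document.addEventListener('DOMContentLoaded', function() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/ajaxornot/rest-of-the-rows');  // hypothetical endpoint
  xhr.onload = function() {
    // the server returns the remaining table rows as ready-made HTML
    document.querySelector('table tbody')
      .insertAdjacentHTML('beforeend', xhr.responseText);
  };
  xhr.send();
});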

One-way data bindings in AngularJS 1.3

December 11, 2014
1 comment AngularJS, JavaScript

You might have heard that AngularJS 1.3 has "one-time bindings", which means you can print the value of a scope variable with {{ ::somevar }} and that this is really good for performance because, once rendered, it doesn't add to the list of things that the angular app needs to keep worrying about. I.e. it's one less thing to watch.

But what's a good use case of this? This is a good example.

Because ng-if="true" will cause the DOM element to be re-created, it will go back to the scope variable and re-evaluate it.
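
A minimal sketch of the idea (mine, not from the post), with the template embedded as a comment:


app.controller('UserCtrl', function($scope) {
  /* The template would look something like this:
      <div ng-if="showDetails">
        <!-- one-time binding: re-evaluated each time ng-if re-creates the element -->
        <p>{{ ::user.name }}</p>
      </div>
   */
  $scope.user = {name: 'Jane'};  // hypothetical data
  $scope.showDetails = true;
});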
