A lazy loading solution for Angular 1.x

In this post, I’m going to show you a solution for lazy loading Angular 1.x modules. Angular 1.x doesn’t support this out of the box, and it’s a critical feature for many large applications handling serious business.

The demo project used for this post can be found here: https://github.com/jack4it/angular-1x-lazy-load.

Isn’t this problem already solved by Angular 2 and Aurelia?

Some of you might ask: given that Angular 2 is already in beta, and that there is another even better framework called Aurelia almost ready for its first release, why should we still care about Angular 1.x? There are indeed some valid reasons:

  • Many existing Angular 1.x projects will simply never migrate to a new framework
  • Both Angular 2 and Aurelia are still in beta, and it will take time before the majority are confident enough to use them on new critical projects
  • etc.

So this solution will still be helpful for at least a while.

As a bonus, I’m also going to show you how to write ES6/ES2015 code and use the systemjs loader even in today’s Angular 1.x projects. As another bonus, the lazy-loaded modules are bundled using systemjs-builder, so you get a seamless workflow for both development and production environments.

In the rest of this post, unless explicitly stated otherwise, the term Angular refers to Angular 1.x.

Why does it matter?

It’s funny that Angular fosters modular design and separation of concerns for large client applications, yet doesn’t provide a lazy loading story. The module meta language it provides is far from ideal, but it still works (the plain ES6/ES2015 module is the one true king of the module kingdom).

Modular design helps with a lot of things, including team collaboration and maintainability. But it doesn’t really help in production if all those nice modules have to be loaded up front for the app to run.

In reality, we want to load only the modules needed initially, for a faster boot experience, and lazily load the other modules when the user triggers the related functionality of the app. For most serious applications, this matters a lot for performance.

All right then, how?

So you’re still interested. Great, let’s get into the details. To achieve this lazy loading goal, three problems have to be solved:

  1. When, where and how is a module going to be triggered to load?
  2. How is a module going to be actually loaded?
  3. Once the module is loaded, how should it be registered to Angular, so that it can be used down the road?

I’ll answer these three questions in the following sections. But first, let’s set up a demo project; it’s much easier to see real working code than to just read a dry post.

The little demo project

We’ll use the following structure for the demo. Logically, the app has a homepage (the initial load) that links to two other lazy-loaded pages (powered by Angular): the contact page and the about page.

The app.* files serve as the homepage and the main entry point of the app. Each lazy-loaded module defines all of its Angular resources in a self-contained way and wires them up in its respective module.js, which, as you’ll see later, also serves as the bundling point.

(screenshot: the demo project’s folder structure)

Without further ado, let’s solve the three problems needed to lazily load Angular modules.

The trigger

In a JavaScript client app, a router component usually handles navigation. It’s natural to think that if we can somehow extend the router, we can trigger the actual loading when navigation is requested and register the loaded modules with Angular. That is indeed what our solution does. We’ll use ui-router to easily define the lazy loading points and seamlessly wire it up with systemjs to do the actual loading work.

We favor ui-router over ngRoute because it offers a more convenient path to lazy loading, via the future states feature of the ui-router-extras project. Here is how the wire-up looks.
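The original embedded snippet is no longer available, so here is a minimal sketch of how that wire-up could look. The names systemLazy, SystemLazyLoadService and addSystemLazyState follow the bullet points below; the exact ui-router-extras module name and stateFactory callback contract are assumptions simplified for illustration.

```javascript
// app.js — hedged reconstruction of the lazy loading wire-up
import angular from "angular";
import "angular-ui-router";
import "ui-router-extras";
import "oclazyload";

export const app = angular
  .module("app", ["ui.router", "ct.ui.router.extras.future", "oc.lazyLoad"])
  .config(["$futureStateProvider", $futureStateProvider => {
    // a state factory called "systemLazy": ui-router-extras calls it back
    // whenever a future state of this type needs to be materialized
    $futureStateProvider.stateFactory("systemLazy",
      ["systemLazyLoadService", "futureState", (loader, futureState) =>
        loader.load(futureState.src, futureState.exportKey)]);

    // small helper to declare a lazily loaded module as a future state
    const addSystemLazyState = (stateName, url, src, exportKey) =>
      $futureStateProvider.futureState({
        stateName, url, type: "systemLazy", src, exportKey
      });

    addSystemLazyState("contact", "/contact", "contact/module.js", "contact");
    addSystemLazyState("about", "/about", "about/module.js", "about");
  }]);
```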

The key pieces to notice in the above snippet are:

  • A state factory called systemLazy is created using the $futureStateProvider.stateFactory function. This state factory delegates the state preparation (the lazy loading) to a service called SystemLazyLoadService. More on this service in the next section
  • Then we add two future states, the contact and about modules, using the function addSystemLazyState, which in turn calls $futureStateProvider.futureState. Notice how we specify the state name, the routing URL, the source location of the JavaScript module, and optionally the export key of the Angular module (respectively contact and about, found in the module.js files)

The loading and registration

Now let’s talk about the actual module loading and the registration of the loaded Angular module. As mentioned above, this is handled by the SystemLazyLoadService, which looks like the snippet below.
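The original embedded snippet is lost; here is a hedged sketch of the service. The class name matches the text, but the exact shape (constructor injection, the export-key lookup) is an assumption:

```javascript
// systemLazyLoadService.js — sketch of the lazy load service
export class SystemLazyLoadService {
  constructor($ocLazyLoad) {
    this.$ocLazyLoad = $ocLazyLoad;
  }

  load(src, exportKey) {
    // 1. loading: let systemjs fetch, transpile and evaluate the module
    return System.import(src)
      // 2. registration: hand the loaded Angular module to ocLazyLoad,
      //    which registers its providers with the running injector
      .then(module => this.$ocLazyLoad.load(module[exportKey || "default"]));
  }
}

SystemLazyLoadService.$inject = ["$ocLazyLoad"];
```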

You may have noticed that this is just a regular ES6/ES2015 module that is also registered as an Angular service. The logic is fairly straightforward; it mainly does two things:

  1. Loading: we call System.import and let systemjs take care of the actual loading business. Thanks to the great systemjs loader, this single call is all we need for the loading part
  2. Registration: once the module is loaded via systemjs, the next big thing is to register it with Angular so that we can use it down the road. We use a nice library called ocLazyLoad to take care of this part of the business. Again, while it amounts to a single $ocLazyLoad.load call, ocLazyLoad is actually doing a lot of work behind the scenes. With ocLazyLoad’s help, we stay away from dealing with Angular’s variety of providers to register all the lazily loaded Angular resources

The last and important matter: bundling

Now we have solved the three problems needed to enable lazy loading of Angular modules. By integrating these libraries, we can seamlessly define lazy loading points and load each module only when it is needed. Nice, but there is one last very important thing before we can call this solution complete: bundling. As I mentioned above, well-crafted modules won’t help in a production environment if we don’t have a bundling story.

Using systemjs-builder, we achieve this goal easily as well. Following is an excerpt of the bundle.js file you can find in the demo project.
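The original excerpt is missing, so here is a sketch of how the three bundles could be produced with systemjs-builder; the output paths are assumptions for illustration (see the demo project for the real file):

```javascript
// bundle.js — one bundle per lazy loading point, plus the app entry point
var Builder = require("systemjs-builder");
var builder = new Builder("./", "config.js");

Promise.all([
  builder.bundle("app.js", "dist/app.bundle.js"),
  builder.bundle("contact/module.js", "dist/contact.bundle.js"),
  builder.bundle("about/module.js", "dist/about.bundle.js")
]).then(function () {
  console.log("bundling done");
});
```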

Notice at the bottom of the script that three separate bundles are generated: the app entry point (the initial load), the contact module and the about module. These bundles correspond to the future states defined in app.js.

Following is a config sample that enables the use of the generated bundle files. With this config, systemjs will load the bundles instead of the individual module files.
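The original sample is missing; a minimal sketch, assuming the bundle paths from the build step above, would use systemjs’s bundles config:

```javascript
// config.js — map each bundle file to the modules it contains; when one
// of the listed modules is imported, systemjs fetches the bundle instead
System.config({
  bundles: {
    "dist/app.bundle.js": ["app.js"],
    "dist/contact.bundle.js": ["contact/module.js"],
    "dist/about.bundle.js": ["about/module.js"]
  }
});
```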

Summary

In this post, I presented a solution that enables lazy loading of Angular 1.x modules. It helps a lot with app boot performance as the app’s functionality grows over time.

While the next-generation JavaScript frameworks like Angular 2 and Aurelia are great and almost ready for release, there is still a large base of existing apps that will simply stay on Angular 1.x, and this lazy loading solution can be a great help in maintaining them.

The accompanying demo project can be found here: https://github.com/jack4it/angular-1x-lazy-load.

Hope this helps,

-Jack


A LESS plugin for systemjs/builder

In the previous post, I briefly mentioned that systemjs/builder offers great extensibility through its plug-in mechanism. In this post, I’ll show how we can leverage this to make loading and bundling LESS files work on top of the systemjs loading pipeline. We are essentially aiming for two goals:

  1. At development time, we should be able to save and refresh to see the results of LESS file changes
  2. At build time, we should be able to compile and bundle the generated CSS into the bundle file

The github repository for this plug-in and its usage can be found here: https://github.com/jack4it/system-less.

A brief word on LESS

According to its official website: “Less is a CSS pre-processor, meaning that it extends the CSS language, adding features that allow variables, mixins, functions and many other techniques that allow you to make CSS that is more maintainable, themable and extendable.”

LESS can run in several different environments; most importantly, in the browser and in node.js. These are exactly the two environments our plug-in needs to support. However, unlike the usual cases, we will invoke the LESS API programmatically, instead of running the node.js CLI or including it on a web page with a <script /> tag.

The entry point of the LESS API looks like this:
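The original snippet is missing; a minimal node.js sketch using the less package’s promise-based render() entry point looks like this:

```javascript
// render() takes raw LESS source and yields the compiled CSS
var less = require("less");

less.render(".a { .b { color: red; } }")
  .then(function (output) {
    console.log(output.css); // the flattened ".a .b { ... }" rule
  });
```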

A quick overview of the plug-in mechanism of systemjs

According to systemjs documentation:

A plugin is just a set of overrides for the loader hooks of the ES6 module specification. The hooks plugins can override are locate, fetch, translate and instantiate.

The behavior of the hooks is:

  • Locate: Overrides the location of the plugin resource
  • Fetch: Called with third argument representing default fetch function, has full control of fetch output.
  • Translate: Returns the translated source from load.source, can also set load.metadata.sourceMap for full source maps support.
  • Instantiate: Providing this hook as a promise or function allows the plugin to hook instantiate. Any return value becomes the defined custom module object for the plugin call.

In our case, we are going to override the Translate hook, plus another hook that is undocumented but obviously necessary for the bundling scenario: bundle.

The implementation of system-less, a LESS plug-in for systemjs

Our first goal is to load LESS files and apply the generated CSS styles on the fly during development. We implement this by overriding the Translate hook like this:
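The original code is no longer embedded; the following sketch captures the three parts described below. The exact shape of the less-browser wrapper is glossed over here and should be treated as an assumption (the real implementation lives in the system-less repository):

```javascript
// system-less translate hook — browser-time path (sketch)
import less from "less/lib/less-browser";

export function translate(load) {
  // load.source already holds the LESS content fetched by systemjs,
  // so no extra network work is needed here
  return less.render(load.source).then(output => {
    // inject the compiled CSS into the DOM so the browser applies it
    const style = document.createElement("style");
    style.textContent = output.css;
    document.head.appendChild(style);
    // a LESS file produces no JavaScript module body
    return "";
  });
}
```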

There are three major parts to this implementation. First, we import the LESS browser compilation module, less/lib/less-browser; this module is a wrapper around the core LESS logic. Second, we call the render method to compile the loaded LESS file content. Notice that the file content has already been fetched by the systemjs pipeline, so we don’t need to worry about the network loading part. Third, once we get the compiled result, the CSS styles, we inject them into the DOM so that the browser can pick them up and render the related markup with the new styles.

It’s a fairly straightforward logic to compile and apply LESS files in browsers.

Now for the second goal: compiling and bundling LESS into the bundle file. This is a must-have in today’s web landscape; we can’t afford to load and compile LESS on the fly in a production system. That would kill performance. Unlike loading LESS in the browser, bundling via systemjs-builder happens in a node.js environment, so the logic is a bit different. Here is what it looks like:
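Again the embedded code is lost; here is a hedged sketch of the build-time path. The bundle hook signature and helper names are assumptions based on the description below, not the plug-in’s exact source:

```javascript
// system-less bundle hook — node.js build-time path (sketch)
import less from "less";
import CleanCSS from "clean-css";

// minified injection helper, inlined into the bundle and invoked at load time
const inject = "function(c){var s=document.createElement('style');" +
               "s.textContent=c;document.head.appendChild(s);}";

export function bundle(loads) {
  // compile every LESS file the builder gathered, then concatenate the CSS
  return Promise.all(loads.map(load => less.render(load.source)))
    .then(outputs => {
      const css = new CleanCSS()
        .minify(outputs.map(o => o.css).join("\n")).styles;
      // emit a System.register stub per file, plus one injection call
      const stubs = loads.map(load =>
        `System.register('${load.name}', [], function() {` +
        ` return { setters: [], execute: function() {} }; });`).join("\n");
      return `${stubs}\n(${inject})(${JSON.stringify(css)});`;
    });
}
```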

There are a few things to notice in this implementation. First, we have a minified version of the injection logic that gets inlined into the bundle; it is called to inject the CSS styles when systemjs loads the bundle. Second, we now have System.register stubs for each of the LESS/CSS files, which systemjs interprets correctly at load time. Third, optional for this post but a must-have for a real plug-in, we use clean-css to optimize the generated CSS. With this implementation, at build time, systemjs-builder will figure out the LESS files and compile and bundle them into the bundle file together with the other resources.

Summary

In this post, I walked through the process of developing a systemjs/builder plug-in for LESS resources. The plug-in mechanism is a powerful tool for extending systemjs/builder functionality. In fact, quite a few great plug-ins have already been developed and can be used directly in your project. With these plug-ins, we can set up a seamless save-and-refresh workflow for development and optimize loading performance in production through bundling.

Hope this helps,

-Jack


JavaScript modules and a loader, systemjs

In this post, I will talk a little bit about how to write modular JavaScript code and how to load modules with a popular loader, systemjs. This post is accompanied by this demo project: https://github.com/jack4it/es6-module-systemjs-demo.

The very first piece of JavaScript code

Every JavaScript developer can understand the following code, if you’re willing to call it “code”.
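The embedded snippet is missing here; reconstructed from the description below, it was just an input tag with an inline handler:

```html
<input type="text" value="some text to copy"
       onmouseover="javascript:this.select()" />
```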

It’s nothing but a cool little trick to help a user select the entire text inside an input box. It was the very first piece of JavaScript I wrote, I still remember, in 2000, almost 15 years ago. JavaScript could simply be used like that. In fact, 15 years ago, there were many places on the web using this kind of scattered JavaScript to do things that couldn’t be achieved with HTML alone (or tables, if you recall). The point is, JavaScript at that time was not something you would treat as a first-class web development technology. Java, PHP or ASP were; JavaScript was not.

Fast forward

15 years later, nowadays, JavaScript is not only a first-class citizen of the web front-end, but also the cool kid in many other areas, like the server side (node.js) and even the Internet of things. A snippet like onmouseover="javascript:this.select()" is not cool anymore; writing it seriously is almost a crime. JavaScript is not just a scripting language anymore. In fact, all mainstream JavaScript implementations are JIT-compiled, or at least mix compilation and interpretation, for performance reasons.

We started to write thousands of lines of JavaScript, in either one file or a set of files. This code, plus all the libraries (jQuery and Angular, just to name a few), forms a sea of JavaScript, yet most of it is loaded into the browser using the plain old <script/> tags we are all familiar with. But we all know the pain of using this tag, and the consequences if not enough attention is paid to maintaining these tags, the ordering in particular. And the round-trip overhead that many <script/> tags incur. You might argue that putting everything into one or a few bundles solves the problem. But then, what about the churn of maintaining the bundle definition files? You still have to be very careful with the ordering, etc.

The popular module formats/loaders

To solve this code organization problem, the module concept has become more and more popular over the past several years, eventually leading to a few popular module formats and their loaders. One is CommonJS, the de facto module standard on the node.js platform. Another is AMD, which was invented for browser scenarios.

The CommonJS loading scenario is relatively straightforward because the modules/files are loaded directly from the local file system; it is naturally a synchronous operation. In fact, in node.js, it is just a require() function call, as shown below:


The AMD format, however, is a bit different. In the browser world, loading resources, e.g. scripts, from a server should always be asynchronous, considering the I/O latency. Also, the module code needs to be wrapped, usually in an IIFE; otherwise, everything becomes a global. Here is an example:
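The original example is lost; below is a sketch of the AMD shape, with a tiny stand-in define() included so the snippet is self-contained (a real loader such as RequireJS resolves dependencies asynchronously over the network; this shim is synchronous and for illustration only):

```javascript
// minimal stand-in for an AMD loader's define()
const registry = {};
function define(name, deps, factory) {
  registry[name] = factory.apply(null, deps.map(d => registry[d]));
}

// an AMD module: dependencies declared up front, code wrapped in a factory
define("greeter", [], function () {
  return { greet: name => "Hello, " + name };
});

define("app", ["greeter"], function (greeter) {
  return { run: () => greeter.greet("Jack") };
});

console.log(registry.app.run()); // Hello, Jack
```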


In order to load the scripts, a browser-context-aware loader is needed. RequireJS was the de facto AMD loader, but it is a bit outdated now; I’ll show you why later. Today we have the new cool kid called systemjs; we’ll get back to its details later in this post. Based on the code snippets above, we can see clear pros and cons for each format.

The CommonJS format is really nice in the sense that we don’t need to wrap things in a function call, and the node.js loader (require()) takes care of the exports holder as well. But the bad part is also obvious: it has no async semantics in the loader. You can require anywhere in the code, but we really need loading to be async for browser scenarios. AMD and its loaders support async very well and totally work; however, the syntax, the wrapper style, is not ideal. It’s only halfway to an ideal JavaScript module solution.

The ideal module solution, ES6/ES2015

The JavaScript community has been moving very quickly lately. In particular, ES6/ES2015 has been approved, with quite a few goodies in it. I personally think the new module format is the one with the biggest potential impact on the web. With the new ES6 module format, we can rewrite the code above like this:
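The original snippet was embedded and is lost; here is a minimal reconstruction of what the two files could look like (file names are for illustration):

```javascript
// greeter.js — no wrapper function, no magic exports holder object
export class Greeter {
  greet(name) {
    return `Hello, ${name}`;
  }
}
```

```javascript
// app.js — the dependency is a plain, statically analyzable import
import { Greeter } from "./greeter.js";

console.log(new Greeter().greet("Jack")); // Hello, Jack
```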


In the above snippets, I’m also using the new ES6 class syntax, another very nice feature. Back to the module format: you can see we now have a much cleaner way of defining modules and their dependencies. There is no need to wrap things in an IIFE anymore, and no need for the magic exports holder object either. With all these good aspects, all that’s really left is a loader to make this module format work. And the loader had better work with much less overhead.

The systemjs loader

Systemjs is a loader that supports the new ES6 module format perfectly. On top of that, it also supports all the popular legacy formats: CommonJS, AMD, and even globals. Isn’t that awesome? In fact, it is even more powerful, as I’ll show below. For a detailed demo, please see this little github demo repository.

Firstly, systemjs can load various module formats, as I mentioned above. In reality, though, we don’t usually mix many different formats in one project; what we really want is the ES6 module format support. It is amazing to have this even before the mainstream browsers fully support it. Systemjs accomplishes this with the help of transpilers; the popular ones are Babel, Traceur and our beloved TypeScript. What does this mean? It means you can write ES6 modules today without worrying about browser support, because the ES6 modules will be seamlessly transpiled down to ES5, which is fully supported by all mainstream browsers. The transpilation can happen on the fly in the browser for the development workflow; for production scenarios, of course, it can and must happen at build/bundle time.

Secondly, systemjs supports plugin loaders, meaning that many other kinds of resources can be loaded via systemjs just like JavaScript. This pattern is popular today and makes it easy to manage modular web client apps (aka SPAs). For example, HTML template files can be loaded dynamically and bundled together with their feature’s JavaScript components. The same goes for LESS/CSS files: they can be authored and bundled in the same modular way, and all of it loads seamlessly via systemjs.

Thirdly, systemjs is not alone. It is accompanied by two other awesome tools, JSPM and systemjs-builder. JSPM, as the name suggests, is a package manager for browser scenarios, much as npm is for node.js. With JSPM’s support, you can easily consume both well-designed npm packages and even raw repositories on github.

Systemjs-builder is the build/bundle part of the systemjs story. Remember, bundling is a way to overcome HTTP 1.x head-of-line blocking, an issue that will eventually disappear once HTTP 2 is widely deployed. Bundling by then will be an anti-pattern we’ll need to un-learn. The loader is really the thing we are after; bundling is just the necessary feature it carries to deal with today’s reality. This is also why I personally favor the loader concept over bundler solutions like webpack. Webpack is an awesome tool too, but I don’t see a clear future for it because it solves the problem in a not-quite-correct way.

Summary

The momentum we are seeing in the JavaScript and web front-end community is very exciting, and it is producing very good stuff right now. The ES6 module format and systemjs are just two of many examples. I hope this little post is helpful for folks that are new to this world. Again, please go check out this little github demo repository to get some practical experience of how all these things work together, beautifully.

Till next time,

-Jack


A list of readings on async programming

Understanding the SynchronizationContext in ASP.NET

http://vegetarianprogrammer.blogspot.com/2012/12/understanding-synchronizationcontext-in.html

It’s All About the SynchronizationContext

http://msdn.microsoft.com/en-us/magazine/gg598924.aspx

Don’t Block on Async Code

http://blog.stephencleary.com/2012/07/dont-block-on-async-code.html

ExecutionContext vs SynchronizationContext

http://blogs.msdn.com/b/pfxteam/archive/2012/06/15/executioncontext-vs-synchronizationcontext.aspx


Something I should have been aware of two years ago

I really wish I had known about these changes two years ago; it would have saved me some hours resolving a weird gacutil issue.

http://blogs.msdn.com/b/mjeelani/archive/2010/06/07/top-2-things-you-should-know-about-the-global-assembly-cache-gac-in-net-4-0.aspx

http://blogs.msdn.com/b/astebner/archive/2006/11/04/why-to-not-use-gacutil-exe-in-an-application-setup.aspx

http://blogs.iis.net/davcox/archive/2009/07/14/where-is-gacutil-exe.aspx

Hope this helps,

-Jack


If you don’t want to use one input method for all

I use Windows 8 for my regular daily work, with English as the display language, and most of the time I type English for almost everything. However, I’m Chinese, and from time to time I need to type Chinese, for example in an IM window to chat with my friends. So, of course, I have the Chinese language and its IMEs installed.

Unlike previous versions of Windows, Windows 8 uses the Win+space hotkey to switch between IMEs. Years of the beloved Ctrl+space experience suddenly disappear. And another thing I really hate: once I switch the IME, all running applications switch with it, meaning I have to switch back and forth between English and Chinese input every time after chatting with a friend in Chinese.

Today I’d had enough of these switches, so I spent a little time on the language configuration of Windows 8, and it turns out I can get the comfortable experience back.

Just go to the Change your language preferences window, open the Advanced settings option, and turn on the Let me set a different input method for each app window option. Then log off and log back in, and voilà, all the familiar behaviors are back!

Hope this helps,

-Jack


LocalResource for Diagnostics, the size config matters

In one of my Azure projects, we use local storage to temporarily hold some logging data files, and then leverage Windows Azure Diagnostics (WAD) to transfer these files to a storage account.

We encountered a weird problem when we tried to configure the size of the above-mentioned local resource, which I’ll call LocalStorageLogDump as below. My initial assumption about this sizeInMB configuration was that as long as it stays below the limit of the virtual machine’s local storage size, which is about 200GB for a small instance, it should be fine. However, when I put 5000MB (5GB) in the config, the diagnostics monitor failed to start.

By inspecting the diagnostics assembly with Reflector, and later reading the following article, I realized that the local resource used for WAD has to be smaller than a local resource named DiagnosticStore, which defaults to 4GB and is not present in the .csdef file. After I explicitly added that configuration entry and gave it a larger value, WAD came back to work.

http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.diagnostics.diagnosticmonitorconfiguration.overallquotainmb.aspx

<LocalResources>
  <LocalStorage name="LocalStorageLogDump" cleanOnRoleRecycle="true" sizeInMB="5000" />
  <LocalStorage name="DiagnosticStore" cleanOnRoleRecycle="false" sizeInMB="10000" />
</LocalResources>

[Update: 9/16/2013]

The cleanOnRoleRecycle attribute is better set to "false", according to this post, in case you encounter the same or similar issues.
