The Road to Angular 2.0 part 2: ES6

Intro


I gave a presentation at the GOTO conference in Amsterdam titled: The Road to Angular 2.0. In this presentation, I walked the Road to Angular 2.0 to figure out why Angular 2.0 was so different from 1.x.

This series of blogposts is a follow up to that presentation.


ES6


Last week we discussed the new template syntax in Angular 1.x. This week it is time to discuss ES6 and how it affects Angular 2.0.

ECMAScript 6 is the next version of JavaScript. The specs have been frozen and now it is up to the browser vendors to implement them. ES6 brings us some exciting new features. Let's take a whirlwind tour and look at some of them.



Whirlwind tour


Fat Arrows


JavaScript is becoming more 'functional' with each iteration. ES5 added map, reduce, filter and more. These functions take other functions as arguments, and the functions that are passed in become less readable when they are inlined. For example, this is quite verbose:
var squared = [1, 2, 3, 4, 5].map(function(x) {
  return x * x;
});

With ES6's 'fat arrow' notation, writing lambda expressions (anonymous functions) becomes really easy:
var squared = [1, 2, 3, 4, 5].map((x) => x * x);

The fat arrow was created to allow us to write really short function definitions. Let's break down another example:
var square = x => x * x;

console.log(square(5)); // -> 25

Here you can see the function 'square' being defined as: x => x * x. What this says is: define a function with one parameter called x, which evaluates to x * x. The value of the expression is implicitly the return value, so there is no need for a return statement.

You can also define functions which take multiple parameters like so:
var add = (a, b) => a + b;

console.log(add(10, 5)); // prints 15

When creating a function with multiple parameters you must define them within parentheses.
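The parentheses are also required when a function takes no parameters at all:

```javascript
// Zero parameters: an empty pair of parentheses is required
var greet = () => 'hello';

console.log(greet()); // prints 'hello'
```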

You can also have multiple statements within a fat arrow by using brackets:
var squareAndPrint = x => {
  var squared = x * x;
  console.log(squared);
  return squared;
};

var fourSquared = squareAndPrint(4); // Prints 16 and returns 16

The fat arrow also has one other nice property: it doesn't change the 'this' context. Compare and contrast the following ES5 and ES6 code:
// ES5
var maarten = {
  name: 'Maarten',
  age: 25,
  birthDay: function() {
    console.log(this.name + ' is ' + this.age);
    var self = this; // setTimeout changes this context, so keep it safe.

    setTimeout(function() {
      self.age += 1;
      console.log(self.name + ' is ' + self.age);
    }, 1000);
  }
}

maarten.birthDay();
// Prints Maarten is 25
// Prints Maarten is 26

// ES6
var bert = {
  name: 'Bert',
  age: 65,
  // ES6 enhanced object literals allow for these shorthand method definitions
  birthDay() { 
    console.log(`${this.name} is ${this.age}`); // ES6 template strings 

    setTimeout(() => {
      this.age += 1;
      console.log(`${this.name} is ${this.age}`);
    }, 1000);
  }
}

bert.birthDay(); 
// Prints Bert is 65
// Prints Bert is 66

setTimeout normally changes the 'this' context, which is why in ES5 you often bind 'this' to some variable for later use. In the example this variable was called 'self'. The fat arrow keeps the 'this' of the scope where it was defined, so in the example above 'this' inside the arrow function is still 'bert'. This makes 'this' act a little more as you would expect it to work. For more info see: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/Arrow_functions

Const


In ES6 you can define constants which cannot be reassigned via the 'const' keyword:
const PI = 3.14;

PI = 15; // Error: PI is read-only

const PI = 3.15; // Error: PI has already been declared

Constants cannot be reassigned, but the values they hold can still be mutated:
const names = [];
names.push('Sander');
names.push('Stefan');
console.log(names); // prints ['Sander', 'Stefan'];

Constants are lexically scoped:
function greet(greeting) {
  const NAME = 'Maarten';
  console.log(greeting + ' ' + NAME);
}

greet('Howdy!'); // logs: Howdy! Maarten

console.log(NAME); // NAME is not defined

NAME is created inside the scope of the greet function. Outside of the greet function NAME is not defined.

Let


'let' is a lot like 'var', except it is scoped to the nearest enclosing block. For example:
if (true) {
  let x = 10;
  console.log(x);
}

console.log(x); // Error 'x' is not defined in this scope

Here you can see that 'x' is only available inside the 'if' block, because that is where 'x' was defined. If 'x' had been defined with 'var' however, the number 10 would have been printed twice. So 'let' allows you to scope variables more tightly.
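To make the contrast concrete, here is the same snippet with 'var' instead of 'let':

```javascript
// 'var' is function-scoped, not block-scoped,
// so 'x' leaks out of the 'if' block:
if (true) {
  var x = 10;
  console.log(x); // prints 10
}

console.log(x); // prints 10 again
```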

However let definitions are accessible in child scopes:
let x = 'hello';

console.log(x); // prints 'hello';

if (true) {
  console.log(x); // prints 'hello'
}

console.log(x); // prints 'hello';

Redefining a let in a child scope does not affect the outer scope's let definition, because a let is defined per scope, for example:
let x = 'hello';

console.log(x); // prints 'hello';

if (true) {
  let x = 10;
  console.log(x); // prints 10
}

console.log(x); // Still prints 'hello';

When you try to redefine a 'let' in a child scope by using the 'let' from the parent scope you get a ReferenceError:
let x = 'hello';

console.log(x); // prints 'hello';

if (true) {
  let x = x + ' world!'; 
  console.log(x); // ReferenceError: can't access lexical declaration `x' before initialization
}

This can be explained as follows: in the statement let x = x + ' world!' the second 'x' refers to the 'let x' being declared on that very line, not to the let x = 'hello' in the outer scope. Within x + ' world!' that 'let' is still uninitialized (it is in the so-called temporal dead zone), which causes the error.

Destructuring


Destructuring makes it easy to get values from complex objects and assign them to variables.

For example to get certain properties from an object and assign them:
let frame = {
  x: 10,
  y: 200,
  width: 100,
  height: 300
};

let {x, y} = frame;

console.log(x); // prints 10
console.log(y); // prints 200

You can do the same thing for 'positions' in an array:
let tuple = ['Maarten', 16, 1989];

let [name, age, birthYear] = tuple;

console.log(name); // prints 'Maarten'
console.log(age); // prints 16
console.log(birthYear); // prints 1989

You can use destructuring on a function's parameters too:
let frame = {
  x: 10,
  y: 200,
  width: 100,
  height: 300
};

function moveBy({x, y, width, height}, [dx, dy]) {
  return {
    x: x + dx,
    y: y + dy,
    width, // In ES6 this is equivalent to width: width,
    height // In ES6 this is equivalent to height: height,
  };
}

frame = moveBy(frame, [10, 10]);

console.log(frame); // prints {x: 20, y: 210, width: 100, height: 300}

In the example above you can see both array and object destructuring happen in the moveBy function. What makes destructuring powerful is that it allows you to program to the 'shape' of the data structure. One thing about destructuring objects is that you can name the binding whatever you want. For instance, you could rewrite moveBy to this:
function moveBy({x: oldX, y: oldY, width, height}, [dx, dy]) {
  return {
    x: oldX + dx,
    y: oldY + dy,
    width, // In ES6 this is equivalent to width: width,
    height // In ES6 this is equivalent to height: height,
  };
}

Whatever the 'value' of the key is becomes the binding for the variable in the function. So what {x: oldX} says is: there is a key called x in the first parameter and I want to bind its value to the name oldX.
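Renaming is not limited to function parameters; it works in any destructuring assignment:

```javascript
// Bind the x and y keys to the names 'left' and 'top'
let {x: left, y: top} = {x: 10, y: 200};

console.log(left); // prints 10
console.log(top);  // prints 200
```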

Classes


JavaScript has prototypical inheritance, which makes it stand out from languages that use the more traditional classical inheritance, such as C++, Java, Ruby, Python, C# and Objective-C. People coming from those languages would often create libraries that used JavaScript's prototypical inheritance to mimic the more traditional classical inheritance.

ES6 gives us some syntactic sugar to make the more traditional classical inheritance possible, without having to use a library. It is important to note that behind the scenes 'classes' are still implemented using prototypical inheritance. Here is an example:
class Living {
  constructor(alive) {
    this._alive = alive;
  }

  get isAlive() {
    return this._alive;
  }

  set alive(value) {
    this._alive = value;
  }
}

class Human extends Living {
  constructor(name) {
    super(true);
    this.name = name;
  }

  sayHi() {
    if (this.isAlive) {
       console.log(`${this.name} says hi!`);
     } else {
       console.log('What is dead may never die!');
     }
   }
}

var human = new Human('Maarten');

human.sayHi(); // prints Maarten says hi!

human.alive = false;

human.sayHi(); // prints What is dead may never die!

console.log(human instanceof Living); // prints true

console.log(human instanceof Human);  // prints true

A couple of things to note:

  • The function 'constructor' is the constructor for the class; you cannot add multiple constructors via method overloading.

  • 'super' is used to call the parent constructor, in Human's case that is Living.

  • The 'get' before isAlive makes isAlive a getter. This makes this.isAlive possible without parentheses.

  • The 'set' before alive makes alive a setter, so you can set the value via assignment. This makes human.alive = false possible.

  • You can only extend one class at a time; multiple inheritance is not possible.
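The claim above that classes are syntactic sugar over prototypical inheritance is easy to verify (Animal is a throwaway class for illustration):

```javascript
// A class is still just a constructor function with a prototype:
class Animal {}

console.log(typeof Animal); // prints 'function'
console.log(Animal.prototype.constructor === Animal); // prints true
console.log(Object.getPrototypeOf(new Animal()) === Animal.prototype); // prints true
```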

Generators


Generators are complex creatures that allow for some pretty awesome functionality. I doubt you will ever need to write a generator yourself, but framework creators can use them to make your life easier.

So what is a generator? A generator is a function that can be paused mid-execution to give or receive values. It does so via the 'yield' keyword. Let's look at a simple generator:
// The '*' denotes that threeCounter is a generator
function *threeCounter() {
  yield 1;
  yield 2;
  yield 3;
}

// Create an actual generator by calling the generator function.
let counter = threeCounter();

console.log(counter.next().value); // prints 1
console.log(counter.next().value); // prints 2
console.log(counter.next().value); // prints 3

console.log(counter.next().value); // prints undefined
console.log(counter.next()); // prints {value: undefined, done: true}

In the example above we define a generator called threeCounter; it yields a number each time 'next' is called, and after it has been called three times it is done. When you call counter.next you are given an object with two properties: value and done. value is what the generator yielded, and done is a boolean which says whether the generator has any new values left to give.

You can instantiate a generator as many times as you want:
let a = threeCounter(); 
let b = threeCounter(); // 'b' is completely separate from 'a'

console.log(a.next()); // prints {value: 1, done: false}
console.log(a.next()); // prints {value: 2, done: false}

console.log(b.next()); // prints {value: 1, done: false}

Each generator you create acts independently from other generators of the same type. You could say that calling a generator function creates "instances" of that generator, like calling new on a class would. Perhaps it would have been better if generators were created with the 'new' keyword as well.

A generator is also an iterator, which means we can use it inside for...of loops:
for (let number of threeCounter()) {
  console.log(number); // prints 1, then 2, then 3
}

You can make generators that never stop providing values. For instance, here is a generator which creates class names for zebra-striped tables:
function *zebraGenerator() {
  const GRAY  = '.gray';
  const WHITE = '.white';

  let color = GRAY;

  while(true) {
    if (color === GRAY) {
      yield color;
      color = WHITE;
    } else {
      yield color;
      color = GRAY;
    }
  }
}

let zebra = zebraGenerator();

console.log(zebra.next().value); // prints '.gray'
console.log(zebra.next().value); // prints '.white'
console.log(zebra.next().value); // prints '.gray'
console.log(zebra.next().value); // prints '.white'

So even though zebraGenerator contains a while(true), it doesn't run in an infinite loop; it pauses each time it hits a yield and provides the caller with a color.

We've seen how we can get values from a generator, but we can also provide generators with values:
function massiveCalculation(generator) {
  setTimeout(() => {
    generator.next(42);
  }, 5000);
}

function *resultPrinterGenerator(name) {
  console.log(`=== ${name} ===`);
  console.log('Started on: ' + new Date().toString());

  var result = yield;
  console.log(`The answer is: ${result}`);
  console.log('Stopped on: ' + new Date().toString());
  console.log(`=== ${name} ===`);
}

var resultPrinter = resultPrinterGenerator('massiveCalculation');

/*
  Quirk: you have to call 'next' at least once before you
  can send a value to a generator.
*/
resultPrinter.next();

massiveCalculation(resultPrinter);

/* Console output:
=== massiveCalculation ===
Started on: Wed May 15 2015 12:51:49 GMT+0200 (CEST)
The answer is: 42
Stopped on: Wed May 15 2015 12:51:54 GMT+0200 (CEST)
=== massiveCalculation ===
*/

I know the example above is kind of contrived, but it demonstrates how to send values from the outside into the generator by using generator.next(42). You can also see that you can pass parameters to the generator function itself. In the above example I gave the string 'massiveCalculation' as a parameter, so the printer could create a nice header.

Passing values to generators is typically something library creators use to make our lives easier. For example:
import csp from 'js-csp';

csp.go(function* () {
  let element = document.querySelector('#uiElement1');
  let channel = listen(element, 'mousemove');

  while (true) {
    let event = yield csp.take(channel);
    let x = event.layerX || event.clientX;
    let y = event.layerY || event.clientY;
    element.textContent = `${x}, ${y}`;
   }
});

This is from a library called js-csp, with which you can create Go-like channels. In the example above a channel for 'mousemove' events is created, and it is consumed using yield to print the location of the mouse. With channels you can implement consumer and producer patterns to manage asynchronous events.

Another cool example uses generators to make asynchronous code look like synchronous code:
co(function* () {
  try {
    let [croftStr, bondStr] = yield Promise.all([
      getFile('http://localhost:8000/croft.json'),
      getFile('http://localhost:8000/bond.json'),
    ]);

    let croftJson = JSON.parse(croftStr);
    let bondJson = JSON.parse(bondStr);

    console.log(croftJson);
    console.log(bondJson);
  } catch (e) {
    console.log('Failure to read: ' + e);
  }
});

The "co" function comes from the Co library; it lets you yield promises to "co" so it can handle the asynchronous parts of the code. It resumes running the code once all promises are resolved, so you don't have to write the then or error callbacks. This makes the code look synchronous, which makes it easier to understand.

For a really exhaustive look at generators, see the article by ES6 guru Dr. Axel Rauschmayer in the resources below.

Of course Co is just a bridge until ES7's 'await' syntax arrives!
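To give a taste of where this is heading, here is a sketch of the Co example rewritten with async/await. The getFile here is a stand-in that resolves immediately (instead of doing a real HTTP request), so the snippet is self-contained:

```javascript
// Stand-in for the getFile function from the Co example above
const getFile = (url) => Promise.resolve(`{"url": "${url}"}`);

async function readFiles() {
  try {
    // 'await' pauses here, just like 'yield' did in the Co version
    let [croftStr, bondStr] = await Promise.all([
      getFile('http://localhost:8000/croft.json'),
      getFile('http://localhost:8000/bond.json'),
    ]);

    console.log(JSON.parse(croftStr).url); // prints http://localhost:8000/croft.json
    console.log(JSON.parse(bondStr).url);  // prints http://localhost:8000/bond.json
  } catch (e) {
    console.log('Failure to read: ' + e);
  }
}

readFiles();
```

Note that no library is needed at all: the language itself takes over the role that Co's generator trampoline plays today.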

Modules


So there is a lot of cool new stuff in ES6, but there is still one problem: how are you going to share all the classes, generators, and variables you have made? Until a couple of years ago the most common way was to give people a JS file and namespace your code, something like this:
var $ = (function() {
  var m = {};
  var _p = 10; // private value do not touch!
  m.awesome = function(b) {
    return _p * b;
  };
  return m;
}());

This way you had private variables and created an API you exposed to some global variable. There are many downsides to this approach:

  • Name clashes if another library also uses the $ sign.

  • You cannot import specific functions; you must take everything.

  • You cannot load modules programmatically or lazily.

Luckily ES6 has added support for creating modules. Let's define an ES6 module:
// Filename: Frame.js

export function moveBy({x, y, width, height}, [dx, dy]) {
  return {x: x + dx, y: y + dy, width, height};
}

export function origin(frame) {
  return {x: frame.x, y: frame.y};
}

export function size(frame) {
  return {width: frame.width, height: frame.height};
}

export function getCenter({x, y, width, height}) {
  return {
    x: x + width / 2,
    y: y + height / 2
  }
}

export function distance({x: x1, y: y1}, {x: x2, y: y2}) {
  let xd = x2 - x1;
  let yd = y2 - y1;

  return Math.sqrt(xd * xd + yd * yd);
}

export const maarten = "Maarten";

We can then import the module above in a couple of ways:
// 1. Import everything from the module to the current namespace.
// Early transpilers such as Traceur supported this form, but it did
// not make it into the final ES6 spec; prefer form 2 below:
import * from "Frame";

let f = {x: 10, y: 10, width: 100, height: 100};
size(f);

// 2. Import everything under a binding in the current namespace:
import * as Frame from "Frame";

let f = {x: 10, y: 10, width: 100, height: 100};
Frame.size(f);

// 3. Import only specific functions from the module
import {size, moveBy} from "Frame";

let f = {x: 10, y: 10, width: 100, height: 100};
size(f);
moveBy(f, [10, 44]);

// 4. Import specific functions and rebind them under a different name
import {size as frameSize} from "Frame";

// Name clash with this local function is avoided thanks to the rename:
function size() {
  return 9000;
}

let f = {x: 10, y: 10, width: 100, height: 100};
frameSize(f);

The examples above show how versatile the new import syntax is. It is easy to prevent name clashes because there are so many ways to rename imports.

Want to know more?


Here's a list of resources with even more examples. I recommend going through the first two:

https://github.com/lukehoban/es6features

https://github.com/google/traceur-compiler/wiki/LanguageFeatures

http://davidwalsh.name/es6-generators

http://www.2ality.com/2015/03/es6-generators.html (exhaustive look at generators)

http://jakearchibald.com/2014/es7-async-functions/ (technically ES7 but it is too awesome to ignore)

http://www.2ality.com/2014/09/es6-modules-final.html

ES6 and Angular 2.0


By now you have a pretty good idea of some of the features that ES6 adds to JavaScript. So what does it have to do with Angular 2.0?

The first thing is that Angular 2.0 will use classes a lot more instead of functions. Everything from Directives to Services will be classes in 2.0.

But the most important thing is that 2.0 uses ES6's module system instead of the custom module system that 1.x had. This greatly affects the way we write the JavaScript part of our Angular 2.0 code.

Sneak peek


Here is a small example of how you would import modules in Angular 2.0:
import {Component, View} from 'angular2/angular2';

Angular 1.x's module system


So what was wrong with the 1.x module system? Let's look at an example:
angular.module('users')                                                
.factory('userFactory', ['$http', function($http) {                    
  // code for userFactory which uses $http                             
}]);

In the module definition above we see a factory called "userFactory" being assigned to the "users" module. The "userFactory" has a dependency on the $http service that Angular 1.x provides.

The first downside to the Angular 1.x module system is that it is string based. This makes the module system brittle: one spelling mistake and the whole thing falls down like a house of cards.

The second downside is that in order to survive minification (jsmin) you must declare all dependencies inside of an array as strings. This is why '$http' is declared inside the array as a string, and as $http, the variable, in the function. You can use ngAnnotate so you don’t have to write this code manually, but it is still a hassle.
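As a sketch of what such minification-safe annotations look like, here is the same factory written with the $inject property instead of the inline array (the factory body is a placeholder):

```javascript
// The factory as a named function; a minifier may mangle
// the parameter name $http...
function userFactory($http) {
  // code for userFactory which uses $http
  return {};
}

// ...but this array of strings survives minification,
// so Angular can still look up the right dependency:
userFactory.$inject = ['$http'];

// It would then be registered as:
// angular.module('users').factory('userFactory', userFactory);
```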

The third, and most important downside, is that Angular 1.x modules only work inside the Angular world. If you have found a great library that was written in pure JavaScript without Angular in mind, you must jump through hoops to get it working inside Angular. The same is also true in reverse, if you have a great Angular module and you want to use it outside of Angular, you are going to have to rewrite the code.

Conclusion


By embracing ES6 and its module system, it will become much easier to use existing non-Angular JavaScript code in an Angular project, and vice versa.

This is not only true for Angular but for other frameworks as well, such as Ember, React and Knockout. Sharing code between frameworks is going to be easier than ever before. ES6 modules will act as a bridge between frameworks and the greater JavaScript world.

I hope that the ES6 modules system will unite the JavaScript community.

So when you hear about the death of the Angular 1.x module system, know that it's a good thing: we are getting a great alternative in return, in the form of ES6 modules.

Next week we will look at Types, and why the Angular team thought ES6 alone was not enough!