A JavaScript Ecosystem Overview

In this post I take a look at the JavaScript ecosystem, starting with the history of the language and then covering the different front-end and back-end technologies that exist in this ecosystem.

The Language

The JavaScript programming language was created by Brendan Eich in 1995. It was the time of the browser wars: Internet Explorer and Netscape Navigator were fighting to provide new features and there was no time to lose.

Mr. Eich was working for Netscape and had to create a programming language that would allow web developers to embed small programs into webpages, making them more interactive and dynamic. The final product had to be flexible, easy to learn, have popular appeal, and be easily implementable. It’s said that Mr. Eich only had 10 workdays to produce the specification draft. Here are some of the existing programming languages that influenced the result:

  • Scheme - functions as first-class values and dynamic typing
  • Perl - weak typing
  • Self - a simple-to-implement prototype-based object system
  • Java - mainstream syntax and naming conventions

The language was developed under the name Mocha and later published with Netscape Navigator under its current de-facto name: JavaScript. This name choice was a marketing move by Netscape to showcase its new creation by piggybacking on the hype of the day: Java! Since JavaScript was a success, Microsoft reverse-engineered it in order to compete with Netscape, publishing its own version under the name JScript.

From day one there were several incompatibilities between the different implementations of the language, and in order to reduce the impact of this problem, Netscape submitted JavaScript for standardisation by the Ecma International standards organisation in 1997. The resulting standardised version was named ECMAScript and progressively attained universal acceptance as the language specification. Despite this, the name "JavaScript" remained in use as the de-facto name of the language.

The Evolution of the EcmaScript Standard

In order to add new features to the language and to stay compatible with the work of other standardisation organisations, the standard has been revised from time to time. The deliverables for each version are known as "EcmaScript <Version Number>". Here is a brief description of some of the most relevant changes.

EcmaScript 1

The first version of the language standard.

EcmaScript 2

This version was produced in 1998, when the language was also standardised by ISO/IEC. It updated the Ecma standard so that it would match the document produced by ISO/IEC.

EcmaScript 3

In 1999 several new features were added to the language. The most conspicuous were:

  • Regular Expression Literals
  • New control statements
  • Exception handling

EcmaScript 4

This version's draft was abandoned in 2008 due to disagreements about its feature set. The proposal was deemed too radical to be introduced at once, and it was agreed to create an intermediate version before such drastic developments would take place.

EcmaScript 5

The version supported by most browsers nowadays (early 2016) was created in 2009. Its most noticeable features are:

  • Higher-order iteration functions (map, reduce, filter, forEach)
  • JSON support
  • Getters and Setters
  • Better reflection and control over object properties
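
For a quick illustration, here is a sketch of some of these additions in plain ES5 (the values are arbitrary examples):

```javascript
// Higher-order iteration functions:
var numbers = [1, 2, 3, 4, 5];
var doubled = numbers.map(function (n) { return n * 2; });        // [2, 4, 6, 8, 10]
var evens = numbers.filter(function (n) { return n % 2 === 0; }); // [2, 4]
var sum = numbers.reduce(function (a, b) { return a + b; }, 0);   // 15

// Native JSON support:
var parsed = JSON.parse('{"language": "JavaScript"}');

// Getters (and setters) on object literals:
var circle = {
  radius: 2,
  get area() { return Math.PI * this.radius * this.radius; }
};

console.log(sum, parsed.language, circle.area.toFixed(2)); // 15 JavaScript 12.57
```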

EcmaScript 5.1

Just like EcmaScript 2, this spec was produced to align the Ecma standard with the document that was produced by ISO/IEC for EcmaScript 5.

EcmaScript 6 (aka ES 6, EcmaScript 2015, Harmony)

This version added several big changes in 2015, namely:

  • Modules
  • Classes
  • Block-scoped variable declarations (let, const)
  • Arrow functions
  • Template literals
  • Spread operator
  • Destructuring assignment
  • Parameter default values
  • Rest parameters
  • Promises
  • Symbols

The browser support for this version is still incomplete, though there are transpilers that convert ES6 code into older ES versions.
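
As an illustration, here is a sketch combining several of these features (runnable in any ES6-capable environment; the names and values are arbitrary):

```javascript
// Arrow functions, template literals, parameter defaults and rest parameters:
let greet = (name = 'world', ...rest) =>
  `Hello ${name}` + (rest.length ? ` and ${rest.length} others` : '');

// Destructuring assignment and the spread operator:
const [first, ...others] = [1, 2, 3];
const { language } = { language: 'JavaScript', year: 2015 };
const merged = [...others, 4];

console.log(greet());             // Hello world
console.log(greet('Ada', 'Bob')); // Hello Ada and 1 others
console.log(first, language, merged);
```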

EcmaScript 7 (aka EcmaScript 2016)

This version has not been published yet, though it’s already possible to transpile what’s currently specified in its draft down to older versions. Some proposed features are:

  • Async functions
  • Decorators
  • List comprehensions
  • Observable objects

This version is expected to be published in June 2016.

Frontend Technologies

It was with the advent of AJAX in 1999 (standardised in 2006) that JavaScript started to grow. Before AJAX, JavaScript was mostly used to animate webpages and to add some interactivity to them. However, this interactivity did not involve exchanging data with the server - it consisted only of accessory display logic. With AJAX, JavaScript started to perform key tasks in web applications' application logic, to the point that nowadays it’s hard to find a web application that does not require a JavaScript-enabled browser in order to work. It was after this change that the JavaScript ecosystem started to expand as a client-side programming language.

DOM Manipulation Libraries

The main needs of the developers working with AJAX web applications are:

  • Performing AJAX calls
  • Manipulating the Document Object Model (DOM) - the data structure kept by the browser representing the webpage's structure
  • Reacting to the user's interactions

The core JavaScript functions providing these features were not very practical to use, and sometimes their behaviour differed between browsers. The first libraries to become popular addressed precisely these three tasks, providing a collection of functions and objects that made these operations easy and consistent on every browser. Some of these libraries would also provide plenty of other general-purpose functions. We proceed by presenting some of them:

Prototype.js

This JavaScript framework was released in 2005 as part of the Ruby on Rails AJAX support. It directly extended many JavaScript core objects with behaviour somewhat similar to that of the Ruby programming language, also adding a class-based system to the language. These core extensions were regarded as troublesome by other programming communities, and the provided functions as more idiosyncratic than those of the following competing library:

jQuery

The jQuery framework was released in 2006 and - unlike Prototype - left the language core as it was, focusing instead on providing plenty of functionality related to the DOM, events, asynchronicity and effects. One feature to highlight is that jQuery sports a plugin system that allows library programmers to extend its functionality. It was the most successful library of its kind, being used nowadays in the majority of websites.
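
A sketch of the typical jQuery style (browser-only; assumes jQuery is loaded, and the element ids and URL are hypothetical):

```javascript
// React to a user interaction:
$('#load-button').on('click', function () {
  // Perform an AJAX call:
  $.getJSON('/api/items', function (items) {
    // Manipulate the DOM with the result:
    $('#item-count').text(items.length + ' items loaded');
  });
});
```

The same code written against the raw browser APIs of the time would have had to handle cross-browser differences (e.g. XMLHttpRequest vs. ActiveXObject) by hand.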

MooTools

The MooTools framework was released in 2007 and followed the same path as Prototype.js, in the sense that it focused on improving the language core by extending it with extra functionality. It was notable for its class-based system implementation and for being modular, though it never became popular enough to compete with jQuery.

Obsolescence due to later developments

Despite jQuery’s immense popularity, as well as its ability to provide many useful functionalities, it’s important to notice that modern versions of JavaScript already provide many of the functionalities that made jQuery popular.

It’s increasingly popular not to use jQuery and to rely instead either on the current core JavaScript functions alone or on libraries specialised in each specific task (e.g. fetch for AJAX calls).
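
For instance, rough core equivalents of classic jQuery idioms (browser-only APIs; the selector and URL are hypothetical):

```javascript
// DOM selection and manipulation without jQuery:
document.querySelector('#item-count').textContent = 'Loading…';

// An AJAX call without jQuery, using the fetch API:
fetch('/api/items')
  .then(function (response) { return response.json(); })
  .then(function (items) { console.log(items.length + ' items loaded'); });
```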

MVC-like Frameworks

While using DOM-manipulation libraries it’s common to keep the application data in the DOM itself, so DOM-manipulation selector functions have to be called whenever this data needs to be accessed or modified. This kind of code is often filled with selector names and easily becomes a mess as the application's complexity grows. There is a set of frameworks that address this problem in an elegant way by leveraging application patterns similar to the famous Model-View-Controller (MVC) pattern.

On the other hand, there is a recent trend in which web applications are fully loaded in the browser on the first page-load, performing every subsequent operation via AJAX. Applications of this kind are called Single-Page Applications (SPAs) and, as their code is often quite elaborate and complex, it’s recommendable to pick one of these frameworks when implementing a SPA. Let’s look into some of them.

Knockout

Knockout is a Model-View-ViewModel (MVVM) framework published in 2010 that allows the programmer to create mappings between JavaScript objects and the DOM. The programmer then only needs to subscribe to or change these JavaScript objects in order to react to actions in the DOM or to update the DOM. It is a minimal framework that does not provide anything apart from this functionality and that does not have SPAs in mind.
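
A minimal sketch of this idea (assumes the Knockout library is loaded and that the page contains a matching binding such as `<span data-bind="text: name"></span>`):

```javascript
// The view-model is a plain object holding observable values:
var viewModel = { name: ko.observable('World') };

// Connect the object to the DOM bindings on the page:
ko.applyBindings(viewModel);

// Subscriptions react to changes, including those coming from the DOM:
viewModel.name.subscribe(function (newValue) {
  console.log('name changed to ' + newValue);
});

// Updating the observable automatically re-renders the bound element:
viewModel.name('Knockout');
```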

Backbone

Backbone is a Model-View-Presenter (MVP) framework published in late 2010. It’s focused on SPAs and enforces the separation between application logic, display logic and templates. The communication between these entities is managed in an event-based fashion, and it also sports a Router class that allows applications to react to URLs and to update them according to the triggered events.

Angular

Angular is a framework implemented by Google that used to follow an MVC architecture and has evolved into an MVVM framework. Like Backbone, it’s focused on SPAs and enforces the separation between application logic, display logic and templates. It’s interesting to notice that Angular allows the programmer to extend HTML by creating new tags that can be used in the templates and that encapsulate parts of the page.

Another distinctive feature is that it allows the establishment of two-way data-bindings between the templates and the JavaScript objects that are to be rendered by them. This way, any update to these objects will automatically trigger HTML updates, and - in a similar way - the JavaScript objects will be updated when their corresponding template input fields are changed.

Virtual-DOM based frameworks

As described in the previous sections, the display layer of our web applications moved from a simple computational model - where the displayed interface was only a function of data calculated by the server-side code - into something much more dynamic and complex, where it depends not only on the retrieved data but also on the user interactions and the scripts processing them.

This complexity often translates into complex code that’s hard to share and to reason about. Thus, some people at Facebook started experimenting with a simpler computational model for coding the front-end: one that would allow us to think of what is displayed on the page, once again, as the result of the state of a few variables instead of the result of the interaction between a "zillion" Model-View-Controller classes.

The Virtual DOM

In order to make such a change take place, something radical had to happen: if we wanted the display to be only the result of applying a function to some state variables, we would have to stop manipulating parts of the DOM directly and instead render each version of the display as a whole. As re-rendering the whole DOM on every change would be too slow, it was decided to map the DOM onto a lightweight representation of it that would be easy to regenerate and compare. Thus, whenever one variable changes, we are able to generate a new version of this lightweight representation and compare it with its previous version, thereby calculating which parts of the real DOM need to be updated. It’s this lightweight representation of the DOM that we call the Virtual DOM.
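
The mechanism can be sketched in a few lines of plain JavaScript - a toy model only, far simpler than real Virtual DOM implementations:

```javascript
// "Virtual" nodes are cheap plain objects describing the desired display:
function h(tag, text) { return { tag: tag, text: text }; }

// The whole display is re-rendered as a function of the state:
function render(state) {
  return [h('h1', 'Counter'), h('p', 'Value: ' + state.count)];
}

// Diffing two versions tells us which real DOM nodes need updating:
function diff(oldTree, newTree) {
  var patches = [];
  for (var i = 0; i < newTree.length; i++) {
    var a = oldTree[i], b = newTree[i];
    if (!a || a.tag !== b.tag || a.text !== b.text) patches.push(i);
  }
  return patches;
}

var before = render({ count: 1 });
var after = render({ count: 2 });
console.log(diff(before, after)); // only index 1 (the <p> node) changed
```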

React

React was the first framework to use a Virtual DOM. It was created by Facebook, and the architecture imposed on applications using it is minimal.

React does not stick to MVC and rather covers only the View part of the code. With it, the View is represented as a tree of entities called components, which might themselves be composed of other components.

A component holds some state, can receive properties upon creation, and knows how to render itself for a given set of properties and state variables. Whenever its state changes, a new version of its display's Virtual DOM is rendered and compared with the previous version, and the real DOM is updated accordingly.

If interactions performed on a given component are meant to change the state of a parent component, then the parent has to pass the handling code to the subcomponent in the form of a callback. This approach guarantees that components are loosely coupled, thus making them quite easy to reuse.

One rather controversial choice taken by the creators of this framework is that, instead of using HTML templates like the other frameworks do, an HTML-like representation of the interface is written as XHTML mixed into the JavaScript code, therefore requiring a compiler that knows how to transform this JS and XHTML mix (called JSX) into HTML-generating JavaScript.
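
For illustration, a sketch of a small component written with JSX (assumes React and ReactDOM are loaded; the element id is hypothetical, and the createClass style shown was the common one at the time):

```javascript
var Greeting = React.createClass({
  render: function () {
    return <p>Hello, {this.props.name}!</p>;
  }
});

ReactDOM.render(<Greeting name="Ada" />, document.getElementById('app'));

// The JSX above is compiled into plain function calls, roughly:
//   React.createElement('p', null, 'Hello, ', this.props.name, '!');
```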

Flux and Redux

The component-based approach of React is effective for smaller applications, though for structuring bigger applications Facebook proposes an architecture called Flux, in which the state of the application is stored away from the components in so-called state containers.

The original implementation of the Flux architecture was rather complex, but there is a simpler and more recent one called Redux that’s becoming very popular and is starting to be used even to implement application back-ends. Its architecture is functional: instead of directly mutating state, the programmer creates new versions of it by applying reducer functions, thus enabling easy Undo/Redo and time-travel debugging.
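
The reducer idea can be sketched in plain JavaScript - the tiny createStore below illustrates the concept and is not the real Redux API:

```javascript
// A reducer is a pure function from (state, action) to a *new* state;
// the previous state object is never mutated:
function counter(state, action) {
  if (state === undefined) state = { count: 0 };
  switch (action.type) {
    case 'INCREMENT': return { count: state.count + 1 };
    case 'DECREMENT': return { count: state.count - 1 };
    default: return state;
  }
}

// A minimal state container driving the reducer:
function createStore(reducer) {
  var state = reducer(undefined, { type: '@@INIT' });
  return {
    getState: function () { return state; },
    dispatch: function (action) { state = reducer(state, action); }
  };
}

var store = createStore(counter);
store.dispatch({ type: 'INCREMENT' });
store.dispatch({ type: 'INCREMENT' });
store.dispatch({ type: 'DECREMENT' });
console.log(store.getState()); // { count: 1 }
```

Since every state is a fresh object, keeping the old ones around is enough to implement Undo/Redo or to replay a session step by step.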

Riot

As the React approach was so revolutionary, new Virtual DOM implementations are appearing, along with other frameworks built upon them. Riot is one of these, and it differentiates itself from React by:

  • Using HTML templates
  • Allowing the creation of custom HTML tags
  • Being more minimalist
  • Having a Virtual DOM implementation that’s more selective about what it re-renders

The Back-end Ecosystem

Although JavaScript has been used on the server side since the mid-90s, its implementations were quite slow and it did not become popular for such use before 2008, when Google published the V8 JavaScript engine bundled with the first version of the Chrome web browser.

V8

When implementing the V8 engine, Google used optimisation techniques that had been created during the development of compilers for already-forgotten programming languages, namely Self and Strongtalk. The result was remarkably fast for a dynamically typed programming language like JavaScript, vigorously surpassing other implementations' performance.

On the other hand, both Chrome and V8 were released as open-source software (with a BSD license), thus allowing anyone to build their work on top of them and to distribute them as part of any software without having to pay royalties to Google.

Node.js

It did not take long until Ryan Dahl created an environment called Node.js where JavaScript could run on V8 outside of the browser. Besides bringing V8 to the server side, Node is also event-driven, thus allowing blocking IO operations to be written in a non-blocking, asynchronous fashion. This approach to IO, together with the speedy V8, made Node suitable for developing server-side real-time applications that require the swiftness of non-blocking IO together with the expressivity of a high-level programming language such as JavaScript.

NPM

As JavaScript's core specification is quite small and was not thought out for the server side, using it for such tasks usually requires plenty of external libraries. Node thus needed a package manager allowing developers to publish packages and to easily retrieve libraries for inclusion in their applications. The Node Package Manager (NPM) is that package manager, in the style of Perl's CPAN or Ruby's RubyGems.
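
A typical workflow looks roughly like this (the command names are real; the chosen package is just an example):

```shell
npm init                   # interactively create a package.json for the project
npm install express --save # download a package and record it as a dependency
npm publish                # share a package of your own on the registry
```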

Module Loaders

As a browser language, JavaScript lacked statements to specify dependencies between files and to tell the runtime to load them. In order to solve this problem, different module APIs were specified and competed for this task:

  • Asynchronous Module Definition (AMD)
  • CommonJS Modules
  • ES6 Modules

In the ES5 world, Node.js implemented CommonJS modules while front-end programmers preferred AMD, thus creating chaos in this field. A way to declare and require modules called the Universal Module Definition (UMD) was therefore invented; it’s rather ugly, but it makes modules compatible with both module APIs.
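
A sketch of the UMD pattern (the greeter module name is hypothetical; globalThis is used here as a convenient stand-in for the browser global):

```javascript
(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    define([], factory);              // AMD loaders such as RequireJS
  } else if (typeof module === 'object' && module.exports) {
    module.exports = factory();       // CommonJS environments such as Node
  } else {
    root.greeter = factory();         // plain browser <script> tag
  }
}(typeof globalThis !== 'undefined' ? globalThis : this, function () {
  // The module body itself:
  return {
    greet: function (name) { return 'Hello, ' + name + '!'; }
  };
}));
```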

Anyway, this mess is about to be wiped away by the adoption of ES6, which already sports language-level constructs aimed at dealing with modules.

Module Bundlers

When using modules while developing web applications there is often the need to merge the different modules' files into a single file to be included by the web application. There are two libraries that allow developers to perform this task:

Browserify

Browserify is the oldest web module bundler and allows developers to use the CommonJS module syntax when programming web-applications.


This is a more modern web module bundler that bundles not only JavaScript but also other assets such as images and CSS. It allows the different assets to be hot-reloaded by the browser when there is a change in the modules' code.

Task Runners

As projects become bigger and their assets need to be transformed before they can run, there's the need to write scripts that manage the execution of these transformations - something like make, ant or rake. There are two dependency-based task runners implemented in JavaScript: an older one called Grunt, in which the data intermediate to the different transformation steps is kept in temporary files, and a more recent one called Gulp, which uses memory instead of temporary files.
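
A gulpfile sketch of the Gulp streaming style (assumes the gulp and gulp-uglify packages are installed; the paths are hypothetical):

```javascript
var gulp = require('gulp');
var uglify = require('gulp-uglify');

// Sources flow as an in-memory stream from src through the plugins
// to dest, with no temporary files in between:
gulp.task('scripts', function () {
  return gulp.src('src/**/*.js')
    .pipe(uglify())
    .pipe(gulp.dest('dist'));
});
```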

Express

As JavaScript is a language born of the browser, why couldn't we use it to develop the server side of our web applications? Express is the framework tackling this issue. It’s aimed at being minimalist - like Sinatra - though there are plenty of big libraries developed for it (e.g. automatic backoffice generation).
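
A minimal Express application sketch (assumes the express package is installed; the route and port are arbitrary):

```javascript
var express = require('express');
var app = express();

// Routes map HTTP verbs and paths to handler functions:
app.get('/', function (req, res) {
  res.send('Hello from Express!');
});

app.listen(3000, function () {
  console.log('listening on http://localhost:3000');
});
```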

MongoDB

As V8 was used to create Node, it was also used as the interpreter of the MongoDB document-oriented database. When interacting with the database, data is exchanged in JSON-like structures and querying is done by invoking JavaScript functions. This makes this database very popular amongst JavaScript developers.
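
For example, in the mongo shell (the users collection and its fields are hypothetical):

```javascript
// Documents are JSON-like structures:
db.users.insert({ name: 'Ada', age: 36 });

// Queries are JavaScript function calls taking JSON-like arguments:
db.users.find({ age: { $gt: 30 } });

// Results can be processed with JavaScript callbacks:
db.users.find().forEach(function (user) { print(user.name); });
```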

Yeoman

Yeoman is an interactive project skeleton generator written in JavaScript. It supports creating plenty of JavaScript project types with different configurations - though it’s not only aimed at JavaScript: It may also be used to generate skeletons of Java projects, for instance.

Opinionated Full Development Stacks

There were some attempts to bundle both the tooling and the technologies aimed at creating full-stack applications. We talk about two of these attempts.

MEAN

The MEAN stack stands for a package with the following set of technologies:

  • MongoDB
  • Express
  • Angular
  • Node

It also comes with the "mean" command-line tool, which allows generating projects and files and performing other useful command-line tasks.

Meteor

Meteor is a web framework that uses MongoDB as database. It uses a publish-subscribe protocol applied to resources, allowing clients to get updates when the data is updated on the server. It can be used with any client-side framework.

Transpilers

As not everybody is satisfied with the JavaScript language or with its support by the browsers, several transpilers have been produced which convert other programming languages into browser-supported JavaScript versions. Here follow a few examples:


This transpiler converts modern JavaScript into older, browser-supported JavaScript. It’s very customisable and allows cross-compiling between several JavaScript versions.

Emscripten (LLVM)

This converts LLVM bytecode to asm.js, a subset of JavaScript whose annotations allow the engine to perform type-related optimisations, thus making it possible to use any LLVM-based compiler to compile to JavaScript. It should be noticed that the resulting code is often much faster than handwritten JavaScript due to the implicit type information that it conveys.
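
A hand-written sketch in the asm.js style (real asm.js modules are normally generated by Emscripten rather than written by hand):

```javascript
// The "use asm" pragma and the |0 coercions give the engine static type
// information: every value below is known to be a 32-bit integer.
function AsmModule() {
  'use asm';
  function add(a, b) {
    a = a | 0;          // parameter a is declared as a 32-bit integer
    b = b | 0;          // parameter b is declared as a 32-bit integer
    return (a + b) | 0; // the result is also a 32-bit integer
  }
  return { add: add };
}

var mod = AsmModule();
console.log(mod.add(2, 3)); // 5
```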

TypeScript

This transpiler converts a statically typed superset of JavaScript called TypeScript into JavaScript. This language was created by Microsoft and the current version of Google's Angular framework provides documentation for both this language and JavaScript.
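
A small sketch of what the static typing looks like (the names are arbitrary):

```typescript
// Type annotations are checked at compile time and erased when
// transpiling down to plain JavaScript.
interface Person {
  name: string;
  age: number;
}

function greet(person: Person): string {
  return `Hello ${person.name}, age ${person.age}`;
}

const ada: Person = { name: 'Ada', age: 36 };
console.log(greet(ada)); // Hello Ada, age 36
```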

CoffeeScript

This transpiler converts CoffeeScript into JavaScript. CoffeeScript is a programming language inspired by Ruby, Python and JavaScript that removes a lot of the excessive punctuation characteristic of JavaScript code.

ClojureScript

This transpiler converts a subset of the Clojure programming language called ClojureScript into JavaScript. Both Clojure and ClojureScript are Lisps - famous for being homoiconic and programmable in themselves, and thus providing a degree of expressivity that’s hard to match by other languages.

Conclusion

JavaScript's ecosystem tells much about the history of the language, and it’s amazing how a language that was originally specified in 10 days gained such traction. It also shows well how the evolution of software sometimes requires patchy solutions from which the essence is later distilled into standards and more mature tools (e.g. the modules system). Another noticeable element is how the application of old ideas has driven breakthroughs in the ecosystem (e.g.: V8).

Still, we should not be naïve to the point of believing that this success is only the product of the language's feature-set. Its standardisation as "the programming language of the web browsers" had a very big impact in earning JavaScript the Lingua Franca status that it has attained.

Current technological needs regarding speed, safety and expressivity in browser applications are creating pressure both to turn JavaScript into a different language and to replace it with something more flexible than a single high-level programming language. WebAssembly is one such project, and if successful it might represent a big turning point in the history of this language. If such a change takes place, it’s hard to tell whether JavaScript's Lingua Franca status will be kept or whether this "Babel tower" will fall apart, scattering the browser-programming landscape across the diversity that has always characterised the history of programming languages.