JavaScript - Generators

Generators (or generator functions), introduced in ES6, are a new way of working with functions and iterators. A generator is a function that can stop in the middle and then resume where it left off. In short, a generator looks like a function but acts like an iterator.

Before we get into the technical details, here's a simple analogy to help you understand generators:

Assume you're reading a gripping techno-thriller. You are so engrossed in the book that you barely hear the doorbell ring. It's the pizza delivery guy calling. You stand up to unlock the door. However, before you do so, you place a bookmark on the last page you read. You mentally save the plot's events. Then you go get some pizza. When you return to your room, you begin reading the book from the page where you placed the bookmark. You don't start over on the first page. In some ways, you served as a generator function.


Let's look at how we can use generators to solve some common programming problems. But first, let's define what a generator is.

What exactly are generators?

A normal function, such as the one below, cannot be terminated before it completes its task, i.e. before its last line is executed. It adheres to the run-to-completion model.

function normalFunc() {
  // Runs from top to bottom every time; it cannot pause in the middle.
}

The only way out of normalFunc is to return from it or to throw an error. If you call the function again, execution restarts from the beginning.

A generator, on the other hand, is a function that can stop in the middle and then resume where it left off.

Here are some additional definitions of generators:

Generators are a subclass of functions that make writing iterators easier.

A generator is a function that returns a series of values rather than a single value, i.e. you generate a series of values.

A generator in JavaScript is a function that returns an object, on which you can call next(). Every call to next() will return an object of this shape:

{
  value: Any,
  done: true|false
}

The yielded value is stored in the value property. The done property is either true or false; once done is true, the generator halts and generates no more values.

The difference is illustrated below:

[Image: Normal Functions vs Generators]

Take note of the dashed arrow just before Finish in the Generators section of the image, which closes the yield-resume-yield loop. There is a chance that a generator will never finish. We'll look at an example later.

Creating a Generator

Let's look at how to make a generator in JavaScript:

function * generatorFunction() { // Line 1
  console.log('This will be executed first.');
  yield 'Hello, ';   // Line 2
  console.log('I will be printed after the pause');
  yield 'World!';
}

const generatorObject = generatorFunction(); // Line 3

console.log(generatorObject.next().value); // Line 4
console.log(generatorObject.next().value); // Line 5
console.log(generatorObject.next().value); // Line 6

// This will be executed first.
// Hello, 
// I will be printed after the pause
// World!
// undefined

Instead of just function, we use the function * syntax to create a generator function (Line 1). There can be any number of spaces between the function keyword, the *, and the function name. Because a generator is just a function, you can use it anywhere a function can be used, such as within objects and class methods.

We don't have a return inside the function body. Instead, we have a different keyword, yield (Line 2). It's an operator that allows a generator to pause itself. When a generator comes across a yield, it "returns" the value specified after it; 'Hello, ' in this case. In the context of generators, however, the term "returned" is not used. We say "the generator has yielded 'Hello, '".

Note that you can still use return inside a generator, but return finishes the generator immediately:

function * generatorFunc() {
  yield 'a';
  return 'b'; // Generator ends here.
  yield 'c'; // Will never be executed.
}

Line 3 is where we create the generator object. We appear to be invoking the function generatorFunction. Indeed we are! However, a generator function always returns a generator object rather than an ordinary value. The generator object is both an iterator and an iterable. As a result, it can be used in for-of loops or passed to other functions that accept an iterable.
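To see this in action, here's a small sketch of my own (the abc generator is purely illustrative) showing a generator object being spread, converted to an array, and looped over directly:

```javascript
function* abc() {
  yield 'a';
  yield 'b';
  yield 'c';
}

// Each call creates a fresh generator object, which is iterable:
console.log([...abc()]);        // [ 'a', 'b', 'c' ]
console.log(Array.from(abc())); // [ 'a', 'b', 'c' ]

for (const ch of abc()) {
  console.log(ch);              // a, then b, then c
}
```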

In Line 4, we call the generatorObject's next() method. The generator starts executing with this call. The console.log command will be executed first. Then it comes across a yield 'Hello,'. The value is returned as an object {value: 'Hello, ', done: false}, and the generator suspends/pauses. It is now awaiting the next invocation.

In Line 5, we call next() once more. This time, the generator wakes up and resumes execution where it left off. The next line it finds is a console.log, which logs 'I will be printed after the pause'. Then another yield is discovered. The value is returned as the object {value: 'World!', done: false}. We extract and log the value property. The generator goes back to sleep.

In Line 6, we use next() once more. There are no more lines to execute this time. Remember that every function implicitly returns undefined if no return statement is provided. As a result, the generator returns (rather than yields) the object {value: undefined, done: true}. The done property is set to true. This marks the end of the generator. It can no longer generate values or resume, because there are no more statements to execute.

To run the generator again, we'll need to create a new generator object.
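A quick sketch of this one-shot behavior (the greet generator is my own trivial example):

```javascript
function* greet() {
  yield 'hi';
}

const first = greet();
console.log(first.next()); // { value: 'hi', done: false }
console.log(first.next()); // { value: undefined, done: true } -- exhausted

// Calling the generator function again gives a brand-new generator object:
const second = greet();
console.log(second.next()); // { value: 'hi', done: false }
```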

yield Delegation

Generators can use the yield* expression in addition to the regular yield operator to delegate additional values to another generator. When the yield* is encountered within a generator, it will enter the delegated generator and start iterating through all the yields until the generator is closed. This can be used to semantically organise your code by separating different generator functions while still having all of their yields iterable in the correct order.

To demonstrate, we can create two generator functions, one of which will yield* operate on the other:

// Generator function that will be delegated to
function* delegate() {
  yield 3
  yield 4
}

// Outer generator function
function* begin() {
  yield 1
  yield 2
  yield* delegate()
}

Next, let’s iterate through the begin() generator function:

// Iterate through the outer generator
const generator = begin()

for (const value of generator) {
  console.log(value)
}

This will give the following values in the order they are generated:

// 1
// 2
// 3
// 4

The outer generator yielded 1 and 2, then delegated to the other generator, which yielded 3 and 4.

yield* can also delegate to any iterable object, such as an Array or a Map. Yield delegation can help with code organisation because any function within a generator that wants to use yield must also be a generator.
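For example (a small sketch of my own), yield* can pull values from an Array or even a String, since both are iterable:

```javascript
function* letters() {
  yield* ['a', 'b']; // delegate to an Array
  yield* 'cd';       // delegate to a String, which is also iterable
}

console.log([...letters()]); // [ 'a', 'b', 'c', 'd' ]
```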

Generator Object Methods and States

The following table shows a list of methods that can be used on Generator objects:

Method     Description
next()     Returns the next value in a generator
return()   Returns a value in a generator and finishes the generator
throw()    Throws an error and finishes the generator

The next table lists the possible states of a Generator object:

Status     Description
suspended  Generator has halted execution but has not terminated
closed     Generator has terminated by either encountering an error, returning, or iterating through all values
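A short sketch tying the methods and states together (the numbers generator here is just an illustration):

```javascript
function* numbers() {
  yield 1;
  yield 2;
  yield 3;
}

const gen = numbers();
console.log(gen.next());     // { value: 1, done: false }  -- generator is suspended
console.log(gen.return(99)); // { value: 99, done: true }  -- generator is now closed
console.log(gen.next());     // { value: undefined, done: true } -- it stays closed
```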

Generator Applications

There are numerous fantastic applications for generators. Let's take a look at a few of them.

Iterables Implementation

When you implement an iterator, you must create an iterator object with a next() method by hand. You must also manually save the state. It is often extremely difficult to do so. Because generators are iterable, they can be used to implement iterables without the need for additional boilerplate code. Let's look at a simple example.

We need to create a custom iterable that returns This, is, and iterable. Here's an implementation using a hand-written iterator:

const iterableObj = {
  [Symbol.iterator]() {
    let step = 0;
    return {
      next() {
        step++;
        if (step === 1) {
          return { value: 'This', done: false };
        } else if (step === 2) {
          return { value: 'is', done: false };
        } else if (step === 3) {
          return { value: 'iterable.', done: false };
        }
        return { value: '', done: true };
      }
    };
  }
};

for (const val of iterableObj) {
  console.log(val);
}

// This
// is 
// iterable.

Here’s the same thing using generators:

function * iterableObj() {
  yield 'This';
  yield 'is';
  yield 'iterable.';
}

for (const val of iterableObj()) {
  console.log(val);
}

// This
// is 
// iterable.

You can contrast the two versions. Granted, this is a somewhat contrived example. However, it does demonstrate the points:

  • We don't have to worry about Symbol.iterator.
  • We don't have to implement next() by hand.
  • We don't have to manually create the object that next() returns, i.e. {value: 'This', done: false}.
  • We don't have to save the state manually. In the iterator example, the state was saved in the variable step, and its value determined what the iterable output. We had to do nothing of the sort in the generator.

Infinite Data Streams

It’s possible to create generators that never end. Consider this example:

function * naturalNumbers() {
  let num = 1;
  while (true) {
    yield num;
    num = num + 1;
  }
}

const numbers = naturalNumbers();

console.log(numbers.next().value);
console.log(numbers.next().value);

// 1
// 2

We create a naturalNumbers generator with an infinite while loop inside, and we yield num from that loop. The generator is suspended when it yields. When we call next() again, the generator wakes up, resumes execution from where it was suspended (in this case, yield num), and continues until another yield is encountered or the generator finishes. The next statement, num = num + 1, updates num. Then execution returns to the top of the while loop. The condition is still true. The generator comes across the next line, which yields num. It yields the updated num and then suspends. This can go on indefinitely.

Improved Async functionality

Code that makes use of promises and callbacks, such as:

function fetchJson(url) {
  return fetch(url)
    .then(request => request.text())
    .then(text => {
      return JSON.parse(text);
    })
    .catch(error => {
      console.log(`ERROR: ${error.stack}`);
    });
}

can be written as (using library such as co.js):
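Below is a sketch of the generator version. To keep it self-contained, it uses a minimal co-style runner (run, written here for illustration) rather than co.js itself; in the real library, co or co.wrap plays this role. fetch is assumed to be available, as in the snippet above:

```javascript
// Minimal co-style runner: drives a generator that yields promises.
function run(genFunc, ...args) {
  const gen = genFunc(...args);
  function step(result) {
    if (result.done) return Promise.resolve(result.value);
    return Promise.resolve(result.value)
      .then(value => step(gen.next(value)))    // resume with the resolved value
      .catch(error => step(gen.throw(error))); // or throw the error into the generator
  }
  return step(gen.next());
}

// fetchJson as a generator: the async steps read like synchronous code.
function* fetchJsonGen(url) {
  try {
    const request = yield fetch(url);
    const text = yield request.text();
    return JSON.parse(text);
  } catch (error) {
    console.log(`ERROR: ${error.stack}`);
  }
}

// run(fetchJsonGen, 'https://example.com/data.json').then(json => { /* ... */ });
```

Each yield hands a promise to the runner, which waits for it to settle and feeds the result back in via next(value).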

Some readers may have noticed a resemblance to async/await. That is no accident. Where promises are involved, async/await follows a similar strategy, replacing yield with await, and under the hood it could be powered by generators.

To be precise, async/await did not grow directly out of the example above; the example simply shows one step in the evolution of dealing with async code. In practice, async/await is based on Promises.


Generators acting as observers

The next(val) method can also be used to send values into a generator. The generator is then referred to as an observer, because it wakes up whenever it receives a new value. In a sense, it keeps watching for values and acts when it gets one.
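A minimal sketch (the accumulator generator is my own illustration): a value passed to next(v) becomes the result of the yield expression the generator was paused on:

```javascript
// A generator acting as an observer: it waits for values sent via next(v).
function* accumulator() {
  let total = 0;
  while (true) {
    // Pause here; resume with whatever value the caller sends in.
    const amount = yield total;
    total += amount;
  }
}

const acc = accumulator();
acc.next();                      // prime the generator: run to the first yield
console.log(acc.next(10).value); // 10
console.log(acc.next(5).value);  // 15
```

Note the priming call: the first next() only runs the generator up to its first yield, so the first real value must be sent with the second call.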

The Benefits of Generators 

Lazy Evaluation

The Infinite Data Streams example above is only possible because of lazy evaluation. Lazy evaluation is an evaluation model that postpones evaluating an expression until its value is required. That is, if we don't need the value, it doesn't exist; it is calculated only when we request it. Let's look at an example:

function * powerSeries(number, power) {
  let base = number;
  while (true) {
    yield Math.pow(base, power);
    base = base + 1;
  }
}

The powerSeries generator yields the series of numbers, starting at number, each raised to power. The series starting at 3 with power 2 is 9 (3²), 16 (4²), 25 (5²), 36 (6²), 49 (7²). const powersOf2 = powerSeries(3, 2); simply creates the generator object; none of the values has been calculated yet. If we call next() now, 9 will be computed and returned.

Memory Efficient

Generators are memory efficient as a direct result of Lazy Evaluation. We only generate the values that are required. With normal functions, we had to generate all of the values ahead of time and save them in case we needed them later. However, with generators, we can postpone the computation until it is required.

To act on generators, we can write combinator functions. Combinators are functions that take existing iterables and combine them to create new ones. take is one such combinator. It starts with the first n elements of an iterable. Here's an example of one implementation.

function * take(n, iter) {
  let index = 0;
  for (const val of iter) {
    if (index >= n) {
      return;
    }
    index = index + 1;
    yield val;
  }
}

Here are some interesting use cases of take:

console.log(...take(3, ['a', 'b', 'c', 'd', 'e']));
// a b c
console.log(...take(7, naturalNumbers()));
// 1 2 3 4 5 6 7
console.log(...take(5, powerSeries(3, 2)));
// 9 16 25 36 49


There are a few things to keep in mind when programming with generators.

  • Generator objects are only accessible once. You cannot iterate over the values once they have been exhausted. To re-generate the values, create a new generator object.

    const numbers = naturalNumbers();

    console.log(...take(10, numbers)) // 1 2 3 4 5 6 7 8 9 10
    console.log(...take(10, numbers)) // This will not give any data
  • Generator objects do not support random access the way arrays do. Because the values are generated one at a time, accessing a random value means computing every value up to that element first.
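To make the cost concrete, here's a small sketch (naturals is my own stand-in for the earlier naturalNumbers generator): reaching the n-th value means stepping through all the values before it:

```javascript
function* naturals() {
  let n = 1;
  while (true) {
    yield n;
    n = n + 1;
  }
}

// "Random access" to the 1000th element requires 1000 next() calls.
const it = naturals();
let value;
for (let i = 0; i < 1000; i++) {
  value = it.next().value;
}
console.log(value); // 1000
```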


Generators are processes that have the ability to pause and resume execution. Although they are not widely used, they are a powerful and versatile feature of JavaScript.

I sincerely hope that the majority of you find the approach covered here to be helpful. Thank you for reading, and please feel free to leave any comments or questions in the comments section below.
