Stack Semantics in Trax: Ungraded Lab
In this ungraded lab, we will explain the stack semantics in Trax. This will help in understanding how to use layers like Select and Residual, which operate on elements in the stack. If you've taken a computer science class before, you will recall that a stack is a data structure that follows the Last In, First Out (LIFO) principle. That is, whatever is the latest element that is pushed into the stack will also be the first one to be popped out. If you're not yet familiar with stacks, then you may find this short tutorial useful. In a nutshell, all you really need to remember is that it puts elements one on top of the other. You should be aware of what is on top of the stack to know which element you will be popping. You will see this in the discussions below. Let's get started!
Imports
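The import cell itself is not reproduced here; a minimal set that covers the layers and helpers used in the rest of this lab would look like the following (assuming Trax and NumPy are installed):

```python
import numpy as np

from trax import layers as tl  # Serial, Select, Residual, Fn
from trax import shapes        # shapes.signature is used to initialize layers
```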
1. The tl.Serial Combinator is Stack Oriented.
To understand how stack orientation works in Trax, you will most often be using the Serial layer. We will define two simple Function layers: 1) Addition and 2) Multiplication.
Suppose we want to make the simple calculation (3 + 4) * 15 + 3. Serial will perform the calculations in the following manner: 4, 3, add, 15, mul, 3, add. The steps of the calculation are shown in the table below. The first column shows the operations made on the stack and the second column the output of those operations. Moreover, the rightmost element in the second column represents the top of the stack (e.g. in the second row, Push(3) pushes 3 on top of the stack and 4 is now under it).
| Stack Operations | Outputs |
| --- | --- |
| Push(4) | 4 |
| Push(3) | 4 3 |
| add (pop 3 and 4, push 7) | 7 |
| Push(15) | 7 15 |
| mul (pop 15 and 7, push 105) | 105 |
| Push(3) | 105 3 |
| add (pop 3 and 105, push 108) | 108 |
After processing all the inputs, the stack contains 108, which is the answer to our simple computation.
From this, the following can be concluded: a stack-based layer has only one way to handle data, by taking one piece of data from atop the stack, termed popping, and putting data back atop the stack, termed pushing. Any expression that can be written conventionally can be written in this form and thus be amenable to being interpreted by a stack-oriented layer like Serial.
Coding the example in the table:
Defining addition
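A sketch of an addition layer built with tl.Fn; the helper name Addition matches the layer name referenced in the text, but the exact body of the original cell may differ:

```python
def Addition():
    layer_name = "Addition"  # the name shown when the layer is printed

    # Custom function that adds the top two elements of the stack.
    def func(x, y):
        return x + y

    return tl.Fn(layer_name, func)
```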
Defining multiplication
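And a matching sketch for multiplication:

```python
def Multiplication():
    layer_name = "Multiplication"

    # Custom function that multiplies the top two elements of the stack.
    def func(x, y):
        return x * y

    return tl.Fn(layer_name, func)
```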
Implementing the computations using the Serial combinator.
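A sketch of the full computation for (3 + 4) * 15 + 3, assuming the Addition and Multiplication helpers above. The input tuple plays the role of the initial stack, with its first element on top:

```python
# Stack the layers: add the top two elements, multiply the result by 15, then add 3.
serial = tl.Serial(
    Addition(), Multiplication(), Addition()
)

# Inputs: the initial stack, with 3 on top, then 4, 15, and 3 below it.
x = (np.array([3]), np.array([4]), np.array([15]), np.array([3]))

serial.init(shapes.signature(x))  # initialize the model with the input signature
y = serial(x)
print(y)  # expected: array([108])
```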
The example with the two simple addition and multiplication functions that were coded together with the Serial combinator shows how stack semantics work in Trax.
2. The tl.Select combinator in the context of the Serial combinator
Having understood how stack semantics work in Trax, we will demonstrate how the tl.Select combinator works.
First example of tl.Select
Suppose we want to make the simple calculation (3 + 4) * 3 + 4. We can use Select to perform the calculations in the following manner: 4, 3, tl.Select([0,1,0,1]), add, mul, add.
The tl.Select combinator requires a list or tuple of 0-based indices to select elements relative to the top of the stack. For our example, the top of the stack is 3 (which is at index 0), then 4 (at index 1). tl.Select([0,1,0,1]) copies these elements, in that order, onto the top of the stack, which after the command is 3, 4, 3, 4 (reading from the top). The steps of the calculation for our example are shown in the table below. As in the previous table, the first column shows the operations made on the stack and the second column the contents of the stack after those operations are carried out.
| Stack Operations | Outputs |
| --- | --- |
| Push(4) | 4 |
| Push(3) | 4 3 |
| tl.Select([0,1,0,1]) | 4 3 4 3 |
| add (pop 3 and 4, push 7) | 4 3 7 |
| mul (pop 7 and 3, push 21) | 4 21 |
| add (pop 21 and 4, push 25) | 25 |
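A sketch of the corresponding model, reusing the Addition and Multiplication helpers from the previous section:

```python
serial = tl.Serial(
    tl.Select([0, 1, 0, 1]),  # duplicate the two inputs: stack becomes 3, 4, 3, 4 (top first)
    Addition(),               # 3 + 4 = 7
    Multiplication(),         # 7 * 3 = 21
    Addition()                # 21 + 4 = 25
)

x = (np.array([3]), np.array([4]))  # 3 is at index 0, i.e. on top of the stack
serial.init(shapes.signature(x))
y = serial(x)
print(y)  # expected: array([25])
```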
After processing all the inputs the stack contains 25 which is the answer we get above.
Second example of tl.Select
Suppose we want to make the simple calculation (3 + 4) * 4. We can use Select to perform the calculations in the following manner: 4, 3, tl.Select([0,1,0,1]), add, tl.Select([0], n_in=2), mul.
The example is a bit contrived, but it demonstrates the flexibility of the command. The second tl.Select pops two elements (specified by n_in) from the stack, starting from index 0 (i.e. the top of the stack). This means that 7 and 3 will be popped out because n_in = 2, but only 7 is placed back on top because the layer only selects [0]. As in the previous table, the first column shows the operations made on the stack and the second column the contents of the stack after those operations are carried out.
| Stack Operations | Outputs |
| --- | --- |
| Push(4) | 4 |
| Push(3) | 4 3 |
| tl.Select([0,1,0,1]) | 4 3 4 3 |
| add (pop 3 and 4, push 7) | 4 3 7 |
| tl.Select([0], n_in=2) (pop 7 and 3, push back 7) | 4 7 |
| mul (pop 7 and 4, push 28) | 28 |
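A sketch of the corresponding model:

```python
serial = tl.Serial(
    tl.Select([0, 1, 0, 1]),   # stack becomes 3, 4, 3, 4 (top first)
    Addition(),                # 3 + 4 = 7; the stack is now 7, 3, 4
    tl.Select([0], n_in=2),    # pop 7 and 3, push back only 7; the stack is now 7, 4
    Multiplication()           # 7 * 4 = 28
)

x = (np.array([3]), np.array([4]))
serial.init(shapes.signature(x))
y = serial(x)
print(y)  # expected: array([28])
```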
After processing all the inputs the stack contains 28 which is the answer we get above.
In summary, what Select does in this example is make a copy of the inputs so that they can be used further along in the stack of operations.
3. The tl.Residual combinator in the context of the Serial combinator
tl.Residual
Residual networks are frequently used to make deep models easier to train, and you will be using them in the assignment as well. Trax already has a built-in layer for this. The Residual layer computes the element-wise sum of the stack-top input with the output of the layer series. Let's first see how it is used in the code below:
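The original cell is not shown here; a sketch of the model it describes, using the Addition helper from earlier, would be:

```python
# Duplicate the two inputs, then wrap Addition in a Residual block.
serial = tl.Serial(
    tl.Select([0, 1, 0, 1]),  # stack becomes x1, x2, x1, x2 (top first)
    tl.Residual(Addition())   # (x1 + x2) plus the stack-top input x1
)
print(serial)
```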
Here, we use the Serial combinator to define our model. The inputs first go through a Select layer, followed by a Residual layer which takes the Fn: Addition() layer as an argument. What this means is that the Residual layer will take the stack-top input at that point and add it to the output of the Fn: Addition() layer. You can picture it like the diagram below, where x1 and x2 are the inputs to the model:
Now, let's try running our model with some sample inputs and see the result:
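For example, with 3 on top of the stack and 4 beneath it (the same values used throughout this lab):

```python
x1 = np.array([3])
x2 = np.array([4])

serial.init(shapes.signature((x1, x2)))
y = serial((x1, x2))
print(y)  # expected: (array([10]), array([3]), array([4]))
```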
As you can see, the Residual layer remembers the stack-top input (i.e. 3) and adds it to the result of the Fn: Addition() layer (i.e. 3 + 4 = 7). The output of Residual(Addition()) is then 3 + 7 = 10 and is pushed onto the stack.
On a different note, you'll notice that the Select layer has 4 outputs but the Fn: Addition() layer only pops 2 inputs from the stack. This means the duplicate inputs (i.e. the 2 rightmost arrows of the Select outputs in the figure above) remain in the stack. This is why you still see them in the output of our simple serial network (i.e. array([3]), array([4])). This is useful if you want to use these duplicate inputs in another layer further down the network.
Modifying the network
To strengthen your understanding, you can modify the network above and examine the outputs you get. For example, you can pass the Fn: Multiplication() layer instead in the Residual block:
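A sketch of that modification:

```python
serial = tl.Serial(
    tl.Select([0, 1, 0, 1]),       # duplicate the inputs as before
    tl.Residual(Multiplication())  # (x1 * x2) plus the stack-top input x1
)
```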
This means you'll have a different output that will be added to the stack top input saved by the Residual block. The diagram becomes like this:
And you'll get 3 + (3 * 4) = 15 as output of the Residual block:
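Running it with the same inputs:

```python
x1 = np.array([3])
x2 = np.array([4])

serial.init(shapes.signature((x1, x2)))
y = serial((x1, x2))
print(y)  # expected: (array([15]), array([3]), array([4]))
```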