Kernel: Python 3
Introduction to Theano
Credits: Forked from summerschool2015 by mila-udem
Slides
Refer to the associated Introduction to Theano slides and use this notebook for hands-on practice of the concepts.
Basic usage
Defining an expression
In [1]:
In [2]:
Graph visualization
In [3]:
Out[3]:
dot [@A] ''
|x [@B]
|W [@C]
In [4]:
Out[4]:
sigmoid [@A] ''
|Elemwise{add,no_inplace} [@B] ''
|dot [@C] ''
| |x [@D]
| |W [@E]
|b [@F]
Compiling a Theano function
In [5]:
Graph visualization
In [6]:
Out[6]:
CGemv{inplace} [@A] '' 3
|AllocEmpty{dtype='float64'} [@B] '' 2
| |Shape_i{1} [@C] '' 1
| |W [@D]
|TensorConstant{1.0} [@E]
|InplaceDimShuffle{1,0} [@F] 'W.T' 0
| |W [@D]
|x [@G]
|TensorConstant{0.0} [@H]
In [7]:
Out[7]:
Elemwise{ScalarSigmoid}[(0, 0)] [@A] '' 2
|CGemv{no_inplace} [@B] '' 1
|b [@C]
|TensorConstant{1.0} [@D]
|InplaceDimShuffle{1,0} [@E] 'W.T' 0
| |W [@F]
|x [@G]
|TensorConstant{1.0} [@D]
In [8]:
Out[8]:
The output file is available at pydotprint_f.png
In [9]:
Out[9]:
In [10]:
Out[10]:
The output file is available at pydotprint_g.png
In [11]:
Out[11]:
The output file is available at pydotprint_h.png
Executing a Theano function
In [12]:
Out[12]:
array([ 1.79048354, 0.03158954, -0.26423186])
In [13]:
Out[13]:
array([ 0.9421594 , 0.73722395, 0.67606977])
In [14]:
Out[14]:
[array([ 1.79048354, 0.03158954, -0.26423186]),
array([ 0.9421594 , 0.73722395, 0.67606977])]
In [15]:
Out[15]:
[array([ 2.79048354, 1.03158954, 0.73576814]),
array([ 0.9421594 , 0.73722395, 0.67606977])]
Graph definition and syntax
Graph structure
In [16]:
Out[16]:
The output file is available at pydotprint_f_notcompact.png
Strong typing
Broadcasting tensors
In [17]:
Out[17]:
(True, False)
In [18]:
Out[18]:
(False, True)
In [19]:
Out[19]:
[[ 1.1 2.1 3.1]
[ 1.2 2.2 3.2]]
Graph Transformations
Substitution and Cloning
The givens keyword
In [20]:
Out[20]:
array([ 1.90651511, 0.60431744, -0.64253361])
Cloning with replacement
In [21]:
Out[21]:
array([ 1.90651511, 0.60431744, -0.64253361])
Gradient
Using theano.grad
In [22]:
Using the gradients
In [23]:
Out[23]:
[array(0.6137821438190066), array([[ 0.01095277, 0.07045955, 0.051161 ],
[ 0.01889131, 0.12152849, 0.0882424 ],
[ 0.01555008, 0.10003427, 0.07263534],
[ 0.01048429, 0.06744584, 0.04897273]]), array([ 0.03600015, 0.23159028, 0.16815877])]
In [24]:
Out[24]:
[array(0.6137821438190066), array([[ 0.49561888, -0.14531026, 0.64257244],
[ 1.52114073, -0.24630622, -0.2429612 ],
[ 1.57765781, 0.7574313 , -0.47673792],
[ 0.54151161, -0.47016228, -0.47062703]]), array([ 0.99639999, 0.97684097, 0.98318412])]
In [25]:
Out[25]:
The output file is available at pydotprint_cost_and_upd.png
Shared variables
Update values
In [26]:
Using shared variables
In [27]:
Out[27]:
[ 1.78587062 0.00189954 -0.28566499]
In [28]:
Out[28]:
[ 0.94151144 0.72221187 0.66391952]
Updating shared variables
In [29]:
In [30]:
Out[30]:
The output file is available at pydotprint_cost_and_perform_updates.png
Advanced Topics
Extending Theano
The easy way: Python
In [31]: