
GitHub Repository: pytorch/tutorials
Path: blob/main/beginner_source/examples_tensor/polynomial_numpy.py
# -*- coding: utf-8 -*-
"""
Warm-up: numpy
--------------

A third order polynomial, trained to predict :math:`y=\sin(x)` from :math:`-\pi`
to :math:`\pi` by minimizing squared Euclidean distance.

This implementation uses numpy to manually compute the forward pass, loss, and
backward pass.

A numpy array is a generic n-dimensional array; it does not know anything about
deep learning or gradients or computational graphs, and is just a way to perform
generic numeric computations.
"""
import numpy as np
import math

# Create random input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
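Not part of the original tutorial, but a useful habit with hand-derived backprop: check an analytic gradient against a central finite difference. The sketch below does this for ``grad_a``; the fixed weights and ``eps`` are arbitrary illustrative choices.

```python
import math

import numpy as np

# Same data as the tutorial script.
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Arbitrary fixed weights (illustrative; any values would do).
a, b, c, d = 0.1, -0.2, 0.3, -0.4

def loss(a, b, c, d):
    # Sum of squared errors, as in the training loop.
    return np.square(a + b * x + c * x ** 2 + d * x ** 3 - y).sum()

# Analytic gradient w.r.t. a, via the same chain rule as the script:
# dL/da = sum(2 * (y_pred - y)).
grad_a = (2.0 * (a + b * x + c * x ** 2 + d * x ** 3 - y)).sum()

# Central finite-difference approximation of the same derivative.
eps = 1e-6
fd_a = (loss(a + eps, b, c, d) - loss(a - eps, b, c, d)) / (2 * eps)
print(grad_a, fd_a)  # the two values should agree closely
```

The same check applies to ``grad_b``, ``grad_c``, and ``grad_d`` by perturbing the corresponding weight.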
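As an aside (not in the original tutorial): since this is an ordinary least-squares problem, NumPy's ``np.polyfit`` gives the optimal cubic in closed form, providing a reference the gradient-descent result should approach; the ``*_ref`` names below are ours.

```python
import math

import numpy as np

x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# polyfit returns coefficients from highest degree to lowest.
d_ref, c_ref, b_ref, a_ref = np.polyfit(x, y, 3)

# sin is odd and the interval is symmetric, so the even
# coefficients of the least-squares cubic are (numerically) zero.
print(f'reference: y = {a_ref} + {b_ref} x + {c_ref} x^2 + {d_ref} x^3')
```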