Path: blob/main/beginner_source/examples_tensor/polynomial_numpy.py
# -*- coding: utf-8 -*-
"""
Warm-up: numpy
--------------

A third order polynomial, trained to predict :math:`y=\sin(x)` from :math:`-\pi`
to :math:`\pi` by minimizing squared Euclidean distance.

This implementation uses numpy to manually compute the forward pass, loss, and
backward pass.

A numpy array is a generic n-dimensional array; it does not know anything about
deep learning or gradients or computational graphs, and is just a way to perform
generic numeric computations.
"""
import numpy as np
import math

# Create random input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
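The hand-written backward pass above computes each gradient analytically, e.g. for the coefficient ``b``, ``d(loss)/db = sum(2 * (y_pred - y) * x)``. A quick way to gain confidence in such hand-derived gradients is to compare one of them against a central finite difference of the loss. The sketch below is not part of the original tutorial; the helper ``loss`` and the check on ``grad_b`` are illustrative assumptions:

```python
import numpy as np
import math

# Same data setup as the tutorial above
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

def loss(a, b, c, d):
    # Squared Euclidean distance between the cubic's predictions and sin(x)
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    return np.square(y_pred - y).sum()

# Arbitrary starting coefficients (seeded for reproducibility)
rng = np.random.default_rng(0)
a, b, c, d = rng.standard_normal(4)

# Analytic gradient for b, exactly as derived in the training loop
y_pred = a + b * x + c * x ** 2 + d * x ** 3
grad_b = (2.0 * (y_pred - y) * x).sum()

# Central finite-difference estimate of the same gradient
eps = 1e-6
grad_b_fd = (loss(a, b + eps, c, d) - loss(a, b - eps, c, d)) / (2 * eps)

print(grad_b, grad_b_fd)
```

The two values should agree to many significant digits; if they diverge, the analytic derivation (not the finite difference) is usually the suspect.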