GitHub Repository: y33-j3T/Coursera-Deep-Learning
Path: blob/master/Improving Deep Neural Networks Hyperparameter tuning, Regularization and Optimization/week6/__pycache__/opt_utils.cpython-36.pyc

# opt_utils.py -- helper routines for the week 6 optimization assignment.
# Source recovered from the compiled opt_utils.cpython-36.pyc above: the
# docstrings survive verbatim; the function bodies follow its bytecode.

import numpy as np
import matplotlib.pyplot as plt
import h5py
import scipy.io
import sklearn
import sklearn.datasets


def sigmoid(x):
    """
    Compute the sigmoid of x

    Arguments:
    x -- A scalar or numpy array of any size.

    Return:
    s -- sigmoid(x)
    """
    s = 1 / (1 + np.exp(-x))
    return s


def relu(x):
    """
    Compute the relu of x

    Arguments:
    x -- A scalar or numpy array of any size.

    Return:
    s -- relu(x)
    """
    s = np.maximum(0, x)
    return s
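

# Illustrative sanity check (a hypothetical helper, not part of the original
# module): both activations accept scalars or numpy arrays and operate
# elementwise.
def _demo_activations():
    assert abs(sigmoid(0.0) - 0.5) < 1e-12
    assert np.allclose(relu(np.array([-1.5, 0.0, 3.0])), [0.0, 0.0, 3.0])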


def load_params_and_grads(seed=1):
    # Draw a fixed set of random parameters and gradients (used by the
    # optimizer test cases).
    np.random.seed(seed)
    W1 = np.random.randn(2, 3)
    b1 = np.random.randn(2, 1)
    W2 = np.random.randn(3, 3)
    b2 = np.random.randn(3, 1)

    dW1 = np.random.randn(2, 3)
    db1 = np.random.randn(2, 1)
    dW2 = np.random.randn(3, 3)
    db2 = np.random.randn(3, 1)

    return W1, b1, W2, b2, dW1, db1, dW2, db2


def initialize_parameters(layer_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the dimensions of each layer in our network
    
    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layer_dims[l], layer_dims[l-1])
                    b1 -- bias vector of shape (layer_dims[l], 1)
                    Wl -- weight matrix of shape (layer_dims[l], layer_dims[l-1])
                    bl -- bias vector of shape (layer_dims[l], 1)
                    
    Tips:
    - For example: the layer_dims for the "Planar Data classification model" would have been [2,2,1]. 
    This means W1's shape was (2,2), b1 was (2,1), W2 was (1,2) and b2 was (1,1). Now you have to generalize it!
    - In the for loop, use parameters['W' + str(l)] to access Wl, where l is the iterative integer.
    """
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # number of layers in the network

    for l in range(1, L):
        # He initialization: Gaussian weights scaled by sqrt(2 / fan-in).
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * np.sqrt(2.0 / layer_dims[l - 1])
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

        assert parameters['W' + str(l)].shape == (layer_dims[l], layer_dims[l - 1])
        assert parameters['b' + str(l)].shape == (layer_dims[l], 1)

    return parameters
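

# Illustrative shape check for the [2,2,1] example from the docstring above
# (a hypothetical helper, not part of the original file):
def _demo_initialize_parameters():
    params = initialize_parameters([2, 2, 1])
    assert params['W1'].shape == (2, 2)
    assert params['b1'].shape == (2, 1)
    assert params['W2'].shape == (1, 2)
    assert params['b2'].shape == (1, 1)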


def compute_cost(a3, Y):
    """
    Implement the cost function
    
    Arguments:
    a3 -- post-activation, output of forward propagation
    Y -- "true" labels vector, same shape as a3
    
    Returns:
    cost - value of the cost function
    """
    m = Y.shape[1]

    logprobs = np.multiply(-np.log(a3), Y) + np.multiply(-np.log(1 - a3), 1 - Y)
    cost = 1. / m * np.sum(logprobs)

    return cost
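

# Illustrative example (not part of the original file): the cost is the
# cross-entropy averaged over the m = 2 examples, here each predicted with
# probability 0.9 for its true class, giving -log(0.9) ~= 0.105.
def _demo_compute_cost():
    Y = np.array([[1, 0]])
    a3 = np.array([[0.9, 0.1]])
    assert np.isclose(compute_cost(a3, Y), -np.log(0.9))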


def forward_propagation(X, parameters):
    """
    Implements the forward propagation (and computes the loss) presented in Figure 2.
    
    Arguments:
    X -- input dataset, of shape (input size, number of examples)
    parameters -- python dictionary containing your parameters "W1", "b1", "W2", "b2", "W3", "b3":
                    W1 -- weight matrix of shape ()
                    b1 -- bias vector of shape ()
                    W2 -- weight matrix of shape ()
                    b2 -- bias vector of shape ()
                    W3 -- weight matrix of shape ()
                    b3 -- bias vector of shape ()
    
    Returns:
    loss -- the loss function (vanilla logistic loss)
    """
    # retrieve parameters
    W1 = parameters["W1"]
    b1 = parameters["b1"]
    W2 = parameters["W2"]
    b2 = parameters["b2"]
    W3 = parameters["W3"]
    b3 = parameters["b3"]

    # LINEAR -> RELU -> LINEAR -> RELU -> LINEAR -> SIGMOID
    z1 = np.dot(W1, X) + b1
    a1 = relu(z1)
    z2 = np.dot(W2, a1) + b2
    a2 = relu(z2)
    z3 = np.dot(W3, a2) + b3
    a3 = sigmoid(z3)

    cache = (z1, a1, W1, b1, z2, a2, W2, b2, z3, a3, W3, b3)

    return a3, cache


def backward_propagation(X, Y, cache):
    """
    Implement the backward propagation presented in Figure 2.
    
    Arguments:
    X -- input dataset, of shape (input size, number of examples)
    Y -- true "label" vector (containing 0 if cat, 1 if non-cat)
    cache -- cache output from forward_propagation()
    
    Returns:
    gradients -- A dictionary with the gradients with respect to each parameter, activation and pre-activation variables
    """
    m = X.shape[1]
    (z1, a1, W1, b1, z2, a2, W2, b2, z3, a3, W3, b3) = cache

    dz3 = 1. / m * (a3 - Y)
    dW3 = np.dot(dz3, a2.T)
    db3 = np.sum(dz3, axis=1, keepdims=True)

    da2 = np.dot(W3.T, dz3)
    dz2 = np.multiply(da2, np.int64(a2 > 0))
    dW2 = np.dot(dz2, a1.T)
    db2 = np.sum(dz2, axis=1, keepdims=True)

    da1 = np.dot(W2.T, dz2)
    dz1 = np.multiply(da1, np.int64(a1 > 0))
    dW1 = np.dot(dz1, X.T)
    db1 = np.sum(dz1, axis=1, keepdims=True)

    gradients = {"dz3": dz3, "dW3": dW3, "db3": db3,
                 "da2": da2, "dz2": dz2, "dW2": dW2, "db2": db2,
                 "da1": da1, "dz1": dz1, "dW1": dW1, "db1": db1}

    return gradients
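

# Illustrative end-to-end check (a hypothetical helper, not part of the
# original file): every gradient returned by backward_propagation has the
# same shape as the parameter it updates.
def _demo_forward_backward():
    np.random.seed(1)
    params = initialize_parameters([2, 5, 3, 1])
    X = np.random.randn(2, 10)
    Y = (np.random.rand(1, 10) > 0.5).astype(int)
    a3, cache = forward_propagation(X, params)
    grads = backward_propagation(X, Y, cache)
    for key in ('W1', 'b1', 'W2', 'b2', 'W3', 'b3'):
        assert grads['d' + key].shape == params[key].shape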


def predict(X, y, parameters):
    """
    This function is used to predict the results of an n-layer neural network.
    
    Arguments:
    X -- data set of examples you would like to label
    parameters -- parameters of the trained model
    
    Returns:
    p -- predictions for the given dataset X
    """
    m = X.shape[1]
    p = np.zeros((1, m), dtype=int)

    # Forward propagation
    a3, caches = forward_propagation(X, parameters)

    # Convert probabilities to 0/1 predictions with a 0.5 threshold
    for i in range(0, a3.shape[1]):
        if a3[0, i] > 0.5:
            p[0, i] = 1
        else:
            p[0, i] = 0

    # Print results
    print("Accuracy: " + str(np.mean((p[0, :] == y[0, :]))))

    return p


def load_2D_dataset():
    data = scipy.io.loadmat('datasets/data.mat')
    train_X = data['X'].T
    train_Y = data['y'].T
    test_X = data['Xval'].T
    test_Y = data['yval'].T

    plt.scatter(train_X[0, :], train_X[1, :], c=train_Y, s=40, cmap=plt.cm.Spectral)

    return train_X, train_Y, test_X, test_Y


def plot_decision_boundary(model, X, y):
    # Set min and max values and give the grid some padding
    x_min, x_max = X[0, :].min() - 1, X[0, :].max() + 1
    y_min, y_max = X[1, :].min() - 1, X[1, :].max() + 1
    h = 0.01
    # Generate a grid of points with distance h between them
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
    # Predict the function value for the whole grid
    Z = model(np.c_[xx.ravel(), yy.ravel()])
    Z = Z.reshape(xx.shape)
    # Plot the contour and training examples
    plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral)
    plt.ylabel('x2')
    plt.xlabel('x1')
    plt.scatter(X[0, :], X[1, :], c=y, cmap=plt.cm.Spectral)
    plt.show()


def predict_dec(parameters, X):
    """
    Used for plotting decision boundary.
    
    Arguments:
    parameters -- python dictionary containing your parameters 
    X -- input data of size (m, K)
    
    Returns:
    predictions -- vector of predictions of our model (red: 0 / blue: 1)
    """
    # Predict using forward propagation and a classification threshold of 0.5
    a3, cache = forward_propagation(X, parameters)
    predictions = (a3 > 0.5)
    return predictions
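

# Illustrative pairing of the two plotting helpers (hypothetical, not part
# of the original file). plot_decision_boundary hands the model rows of
# (x1, x2) grid points, so predict_dec's (features, examples) input needs
# the transpose:
def _demo_plot_boundary(parameters, train_X, train_Y):
    plot_decision_boundary(lambda x: predict_dec(parameters, x.T),
                           train_X, train_Y)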


def load_dataset():
    np.random.seed(3)
    train_X, train_Y = sklearn.datasets.make_moons(n_samples=300, noise=0.2)
    # Visualize the data
    plt.scatter(train_X[:, 0], train_X[:, 1], c=train_Y, s=40, cmap=plt.cm.Spectral)
    train_X = train_X.T
    train_Y = train_Y.reshape((1, train_Y.shape[0]))

    return train_X, train_Y
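

# A minimal batch gradient-descent loop wired from the helpers above -- an
# illustrative sketch, not part of the original module; the layer sizes,
# learning rate, and iteration count are arbitrary choices.
def _demo_training_loop(learning_rate=0.1, num_iterations=1000):
    train_X, train_Y = load_dataset()
    parameters = initialize_parameters([train_X.shape[0], 5, 2, 1])
    for _ in range(num_iterations):
        a3, cache = forward_propagation(train_X, parameters)
        grads = backward_propagation(train_X, train_Y, cache)
        for key in ('W1', 'b1', 'W2', 'b2', 'W3', 'b3'):
            parameters[key] = parameters[key] - learning_rate * grads['d' + key]
    a3, _ = forward_propagation(train_X, parameters)
    print("final cost:", compute_cost(a3, train_Y))
    return parameters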