Monthly Archive: March 2017

DeepLearning: Explaining the Linear Class

import numpy as np

# Node is the graph base class, defined elsewhere in the framework.
class Linear(Node):
    """
    Represents a node that performs a linear transform.
    """
    def __init__(self, X, W, b):
        # The base class (Node) constructor. Weights and bias
        # are treated like inbound nodes.
        Node.__init__(self, [X, W, b])

    def forward(self):
        """
        Performs the math behind a linear transform.
        """
        X = self.inbound_nodes[0].value
        W = self.inbound_nodes[1].value
        b = self.inbound_nodes[2].value
        self.value = np.dot(X, W) + b

    def backward(self):
        """
        Calculates the gradient based on the output values.
        """
        # Initialize a partial for each of the inbound_nodes.
        self.gradients = {n: np.zeros_like(n.value) for n in self.inbound_nodes}
        # Cycle through the outputs. The gradient will change depending
        # on each output, so the gradients are summed over all outputs.
        for n in self.outbound_nodes:
            # Get the partial of the cost with respect to this node.
            grad_cost = n.gradients[self]
            # Set the partial of the loss with respect to this node's inputs.
            self.gradients[self.inbound_nodes[0]] += np.dot(grad_cost, self.inbound_nodes[1].value.T)
            # Set the partial of the loss with respect to this node's weights.
            self.gradients[self.inbound_nodes[1]] += np.dot(self.inbound_nodes[0].value.T, grad_cost)
            # Set the partial of the loss with respect to this node's bias.
            self.gradients[self.inbound_nodes[2]] += np.sum(grad_cost, axis=0, keepdims=False)

1. The loss with respect to inputs
self.gradients[self.inbound_nodes[0]] += np.dot(grad_cost, self.inbound_nodes[1].value.T)
By the chain rule, the gradient of the cost with respect to a node equals the upstream gradient (the cost with respect to this node's output) times this node's own local derivative.
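In symbols (my own notation, not from the original code): write the linear transform as Z = XW + b and let ∂C/∂Z be grad_cost. The three gradient lines in backward() then compute:

\[
\frac{\partial C}{\partial X} = \frac{\partial C}{\partial Z}\,W^{\top},\qquad
\frac{\partial C}{\partial W} = X^{\top}\,\frac{\partial C}{\partial Z},\qquad
\frac{\partial C}{\partial b} = \sum_{i}\left(\frac{\partial C}{\partial Z}\right)_{i,:}
\]

where the bias sum runs over the rows (the batch dimension), which is exactly np.sum(grad_cost, axis=0).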

Explanation 1:

np.dot(grad_cost, self.inbound_nodes[1].value.T)

A Linear node has three input parameters, namely inputs, weights, and bias, which correspond to

self.inbound_nodes[0], self.inbound_nodes[1], self.inbound_nodes[2]

So, each node will pass on the cost gradient to its inbound nodes and each node will get the cost gradient from its outbound nodes. Then, for each node we'll need to calculate a gradient that's the cost gradient times the gradient of that node with respect to its inputs.

So the derivative of the Linear node's output with respect to inputs is just weights, which gives np.dot(grad_cost, weights.T). Here grad_cost is the rate of change passed in from the Linear node's output nodes.

np.dot(self.inbound_nodes[0].value.T, grad_cost)

By the same reasoning, the derivative with respect to weights is inputs, so the gradient is np.dot(inputs.T, grad_cost).

np.sum(grad_cost, axis=0, keepdims=False)

As for the bias, the derivative of the Linear output with respect to bias is always 1, so the gradient is 1 · grad_cost, summed over the batch dimension (hence np.sum along axis=0).
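To convince yourself of all three formulas, here is a minimal numeric check (my own sketch, not part of the framework), using the cost C = sum(Z) so that grad_cost is a matrix of ones:

import numpy as np

X = np.random.randn(4, 3)           # batch of 4 samples, 3 features
W = np.random.randn(3, 2)           # 3 features in, 2 out
b = np.random.randn(2)

grad_cost = np.ones((4, 2))         # dC/dZ for C = np.sum(Z)

grad_X = np.dot(grad_cost, W.T)     # gradient w.r.t. inputs
grad_W = np.dot(X.T, grad_cost)     # gradient w.r.t. weights
grad_b = np.sum(grad_cost, axis=0)  # gradient w.r.t. bias

# Finite-difference check on one weight entry:
eps = 1e-6
W_shift = W.copy()
W_shift[0, 0] += eps
numeric = (np.sum(np.dot(X, W_shift) + b) - np.sum(np.dot(X, W) + b)) / eps
print(np.isclose(numeric, grad_W[0, 0]))   # True, up to rounding error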

Explanation 2: why +=?

Because every node passes its output on to each of its output nodes, during backpropagation the gradient at a node is recovered by adding up the share that comes back from every one of those output nodes, which is why += is used.
This also explains the loop for n in self.outbound_nodes: its purpose is to iterate over each node's output nodes, as illustrated below.
If a node has multiple outgoing nodes, you just sum up the gradients from each node.
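A tiny illustration of this accumulation, with made-up numbers:

import numpy as np

# Hypothetical gradients sent back by two different output nodes:
grad_from_out1 = np.array([[0.5, -1.0]])
grad_from_out2 = np.array([[2.0, 0.5]])

grad = np.zeros((1, 2))                       # like np.zeros_like(n.value)
for g in (grad_from_out1, grad_from_out2):    # like `for n in self.outbound_nodes:`
    grad += g
print(grad)                                   # [[ 2.5 -0.5]]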

Note 1:
Be careful to distinguish backpropagation and gradient descent as two separate steps: backpropagation finds the gradients, and therefore the direction of change; gradient descent then uses them to minimize the error.

To find the gradient, you just multiply the gradients for all nodes in front of it going backwards from the cost. This is the idea behind backpropagation. The gradients are passed backwards through the network and used with gradient descent to update the weights and biases.

The end goal:

Backpropagation only computes the derivative part; gradient descent is the whole optimization process.
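For example, once backward() has filled in the gradients, one step of gradient descent is just a few lines (a minimal sketch, assuming each trainable node stores its own partial under t.gradients[t], as in the class above):

def sgd_update(trainables, learning_rate=0.01):
    # Gradient descent: nudge each parameter a small step against its gradient.
    for t in trainables:
        t.value -= learning_rate * t.gradients[t]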


Fixing defaults::qt-5.6.2-vc14_3

This problem appeared while I was installing TensorFlow with conda, and it was not the first time. The last time it appeared, I chose to reinstall Miniconda, which solved it. This time I decided to actually fix it!

ERROR conda.core.link:_execute_actions(330): An error occurred while installing package 'defaults::qt-5.6.2-vc14_3'. UnicodeDecodeError('utf-8', b'\xd2\xd1\xb8\xb4\xd6\xc6 1 \xb8\xf6\xce\xc4\xbc\xfe\xa1\xa3\r\n', 0, 1, 'invalid continuation byte') Attempting to roll back. UnicodeDecodeError('utf-8', b'\xd2\xd1\xb8\xb4\xd6\xc6 1 \xb8\xf6\xce\xc4\xbc\xfe\xa1\xa3\r\n', 0, 1, 'invalid continuation byte')

ERROR conda.core.link:_execute_actions(319): An error occurred while installing package 'defaults::qt-5.6.2-vc9_3'
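Incidentally, the bytes conda fails to decode are not UTF-8 at all: they are the GBK-encoded output of the Windows copy command. A quick check in Python (my own illustration):

>>> b'\xd2\xd1\xb8\xb4\xd6\xc6 1 \xb8\xf6\xce\xc4\xbc\xfe\xa1\xa3\r\n'.decode('gbk')
'已复制 1 个文件。\r\n'

That is just the Chinese message for "1 file(s) copied.", which is also why the fix below involves the chardet encoding-detection module.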

python | 解决defaults::qt-5.6.2-vc14_3

Both of these articles give the same kind of solution, but when I tried their method a new problem appeared: ModuleNotFoundError: No module named 'chardet'

So I installed chardet directly in the root environment with conda:

conda install chardet

After that everything was fine, and I did not even need the extra content the two articles above add~

Upgrading My Laptop with an SSD

My computer kept blue-screening for no apparent reason! In a fit of anger I bought a solid-state drive. Fortunately the motherboard only has SATA2 ports, so I could save some money and buy a lower-spec SSD. It still cost 500 yuan though, which hurts. Then again, the school just handed out another 1000 yuan, and considering this antique was bought back in 2012 when I finished the college entrance exam, you could say it is enjoying its third spring.

Around my sophomore year I upgraded the memory with an extra 4 GB stick. That was the second spring (ಡωಡ)hiahiahia

With nothing much to do, I tried publishing this post from my phone to see how it works. Apart from the small screen, everything is acceptable.

Being a complete beginner, I went and read the relevant Zhihu answers just to choose an SSD, and learned quite a bit.

[How to choose a solid-state drive?] dyoule: https://www.zhihu.com/question/20369676/answer/99405990?utm_source=com.meizu.notepaper&utm_medium=social (shared from Zhihu)

Very9s: Year Two

I am glad I have kept this going. I have been quite busy lately, so I have been writing less, but I have prepared a lot of material. After a while I will sort it out and write it up.

Lately my own project has left me swamped; the pressure is enormous~~ I do not want to say more now; I will write about it when I get the chance~


Added on 2017.5.20:

I deleted the old site that sat on GitHub. So that I can still reminisce about her later, I took a screenshot of the homepage.

That theme was one I made myself, modeled on a theme from Lofter. It took me maybe a week. So hard to let go.