
CS5740: Natural Language Processing, Spring 2017

Computation Graphs

Instructor: Yoav Artzi

From Practical Neural Networks for NLP / Chris Dyer, Yoav Goldberg, Graham Neubig / EMNLP 2016

Computation Graphs

- The descriptive language of deep learning models
- Functional description of the required computation
- Can be instantiated to do two types of computation (a sketch follows this list):
  - Forward computation
  - Backward computation
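A minimal sketch of the two modes on the running expression y = x^T A x + b·x + c, assuming PyTorch as the toolkit (the slides build on the DyNet tutorial, so this is a stand-in, not the course's API); the variable names mirror the expression, and the closed-form check at the end is only a sanity test.

```python
import torch

torch.manual_seed(0)
n = 4
x = torch.randn(n, requires_grad=True)      # input vector node
A = torch.randn(n, n, requires_grad=True)   # matrix node
b = torch.randn(n, requires_grad=True)
c = torch.tensor(0.5, requires_grad=True)   # scalar node

# Forward computation: evaluate the expression to get the value of y.
y = x @ A @ x + b @ x + c

# Backward computation: propagate derivatives of y back to every input node.
y.backward()

# Sanity check against the closed form dy/dx = (A^T + A) x + b.
with torch.no_grad():
    expected = (A.T + A) @ x + b
assert torch.allclose(x.grad, expected)
print(y.item(), x.grad)
```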

expression: y = x^T A x + b·x + c
graph: [figure: a single value node x]

A node is a {tensor, matrix, vector, scalar} value.

An edge represents a function argument (and also a data dependency). They are just pointers to nodes.

A node with an incoming edge is a function of that edge's tail node.

A node knows how to compute its value and the value of its derivative w.r.t. each argument (edge) times a derivative of an arbitrary input ∂F/∂f(u):

f(u) = u^T

(∂f(u)/∂u) · (∂F/∂f(u)) = (∂F/∂f(u))^T

graph: [figure: node x feeding the function node f(u) = u^T]
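A hand-rolled sketch of that idea, assuming nothing beyond NumPy: each node object computes its value (forward) and maps an upstream derivative ∂F/∂f(u) to derivatives for its arguments (backward). The class names are illustrative, not from the tutorial; Transpose implements exactly the identity above.

```python
import numpy as np

class Transpose:
    """f(u) = u^T"""
    def forward(self, u):
        self.u = u
        return u.T

    def backward(self, dF_df):
        # The rule on the slide: (df(u)/du) applied to dF/df(u) is just (dF/df(u))^T.
        return dF_df.T

class MatMul:
    """f(U, V) = UV"""
    def forward(self, U, V):
        self.U, self.V = U, V
        return U @ V

    def backward(self, dF_df):
        # Derivative w.r.t. each argument (edge), times the upstream derivative dF/df.
        dU = dF_df @ self.V.T
        dV = self.U.T @ dF_df
        return dU, dV

# Tiny usage example with a 2x3 matrix.
t = Transpose()
value = t.forward(np.arange(6.0).reshape(2, 3))   # forward: the 3x2 transposed value
grad = t.backward(np.ones((3, 2)))                # backward: upstream 3x2 -> 2x3 gradient
```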

expression: y = x^T A x + b·x + c
graph: [figure: nodes x and A, with x feeding f(u) = u^T and its output feeding f(U, V) = UV together with A]

Functions can be nullary, unary, binary, ..., n-ary. Often they are unary or binary.

f(U, V) = UV
f(u) = u^T

expression: y = x^T A x + b·x + c
graph: [figure: nodes x and A combined through f(u) = u^T, f(U, V) = UV, and f(M, v) = Mv]

f(M, v) = Mv
f(U, V) = UV
f(u) = u^T

Computation graphs are directed and acyclic (usually).
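Because the graph is acyclic, a valid evaluation order always exists; the sketch below computes one by depth-first post-order traversal. The adjacency-list encoding and the node names for the running example are my own choices, not something the slides specify.

```python
# Sketch: since the computation graph is a DAG, its nodes can be sorted
# topologically and evaluated in that order (forward) or in reverse (backward).
def topological_order(edges, nodes):
    """edges: dict mapping each node to the nodes it feeds into."""
    visited, order = set(), []

    def visit(node):
        if node in visited:
            return
        visited.add(node)
        for successor in edges.get(node, []):
            visit(successor)
        order.append(node)          # post-order: a node is appended after its consumers

    for node in nodes:
        visit(node)
    return list(reversed(order))    # reversed: arguments come before the functions using them

# Running example y = x^T A x + b.x + c with illustrative intermediate node names.
edges = {
    "x": ["xT", "xT_A_x", "b.x"],
    "A": ["xT_A"],
    "xT": ["xT_A"],
    "xT_A": ["xT_A_x"],
    "b": ["b.x"],
    "xT_A_x": ["y"],
    "b.x": ["y"],
    "c": ["y"],
}
nodes = ["x", "A", "b", "c", "xT", "xT_A", "xT_A_x", "b.x", "y"]
print(topological_order(edges, nodes))
```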

expression: y = x^T A x + b·x + c
graph: [figure: nodes A and x]

f(M, v) = Mv
f(U, V) = UV
f(u) = u^T

A single node can also compute the whole quadratic form, together with its derivatives w.r.t. each argument (edge):

f(x, A) = x^T A x

∂f(x, A)/∂x = (A^T + A) x

∂f(x, A)/∂A = x x^T
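A quick numerical sanity check of those two closed-form derivatives by central finite differences (the dimensions, step size, and tolerances are arbitrary choices, not from the slides):

```python
# Finite-difference check of the derivatives of f(x, A) = x^T A x.
import numpy as np

rng = np.random.default_rng(0)
n, eps = 4, 1e-6
x = rng.standard_normal(n)
A = rng.standard_normal((n, n))

f = lambda x, A: x @ A @ x

# Closed forms from the slide.
df_dx = (A.T + A) @ x
df_dA = np.outer(x, x)

# Central differences, one coordinate at a time.
num_dx = np.zeros(n)
for i in range(n):
    e = np.zeros(n); e[i] = eps
    num_dx[i] = (f(x + e, A) - f(x - e, A)) / (2 * eps)

num_dA = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n)); E[i, j] = eps
        num_dA[i, j] = (f(x, A + E) - f(x, A - E)) / (2 * eps)

assert np.allclose(df_dx, num_dx, atol=1e-4)
assert np.allclose(df_dA, num_dA, atol=1e-4)
```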

expression: y = x^T A x + b·x + c
graph: [figure: nodes x, A, b, and c combined through the function nodes below]

f(x1, x2, x3) = Σ_i x_i
f(M, v) = Mv
f(U, V) = UV
f(u) = u^T
f(u, v) = u · v
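To tie the pieces together, a sketch that composes y = x^T A x + b·x + c out of exactly the primitive functions listed above and compares against direct evaluation; the Python function names and column-vector shapes are my own conventions.

```python
import numpy as np

def transpose(u):        return u.T            # f(u) = u^T
def matmul(U, V):        return U @ V          # f(U, V) = UV
def matvec(M, v):        return M @ v          # f(M, v) = Mv
def dot(u, v):           return (u * v).sum()  # f(u, v) = u . v (columns of equal shape)
def add3(x1, x2, x3):    return x1 + x2 + x3   # f(x1, x2, x3) = sum_i x_i

rng = np.random.default_rng(0)
n = 3
x = rng.standard_normal((n, 1))   # column vector
b = rng.standard_normal((n, 1))
c = rng.standard_normal()
A = rng.standard_normal((n, n))

# Walk the graph from the leaves (x, A, b, c) up to the output y.
xT      = transpose(x)            # 1 x n
xT_A    = matmul(xT, A)           # 1 x n
xT_A_x  = matvec(xT_A, x)         # 1 x 1
b_dot_x = dot(b, x)               # scalar
y       = add3(xT_A_x, b_dot_x, c)

# Direct evaluation of the same expression.
y_direct = x.T @ A @ x + b.T @ x + c
assert np.allclose(y, y_direct)
print(y.item())
```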
