TensorFlow NN with a specific custom cost function -


I'm trying to make a neural network in TensorFlow that doesn't belong in the classification or regression categories; it's closer to reinforcement learning.

I've made a network with a few ReLU hidden layers that ends in a 3-element softmax output layer. The target vector for each sample contains the rewards (which can be negative for a penalty, or 0 for neutral) for making each choice (of the 3). The idea is to maximize the summed reward over all samples.

Given 1 sample, with the input mapped by the model to the output m=[a,b,c] and targets y=[d,e,f], the loss for that specific sample is m*y', or -tf.matmul(model, y, transpose_b=True). But when working with batches, which results in matrices instead of vectors, I'm at a loss (heh) for how to express the cost function in a way TensorFlow's optimizers can use. Using the example code above yields a meaningless batchsize²-sized matrix.

How can I do this?
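To see why the single-sample formula breaks on a batch, here is a small NumPy sketch of the shape arithmetic (NumPy stands in for the TensorFlow ops; the batch size 4 is an arbitrary example):

```python
import numpy as np

# Hypothetical mini-batch of 4 samples, 3 choices each.
n = 4
model = np.random.rand(n, 3)   # softmax outputs, shape [n, 3]
y = np.random.rand(n, 3)       # per-choice rewards, shape [n, 3]

# The single-sample formula applied to the whole batch:
# [n, 3] @ [3, n] -> [n, n], the batchsize^2 matrix from the question.
batch_product = model @ y.T
print(batch_product.shape)     # (4, 4)

# Only the diagonal is meaningful: each sample's dot product
# with its own target vector.
per_sample = np.diag(batch_product)
print(per_sample.shape)        # (4,)
```

Every off-diagonal entry pairs one sample's output with a different sample's rewards, which is why the full matrix is meaningless for the loss.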

Let's say the output of your model for a mini-batch of n examples is called output, with shape [n, 1, 3]. (Note: typically, the output of a softmax will have shape [n, 3]; you can use tf.reshape to reshape it to [n, 1, 3].) Call the rewards, or targets, target, also with shape [n, 1, 3]. You can then use the following operation to get a loss of shape [n, 1, 1]:

loss = tf.batch_matmul(output, tf.transpose(target, [0, 2, 1])) 

If you want the average loss over the minibatch, you can then do:

loss = tf.reduce_mean(loss) 

In that case loss will be a scalar value.
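Note that tf.batch_matmul comes from older TensorFlow releases; the same per-sample dot product can also be written without the [n, 1, 3] reshape as tf.reduce_sum(output * target, axis=1), with a leading minus sign if the optimizer should maximize reward, followed by tf.reduce_mean. A NumPy sketch of that arithmetic (again with an arbitrary batch size of 4):

```python
import numpy as np

n = 4
output = np.random.rand(n, 3)   # softmax outputs, shape [n, 3]
target = np.random.rand(n, 3)   # per-choice rewards, shape [n, 3]

# Per-sample dot product, no reshape needed; the equivalent of
# tf.reduce_sum(output * target, axis=1). The minus sign turns
# reward maximization into loss minimization.
per_sample_loss = -(output * target).sum(axis=1)   # shape [n]

# Mean over the batch, as with tf.reduce_mean(loss).
loss = per_sample_loss.mean()   # scalar
print(loss)
```

Elementwise multiply plus sum gives the same numbers as the batched matmul of the answer, just without the extra singleton dimensions.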

