Production RNN Regressor

After developing a Deep Neural Network model, I decided to kick it up a notch by trying an RNN. The RNN model is a regressor that predicts a user's next best items, whether for browsing or purchasing. Since it is a regressor, it does not output a single hard-coded item. Instead, it predicts the feature values of the next best item, which can then be compared against our inventory of items. This way a similarity score (1 – distance) can be computed and items ranked accordingly. The RNN model I designed is shown below; it was coded in Chainer.
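As a rough sketch of that ranking step (the names predicted_feas, inventory_feas and item_ids here are placeholders for illustration, not the production code), the predicted feature vector can be scored against every item in the inventory and the closest items returned:

[python]
import numpy as np

def rank_items(predicted_feas, inventory_feas, item_ids, top_k=10):
    # Euclidean distance between the predicted feature vector and each
    # inventory item's feature vector
    dists = np.linalg.norm(inventory_feas - predicted_feas, axis=1)
    dists = dists / (dists.max() + 1e-8)   # scale distances into [0, 1]
    sims = 1.0 - dists                     # similarity = 1 - distance
    order = np.argsort(-sims)              # highest similarity first
    return [(item_ids[i], sims[i]) for i in order[:top_k]]
[/python]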

Schema



Learning Curve

(learning curve plot for the RNN model)

Recurrent Unit

[python]
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L


class RNN(chainer.Chain):

    def __init__(self, itemEncoder, itemDecoder, n_feas, n_units, dropout_rate, use_gpu=False):
        super(RNN, self).__init__()
        with self.init_scope():
            self.itemEncoder = itemEncoder
            self.l1 = L.LSTM(n_units, n_units)
            self.l2 = L.LSTM(n_units, n_units)
            # self.l3 = L.GRU(n_units, n_units)
            self.itemDecoder = itemDecoder
        self.dropout_rate = dropout_rate
        self.use_gpu = use_gpu

        # Initialise every parameter uniformly in [-0.1, 0.1]
        for param in self.params():
            param.data[...] = np.random.uniform(-0.1, 0.1, param.data.shape)

    def reset_state(self):
        # Clear the LSTM hidden/cell states before feeding a new sequence
        self.l1.reset_state()
        self.l2.reset_state()
        # self.l3.reset_state()

    def __call__(self, xs):
        # Reorder the batch from (batch, time, features) to (time, batch, features)
        if self.use_gpu:
            xs = F.transpose(xs, (1, 0, 2))
        else:
            xs = np.transpose(xs, (1, 0, 2))

        # Run each time step through encoder -> LSTM x2 -> decoder, with dropout
        # between layers, and return the prediction for the final step
        for x in xs:
            h0 = self.itemEncoder(x)
            h1 = self.l1(F.dropout(h0, self.dropout_rate))
            h2 = self.l2(F.dropout(h1, self.dropout_rate))
            y = self.itemDecoder(F.dropout(h2, self.dropout_rate))
        return y
[/python]
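To show how this unit might be trained, here is a minimal forward/backward sketch. It assumes the encoder and decoder are plain L.Linear layers, that a batch has shape (batch, seq_len, n_feas), and that the target is the feature vector of the user's actual next item; the sizes and the Adam optimizer are arbitrary choices for illustration, not the production values.

[python]
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L

n_feas, n_units = 32, 128
model = RNN(itemEncoder=L.Linear(n_feas, n_units),
            itemDecoder=L.Linear(n_units, n_feas),
            n_feas=n_feas, n_units=n_units, dropout_rate=0.2)
optimizer = chainer.optimizers.Adam()
optimizer.setup(model)

xs = np.random.rand(16, 5, n_feas).astype(np.float32)  # 16 users, 5 past items each
ts = np.random.rand(16, n_feas).astype(np.float32)     # features of each user's next item

model.reset_state()                        # clear LSTM states before the new sequences
loss = F.mean_squared_error(model(xs), ts) # regression loss on the predicted features
model.cleargrads()
loss.backward()
optimizer.update()
[/python]

reset_state() is called before each batch so that the LSTM states from the previous sequences do not leak into the next ones.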
Will continue with the production infra in a later post…
