The following snippet shows the usage of RMSProp with Keras:
from keras.optimizers import RMSprop
...
rms_prop = RMSprop(lr=0.0001, rho=0.8, epsilon=1e-6, decay=1e-2)
model.compile(optimizer=rms_prop,
              loss='categorical_crossentropy',
              metrics=['accuracy'])
The learning rate and decay parameters have the same meaning as in SGD. The parameter rho corresponds to the exponential moving average weight, μ, while epsilon is a small constant added to the adaptive term (the root of the moving average of squared gradients) to avoid division by zero and improve numerical stability. As with any other optimizer, if the user wants to use the default values, it's possible to specify the optimizer by name without instantiating the class (for example, optimizer='rmsprop').
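To make the roles of rho and epsilon concrete, the following minimal NumPy sketch implements a single RMSProp update step (the function name and the toy one-dimensional objective are illustrative, not part of Keras):

```python
import numpy as np

def rmsprop_step(theta, grad, avg_sq, lr=0.0001, rho=0.8, epsilon=1e-6):
    """One RMSProp update: rho decays the moving average of squared
    gradients, and epsilon stabilizes the division by its root."""
    avg_sq = rho * avg_sq + (1.0 - rho) * grad ** 2
    theta = theta - lr * grad / (np.sqrt(avg_sq) + epsilon)
    return theta, avg_sq

# Toy example (assumed for illustration): minimize f(theta) = theta^2
theta, avg_sq = np.array([1.0]), np.zeros(1)
for _ in range(100):
    grad = 2.0 * theta          # analytical gradient of theta^2
    theta, avg_sq = rmsprop_step(theta, grad, avg_sq, lr=0.01)
```

Because each step is divided by the root of the running average of squared gradients, the effective step size stays close to lr regardless of the raw gradient magnitude, which is exactly what rho and epsilon control in the Keras optimizer above.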