Memcached is a very fast in-memory object caching system. Using Memcached can significantly speed up a Rails application at minimal cost.
Prerequisites
It is assumed that Ruby on Rails and Memcached are already installed on your system, and that you have a running Rails application that you plan to optimize with Memcached.
Installing the Dalli Gem
The first thing we need to do is install the Dalli gem by Mike Perham:

gem install dalli

If you are using Bundler, add the line gem 'dalli' to your Gemfile and run bundle install.
Next, we will walk through the short process of setting up the interaction between Rails and Memcached.
Configuring Rails
First of all, to configure the interaction between Rails and Memcached, edit the file config/environments/production.rb. Add the following line, which tells Rails to use Dalli for caching:
config.cache_store = :dalli_store
In the same file, add another line that allows ActionController to perform caching:
config.action_controller.perform_caching = true
Restart your Rails application for these changes to take effect.
Tuning Rails Applications
We need to configure the Rails application to work with Memcached. There are two main approaches, and we will cover both below.
Adding Cache-Control Headers for Cache Management
The easiest way is to add Cache-Control headers to one of your actions. This allows Rack::Cache to save the result of the action in Memcached. Suppose the action in app/controllers/slow_controller.rb contains the following:

def slow_action
  sleep 15
end
You can add an expires_in call to allow Rack::Cache to store the result for five minutes:

def slow_action
  expires_in 5.minutes
  sleep 15
end
Now, when you request this action again, you should notice that it responds much faster. Rails executes the action code only once every five minutes to refresh Rack::Cache.
Note that these Cache-Control headers are marked public. If you have actions that should be visible only to specific users, use expires_in 5.minutes, public: false. You must also determine the optimal caching time for your application; it differs from application to application and must be found empirically.
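To make this concrete, here is a plain-Ruby sketch (not actual Rails internals) of the Cache-Control header that expires_in 5.minutes, public: false corresponds to:

```ruby
# expires_in 5.minutes, public: false makes Rails emit a Cache-Control
# response header equivalent to the string built here.
max_age = 5 * 60 # 5.minutes in seconds

# public: false marks the response "private", so shared caches such as
# Rack::Cache will not serve it to other users.
cache_control = "private, max-age=#{max_age}"
# => "private, max-age=300"
```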
Read more about HTTP caching in Mark Nottingham's Caching Tutorial for Web Authors and Webmasters.
Storing Objects in Memcached
If your actions perform very resource-intensive operations, or you work with objects that are expensive to create repeatedly, it makes sense to store the intermediate result in Memcached. Suppose your action contains the following code:

def slow_action
  slow_object = create_slow_object
end
You can save the result in Memcached as follows:

def slow_action
  slow_object = Rails.cache.fetch(:slow_object) do
    create_slow_object
  end
end
In this case, Rails queries Memcached with the :slow_object key. If the object is already cached, it is returned; otherwise the block is executed and its result is stored in Memcached under that key.
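The fetch-or-compute pattern behind Rails.cache.fetch can be sketched in plain Ruby (a simplified illustration, not the real Rails implementation):

```ruby
# Minimal sketch of the fetch-or-compute pattern provided by
# Rails.cache.fetch, backed here by a plain Hash instead of Memcached.
class TinyCache
  def initialize
    @store = {}
  end

  # Return the cached value for +key+ if present; otherwise run the
  # block, store its result under +key+, and return it.
  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end
end

cache = TinyCache.new
calls = 0
result = nil
2.times { result = cache.fetch(:slow_object) { calls += 1; "expensive result" } }
# The expensive block ran only once; the second call was served from the cache.
```

The real Rails.cache.fetch additionally accepts options such as expires_in: to bound how long the value stays cached.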
Fragment caching
Fragment caching is a Rails feature that lets you cache the most dynamic (frequently changing) parts of your application. You can cache any part of your views by enclosing it in a cache block.
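The original view example did not survive formatting; as a sketch, a nested cache block for the Employee and Task models discussed below might look like this (the exact template structure is an assumption):

```erb
<% cache @employee do %>
  <%= render @employee %>
  <% @employee.tasks.each do |task| %>
    <% cache task do %>
      <%= render task %>
    <% end %>
  <% end %>
<% end %>
```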
This caching technique is known as "Russian doll" caching. Rails stores all of these fragments in Memcached. Since we pass the model as a parameter to the cache call, the cache keys for these fragments change only when the model changes. This becomes especially relevant when we update the task model:
@task.completed!
@task.save!
So we have a nested object cache, and Rails knows nothing about the lifetime of the fragment caches attached to the models. This is where ActiveRecord's touch option helps us:
class Employee < ActiveRecord::Base
  belongs_to :manager, touch: true
end

class Task < ActiveRecord::Base
  belongs_to :employee, touch: true
end
Now, when a Task model is updated, the cache for its fragments is invalidated, and the Employee model is notified that it must also refresh its fragments. The Employee model in turn tells the manager model to refresh its cache. After that, the cache update can be considered safely completed.
There is another problem with "Russian doll" caching: when you deploy a new version of the application, Rails does not know that it should refresh views rendered by templates you have changed. Suppose you update the markup of the task partial template.
Rails cannot update the fragment cache when a partial template changes. Previously, to get around this problem, you had to add a version number to the cache method call. Now this problem is solved by the cache_digests gem, which automatically adds an MD5 hash of the template to the cache key. After you update the partial template and restart the application, the cache key changes too, and Rails is forced to render the view again. The gem also tracks dependencies between template files, so that if, say, the partial template _incomplete_tasks.html.erb changes, all templates that depend on it are updated up the chain.
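The idea of digest-based keys can be sketched in a few lines of plain Ruby (a simplification — cache_digests' real key format and dependency tracking are more involved):

```ruby
require "digest/md5"

# When the template source changes, its MD5 digest changes, so the key
# points at a fresh cache slot and the stale entry is simply never read.
def cache_key_for(name, template_source)
  "views/#{name}/#{Digest::MD5.hexdigest(template_source)}"
end

old_key = cache_key_for("incomplete_tasks", "<%= task.title %>")
new_key = cache_key_for("incomplete_tasks", "<%= task.title %> (updated)")
# old_key != new_key, so the stale fragment is invisible after the deploy
```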
This feature is built into Rails 4.0. If you are using an earlier version, install the gem with:
gem install cache_digests
If you are using Bundler, add the following line to your Gemfile:
gem 'cache_digests'
Advanced Rails and Memcached Configuration
The Dalli gem is a very powerful solution and can distribute keys across a cluster of Memcached servers, spreading the load and increasing Memcached's capacity. If you have multiple servers, you can install Memcached on each of them and then add the appropriate configuration to config/environments/production.rb:
config.cache_store = :dalli_store, 'web1.example.com', 'web2.example.com', 'web3.example.com'
This setting uses consistent hashing to distribute keys among the available Memcached servers.
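To illustrate what consistent hashing buys you, here is a toy hash ring in plain Ruby (Dalli's actual implementation differs; the server names are taken from the example above):

```ruby
require "zlib"

# Toy consistent-hash ring: each server is placed on the ring at many
# points; a key is assigned to the first server point at or after the
# key's own hash, wrapping around the ring if necessary.
class HashRing
  def initialize(servers, replicas: 100)
    @ring = {}
    servers.each do |server|
      replicas.times { |i| @ring[Zlib.crc32("#{server}:#{i}")] = server }
    end
    @points = @ring.keys.sort
  end

  def server_for(key)
    h = Zlib.crc32(key)
    point = @points.find { |p| p >= h } || @points.first
    @ring[point]
  end
end

servers = %w[web1.example.com web2.example.com web3.example.com]
ring = HashRing.new(servers)
ring.server_for("slow_object") # the same key always maps to the same server
```

Because only the keys nearest a server's points move when a server is added or removed, most cached entries stay valid — which is why consistent hashing is preferred over naive modulo hashing.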