
Cacheops

Some time ago I wrote about the caching system. I promised a follow-up, but I've decided that a line of code is worth a hundred comments, so the theory can wait. Today, then, is a kind of announcement with a few usage tips rolled into one. Meet cacheops, a caching and automatic cache invalidation system for the Django ORM.

A brief recap for those who did not read, or have simply forgotten, the previous article. The system caches the results of database queries made through the ORM and invalidates them when a model object changes, automatically determining for each event which query results may have become stale. Since automatic invalidation requires storing additional structured information alongside the cache contents themselves, Redis was chosen as the backend. And since this is a practical announcement, enough talk, let's get down to business.
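The invalidation scheme described above can be sketched in plain Python. This is an illustrative toy simulation, not cacheops' actual implementation: each cached result is registered under the simple conditions its query depends on, and a model event erases every entry registered under a matching condition.

```python
from collections import defaultdict

class InvalidatingCache:
    """Toy model of event-based invalidation (not the real cacheops code)."""

    def __init__(self):
        self.data = {}                 # cache_key -> cached result
        self.deps = defaultdict(set)   # (model, field, value) -> {cache_key, ...}

    def put(self, key, result, conditions):
        # conditions: iterable of (model, field, value) the query depends on
        self.data[key] = result
        for cond in conditions:
            self.deps[cond].add(key)

    def get(self, key):
        return self.data.get(key)

    def invalidate(self, model, obj):
        # A save/delete event: erase every entry whose registered
        # conditions match an attribute of the changed object.
        for field, value in obj.items():
            for key in self.deps.pop((model, field, value), set()):
                self.data.pop(key, None)

cache = InvalidatingCache()
cache.put('articles-tag-2', ['a1', 'a2'], [('article', 'tag', 2)])

# Saving an article with tag=3 leaves the entry alone...
cache.invalidate('article', {'id': 7, 'tag': 3})
assert cache.get('articles-tag-2') == ['a1', 'a2']

# ...but saving one with tag=2 erases it.
cache.invalidate('article', {'id': 8, 'tag': 2})
assert cache.get('articles-tag-2') is None
```

The point of the extra structures in Redis is exactly this dependency index: it lets a single event find all potentially stale entries without scanning the whole cache.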

Suppose you already have Redis and Django installed, and you have something to cache (models and queries using them). Install cacheops:

pip install django-cacheops 

or, if you want to dig into the code:

 git clone git://github.com/Suor/django-cacheops.git
 ln -s `pwd`/django-cacheops/cacheops/ /somewhere/on/your/python/import/path/

Next we need to configure it: add cacheops to the list of installed applications. Cacheops must be initialized before Django models are loaded, so put it first:

 INSTALLED_APPS = (
     'cacheops',
     ...
 )

You must also configure the Redis connection and the caching profiles:

 CACHEOPS_REDIS = {
     'host': 'localhost',   # redis host
     'port': 6379,          # default redis port
     #'db': 1,              # select a non-default redis database
     'socket_timeout': 3,
 }

 CACHEOPS = {
     # Cache User.objects.get() calls for 15 minutes.
     # This also caches request.user and post.author,
     # where Post.author is a foreign key to auth.User.
     'auth.user': ('get', 60*15),

     # Cache all queries to the other django.contrib.auth models for an hour.
     'auth.*': ('all', 60*60),

     # Enable manual caching for the news app models.
     # Explanation below.
     'news.*': ('just_enable', 60*60),

     # Cache .count() calls on all other models for 15 minutes.
     '*.*': ('count', 60*15),
 }

A cache timeout appears here and throughout; it is important to understand that this is only an upper bound on the lifetime. In practice, any particular entry can be erased much earlier by an invalidation event.

The setup is done and you can proceed. In fact, you can already use it as is: all queries of the configured types will be cached and automatically invalidated when the corresponding models are changed, added, or deleted. However, finer-grained use is possible, at the level of individual querysets.

As a minimal configuration enabling manual caching, you can use:

 CACHEOPS = {
     '*.*': ('just_enable', <your timeout>),
 }

This will allow us to write something like:

 articles = Article.objects.filter(tag=2).cache() 

and get a database query which, on the one hand, is cached, and on the other, has its cache erased whenever an article with tag 2 is changed, added, or deleted.
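One way to picture where such a cache entry's key comes from is hashing the SQL the queryset would execute, so that the same query always hits the same entry. This is an assumption for illustration; cacheops' actual key derivation may differ in detail.

```python
import hashlib

def queryset_cache_key(sql: str, params: tuple) -> str:
    """Derive a stable cache key from a query's SQL and its parameters.
    Illustrative sketch only -- the real cacheops key scheme may differ."""
    raw = sql + '|' + repr(params)
    return 'cacheops:q:' + hashlib.md5(raw.encode()).hexdigest()

key_a = queryset_cache_key('SELECT * FROM articles WHERE tag = %s', (2,))
key_b = queryset_cache_key('SELECT * FROM articles WHERE tag = %s', (3,))
assert key_a != key_b   # different params -> different cache entries
assert key_a == queryset_cache_key('SELECT * FROM articles WHERE tag = %s', (2,))
```

A stable, deterministic key is what makes the event-driven erasure work: the invalidation machinery only needs to know which keys depend on which conditions, never the cached payloads themselves.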

The .cache() method accepts the set of operations to cache and a timeout:

 qs = Article.objects.filter(tag=2).cache(ops=['count'], timeout=60*5)
 paginator = Paginator(qs, ipp)             # count is cached for 5 minutes
 articles = list(paginator.page(page_num))  # fetching the objects themselves is not cached

The set of operations can be any subset of ['get', 'fetch', 'count'], including the empty set, which disables caching for the current queryset. For the latter case, though, there is a shortcut:

 qs = Article.objects.filter(visible=True).nocache() 

Here, any access to the contents of qs will go to the database.

In addition to querysets, cacheops can work with functions, invalidating them as if they were a queryset:

 from django.db.models import Count
 from cacheops import cacheoped_as

 @cacheoped_as(Article.objects.all())
 def article_stats():
     return {
         'tags': list(
             Article.objects.values('tag').annotate(count=Count('id')).nocache()
         ),
         'categories': list(
             Article.objects.values('category').annotate(count=Count('id')).nocache()
         ),
     }

Note the wrapping of the querysets in list(): we don't want to put a query object into the cache that would then be executed on every access. We also use .nocache() to avoid doing extra work and cluttering the cache with intermediate results.
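The idea behind such a decorator can be sketched with plain Python. This is a toy simulation using a dict cache and a hypothetical invalidate() hook, not the library's code: the wrapped function's result is stored once and dropped by the same kind of event that would invalidate the given queryset.

```python
import functools

_cache = {}

def cacheoped_as_sketch(tag_key):
    """Toy version of a cacheoped_as-style decorator (illustration only):
    cache the function's result under tag_key until an event clears it."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if tag_key not in _cache:
                _cache[tag_key] = func(*args, **kwargs)   # compute once
            return _cache[tag_key]
        return wrapper
    return decorator

def invalidate(tag_key):
    # Hypothetical hook: called when a matching model event occurs.
    _cache.pop(tag_key, None)

calls = []

@cacheoped_as_sketch('article.all')
def article_stats():
    calls.append(1)   # count real computations
    return {'tags': 5, 'categories': 2}

article_stats(); article_stats()
assert len(calls) == 1   # second call was served from the cache
invalidate('article.all')
article_stats()
assert len(calls) == 2   # recomputed after the invalidation event
```

In the real system the role of tag_key is played by the queryset passed to the decorator: any change to the models it covers erases the stored result.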

I think that's enough to give everyone a taste of it, so I'll stop before it gets boring.

P.S. For those who want the intimate details, there is a branch with Russian comments on GitHub.

Source: https://habr.com/ru/post/129122/
