19 changes: 10 additions & 9 deletions README.md
@@ -1,24 +1,24 @@
# Counter::Cache

[![Gem Version](https://badge.fury.io/rb/counter-cache.png)](http://badge.fury.io/rb/counter-cache)
[![Build Status](https://travis-ci.org/wanelo/counter-cache.svg?branch=master&fl)](https://travis-ci.org/wanelo/counter-cache)
[![Code Climate](https://codeclimate.com/github/wanelo/counter-cache.png)](https://codeclimate.com/github/wanelo/counter-cache)

Counting things is hard, and counting them at scale is even harder, so control when things are counted.

Any time your application performs pagination, the underlying library probably issues a `select count(*) from ...`
request to the database, because all paginators need to know how many pages there are. This works on a small-to-medium
dataset, and in an application with relatively low web traffic. But at high traffic volume, live counts saturate CPU on the
database server. This is because sorting typically happens on the CPU of the database server, using a small amount of heap
RAM or, even worse, using temp files, which grinds the disk IO to a halt. Web requests become slower and
slower, start to pile up in various queues, and eventually saturate all of the app servers.
There you are, the site is down.
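
For illustration only (the model and column names here are hypothetical, and Kaminari is assumed as the paginator), this is the kind of query that fires on every page view:

```ruby
# Hypothetical example, assuming the Kaminari paginator.
# Rendering any page of results also needs the total, so the paginator
# issues a live `SELECT COUNT(*)` against the full filtered set.
posts = Post.where(user_id: current_user.id).page(params[:page])
posts.total_count # => SELECT COUNT(*) FROM posts WHERE user_id = ... on every request
```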

This gem provides a solution that works at scale, and will help you keep your site up.

This library is battle-tested at Wanelo, where it has been running for several years.

### Overview

[Rails Counter Caches](http://railscasts.com/episodes/23-counter-cache-column) are a convenient way to keep counters on
models that have many children. Without them, you always do live counts, which do not scale. But at high scale, Rails
@@ -237,7 +237,8 @@ In an initializer such as `config/initializers/counter_cache.rb`, write the conf
Counter::Cache.configure do |c|
  c.default_worker_adapter = MyCustomWorkAdapter
  c.recalculation_delay = 6.hours # Default delay for recalculations
  c.redis_url = 'redis://localhost:6379/1' # Default Redis URL
  c.redis_pool = Redis.new # Default Redis pool
  c.counting_data_store = MyCustomDataStore # Default is Counter::Cache::Redis
end
```
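
Because `with_redis` (see the `lib/counter/cache/redis.rb` change below) only calls `#with` when the configured object responds to it, `redis_pool` can also be a real connection pool rather than a bare client. A minimal sketch, assuming the `connection_pool` gem:

```ruby
require 'connection_pool'
require 'redis'

Counter::Cache.configure do |c|
  c.default_worker_adapter = MyCustomWorkAdapter
  # Pooled client: the gem checks out a connection via `pool.with { |redis| ... }`.
  c.redis_pool = ConnectionPool.new(size: 5, timeout: 3) do
    Redis.new(url: 'redis://localhost:6379/1')
  end
end
```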
2 changes: 1 addition & 1 deletion lib/counter/cache/config.rb
@@ -2,7 +2,7 @@ module Counter
  module Cache
    class Config
      # TODO:: Confer with paul/kig about adapting the counting data store
      attr_accessor :default_worker_adapter, :recalculation_delay, :redis_pool, :redis_url, :counting_data_store

      def initialize
        self.counting_data_store = Counter::Cache::Redis.new
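
With `:redis_url` added to the accessor list, a value set in the initializer becomes readable from the global configuration object. A small sketch (the URL is only illustrative):

```ruby
Counter::Cache.configure { |c| c.redis_url = 'redis://cache-host:6379/1' }
Counter::Cache.configuration.redis_url # => "redis://cache-host:6379/1"
```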
8 changes: 7 additions & 1 deletion lib/counter/cache/redis.rb
@@ -28,7 +28,13 @@ def del(key)
      private

      def with_redis
        redis_pool = if Counter::Cache.configuration.redis_pool.present?
                       Counter::Cache.configuration.redis_pool
                     elsif Counter::Cache.configuration.redis_url.present?
                       Redis.new(url: Counter::Cache.configuration.redis_url)
                     else
                       Redis.new # Redis running on the same machine as the app
                     end
        return yield redis_pool unless redis_pool.respond_to?(:with)

        redis_pool.with do |redis|
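
The resolution order in `with_redis` is: an explicitly configured `redis_pool` wins, then `redis_url`, and finally a plain `Redis.new` pointing at localhost. A usage sketch of the middle case (the host name is illustrative):

```ruby
Counter::Cache.configure do |c|
  c.redis_url = 'redis://redis.internal:6379/2' # no redis_pool configured
end

# Inside the gem, with_redis now builds the client from the URL:
#   Redis.new(url: Counter::Cache.configuration.redis_url)
# and yields it to the counting code, since a bare Redis client
# does not respond to #with.
```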