Using Events — A Polymorphic DB Table
In another approach, we could have created a polymorphic `Events` table to record every interesting event that has happened, associated only with its producer, and not with the event consumers (i.e. the followers). This design requires only one row in the `Events` table per event, which scales well. Storing one row per follower per event, on the other hand, would get extremely expensive fast: imagine a user with 1M followers. A single event would mean 1M inserted rows. Not good!
```ruby
class CreateEvents < ActiveRecord::Migration
  def up
    create_table :events do |t|
      t.bigint   :producer_id
      t.string   :producer_type
      t.string   :action
      t.bigint   :target_id
      t.string   :target_type
      t.string   :body
      t.datetime :created_at
    end

    # Composite indexes take an array of column names:
    add_index :events, [:producer_id, :producer_type]
    add_index :events, [:target_id, :target_type]

    # PostgreSQL-specific: widen the primary key to bigint
    # and default created_at to the insertion time.
    execute 'ALTER TABLE events ALTER COLUMN id SET DATA TYPE bigint'
    execute 'ALTER TABLE events ALTER COLUMN created_at SET DEFAULT now()'
  end

  def down
    drop_table :events
  end
end
```
The above design supports various kinds of event producers, not just a User. For example, a marketplace application may have a Store produce events that are distributed to the store's followers, or a Tag that became popular. This gives you great flexibility in the types of event producers you can support.
On the other hand, if your application is such that a User is the only entity that can produce events, then it might be simpler to replace `producer_id` with `user_id`, and remove `producer_type`.
The `target` of an event is an object such as a comment, another user, a store, etc. How exactly you render the text of events is up to you. The above schema can be used to create a consistent event structure that can be rendered automatically by all of your view layers, such as a mobile app or a web site, simply based on some agreed-upon rules.
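As a sketch of what such agreed-upon rules might look like, the renderer below turns an event's producer, action, and target into a human-readable sentence. The struct fields mirror the table columns; the action templates and names are hypothetical examples, in plain Ruby for illustration:

```ruby
# A minimal, framework-free sketch of rule-based event rendering.
# Field names mirror the events table; the templates are hypothetical
# rules your app might agree upon across all view layers.
Event = Struct.new(:producer_type, :producer_name, :action,
                   :target_type, :target_name, keyword_init: true)

# One template per action, shared by web and mobile renderers alike.
TEMPLATES = {
  'followed'  => '%{producer} followed %{target}',
  'commented' => '%{producer} commented on %{target}',
  'liked'     => '%{producer} liked %{target}'
}.freeze

def render_event(event)
  format(TEMPLATES.fetch(event.action),
         producer: "#{event.producer_type} #{event.producer_name}",
         target:   "#{event.target_type} #{event.target_name}")
end

event = Event.new(producer_type: 'User', producer_name: 'kig',
                  action: 'followed',
                  target_type: 'Store', target_name: 'Acme')
puts render_event(event)
# => "User kig followed Store Acme"
```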
Of course, depending on your application, you may have to add more columns to this table.
Also notice that this table does not have `updated_at`. This is because events should ideally live in an append-only table: once an event has occurred, the only operations available on it are SELECT and DELETE.
This approach provides a nice balance:
- You get a version of true event persistence by persisting the actual event stream with its producers only.
- At the same time, you are keeping a pre-computed stream for each user in SimpleFeed (with its Redis backend). This is fast, cheap, memory-only storage.
- If you lose SimpleFeed's data in Redis, you can always rebuild it by replaying the event stream: for each recent event, look up the producer's followers and call `#store` to re-publish the event to each of them.
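The replay loop above can be sketched in plain Ruby. Here the `FOLLOWERS` lookup and the in-memory `feeds` hash are stand-ins for your real follower query and for SimpleFeed's Redis-backed `#store`; in a real app the inner append would be a call into SimpleFeed's API:

```ruby
# Rebuild per-follower feeds by replaying the authoritative event log.
# Events are plain hashes, and feeds is an in-memory Hash standing in
# for SimpleFeed's Redis backend, purely to illustrate the mechanism.
FOLLOWERS = { 1 => [10, 11], 2 => [11] }.freeze # producer_id => follower ids

def followers_of(producer_id)
  FOLLOWERS.fetch(producer_id, [])
end

def replay(events)
  feeds = Hash.new { |h, k| h[k] = [] }
  events.each do |event|
    followers_of(event[:producer_id]).each do |follower_id|
      # In a real app: SimpleFeed's #store with the event's DB id as the value.
      feeds[follower_id] << event[:id]
    end
  end
  feeds
end

events = [
  { id: 100, producer_id: 1 },
  { id: 101, producer_id: 2 },
  { id: 102, producer_id: 1 }
]
feeds = replay(events)
# feeds[10] == [100, 102]; feeds[11] == [100, 101, 102]
```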
In other words, you get an authoritative list of events in your system recorded in a reliable transactional database, while keeping the pre-computed cache of individual activity streams in Redis.
In this scheme, what you have to store in the `Value` is the database ID of each event. You can even store it as a base64-encoded number, for additional compactness. See the Serializing Events section for more details.
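One way to base64-encode an integer ID is to pack it into bytes first. This is only a sketch of the idea; the exact serialization (and the helper names below) is up to you:

```ruby
require 'base64'

# Encode a positive integer ID compactly: pack it into 8 big-endian
# bytes, drop the leading zero bytes, then Base64-encode URL-safely
# without padding.
def encode_id(id)
  bytes = [id].pack('Q>').sub(/\A\0+/, '')
  Base64.urlsafe_encode64(bytes, padding: false)
end

# Reverse the encoding: decode the bytes, then fold them back into
# an integer, most significant byte first.
def decode_id(str)
  Base64.urlsafe_decode64(str).each_byte.inject(0) { |n, b| (n << 8) | b }
end

puts encode_id(1_000_000)            # 4 characters instead of 7 digits
puts decode_id(encode_id(1_000_000)) # round-trips back to 1000000
```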
Then, to render a given user's feed, you'd first call `#paginate` to get a page worth of event IDs, and then fetch the actual data from the database using a primary-key lookup against the array of IDs. This scales very well, and is easily cached, because all lookups are by primary key. Caching libraries like `cache-object` might offer a drop-in write-through caching solution.
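One detail worth noting: a `WHERE id IN (...)` lookup does not guarantee that rows come back in the page's order, so you'd typically re-order the fetched rows by the paginated ID list. A plain-Ruby sketch, where `rows` stands in for the result of something like `Event.where(id: page_of_ids)`:

```ruby
# Re-order rows fetched by primary key to match the paginated ID order.
# The database may return the rows in any order, so we index them by id
# and walk the page's ID list to restore feed order.
def in_page_order(rows, page_of_ids)
  by_id = rows.each_with_object({}) { |row, h| h[row[:id]] = row }
  page_of_ids.map { |id| by_id[id] }.compact
end

page_of_ids = [102, 100, 101] # newest-first page of IDs from the feed
rows = [                      # arbitrary return order from the database
  { id: 100, body: 'a' },
  { id: 101, body: 'b' },
  { id: 102, body: 'c' }
]
ordered = in_page_order(rows, page_of_ids)
# ordered.map { |r| r[:id] } == [102, 100, 101]
```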
SimpleFeed — an easy-to-integrate, pure-Ruby, Redis-backed implementation of the Social Activity Stream feature.
© 2016-2017 Konstantin Gredeskoul, all rights reserved. Distributed under the MIT license.