author    unarist <m.unarist@gmail.com>	2017-10-27 23:10:22 +0900
committer Eugen Rochko <eugen@zeonfederated.com>	2017-10-27 16:10:22 +0200
commit    0129f5eada3d146c4e3550c7c82b94520a18d2ba (patch)
tree      a567a56083c98ceae570dd0942cc24ce8b72fe40 /app/serializers
parent    22da775a8546a0496b4f501f7cb6e321b0168f40 (diff)
Optimize FixReblogsInFeeds migration (#5538)
We have changed how we store reblogs in Redis for bigint IDs. This migration works by 1) scanning all entries in each user's feed, and 2) re-storing reblogs using 3 write commands.

However, this operation is really slow for large instances, e.g. about 1 hour on friends.nico (with 50k users), so I have tried the tweaks below.

* It checked for non-reblogs with `entry[0] == entry[1]`, but this condition never holds because `entry[0]` is a String while `entry[1]` is a Float. Changing it to `entry[0].to_i == entry[1]` seems to work (see the first sketch after this list).
  -> about 4-20x faster (feeds with fewer reblogs will be faster)
* Write operations can be batched with a Redis pipeline (see the second sketch).
  -> about 6x faster
* Wrap the whole operation in a Lua script and execute it with the EVALSHA command. This really reduces round trips between Ruby and Redis (see the third sketch).
  -> about 3x faster
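
First, a minimal sketch of the type mismatch, assuming a redis-rb client and a hypothetical feed key (`feed:home:1`); ZRANGE with scores returns the member as a String and the score as a Float, so a direct comparison is always false:

```ruby
require 'redis'

redis = Redis.new

# ZRANGE ... WITHSCORES returns [member, score] pairs:
# the member is a String ("105") and the score is a Float (105.0).
redis.zrange('feed:home:1', 0, -1, with_scores: true).each do |entry|
  entry[0] == entry[1]        # always false: String compared to Float
  entry[0].to_i == entry[1]   # true for non-reblog entries whose member equals the score
end
```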
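
Second, a sketch of batching the per-entry writes with a pipeline, using the block-argument form of redis-rb's `pipelined`; the key names and the specific write commands are illustrative only, not the migration's actual ones:

```ruby
require 'redis'

redis = Redis.new

# Collect the reblog entries that need rewriting (member != score).
entries_to_fix = redis.zrange('feed:home:1', 0, -1, with_scores: true)
                      .reject { |member, score| member.to_i == score }

# Queue all write commands client-side and flush them in a single round trip.
redis.pipelined do |pipeline|
  entries_to_fix.each do |member, score|
    pipeline.zrem('feed:home:1', member)                 # illustrative commands;
    pipeline.zadd('feed:home:1', score, member)          # the real migration's
    pipeline.zadd('feed:home:1:reblogs', score, member)  # writes may differ
  end
end
```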
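
Third, a sketch of loading a Lua script once and invoking it per feed with EVALSHA, so the whole rewrite runs server-side inside Redis; the script body here is a placeholder, not the migration's actual script:

```ruby
require 'redis'

redis = Redis.new

# Placeholder script: the real migration's Lua body rewrites reblog entries in place.
SCRIPT = <<~LUA
  local entries = redis.call('ZRANGE', KEYS[1], 0, -1, 'WITHSCORES')
  -- ... rewrite reblog entries here ...
  return #entries
LUA

sha = redis.script(:load, SCRIPT)           # load once, get the script's SHA1
redis.evalsha(sha, keys: ['feed:home:1'])   # run it server-side with a single command
```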

I've taken the Lua script approach, though the other optimizations alone may be enough.
Diffstat (limited to 'app/serializers')
0 files changed, 0 insertions, 0 deletions