
Most recent articles

Rails: Use JSON to serialize model attributes

Posted on Apr 1, 2015 in Ops, Ruby On Rails

Scaling a Rails application's response time is often done by moving parts of it to another programming language. As we might know, Twitter started out as a Rails application and ended up with a Scala, and later a JavaScript, backend.

In my opinion, a first step towards this kind of migration is normalising all the serialised data you keep in your database.

Personally, I use the ActiveRecord::Base#serialize method to handle most of the custom data produced by an STI model, or to store any dynamic extra data. A good example is when you need to keep additional information about a user, such as company details when the user is a company.

Quite a few Ruby on Rails projects use the plain serialize method, as you can see in Ryan Bates' PayPal Notifications tutorial, or in the Diaspora and Spree codebases.

One easy trick that works in any Rails 3.x or 4.x application is to declare your serialized attribute like this:

class User < ActiveRecord::Base 
  serialize :other_data, JSON
end
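
A quick sketch of what this gives you (the attribute values below are just an example):

user = User.new
user.other_data = { company_name: "ACME SRL", vat_number: "RO123456" }
user.save!

# The column now stores a JSON string, e.g.
#   {"company_name":"ACME SRL","vat_number":"RO123456"}
# Note that the JSON coder round-trips symbol keys as strings:
user.reload.other_data # => {"company_name"=>"ACME SRL", "vat_number"=>"RO123456"}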

This way your application will use the JSON column coder which, in my opinion, is a better alternative and fixes a few problems for you:

  • It allows you to use the same database backend from multiple applications, written in multiple languages (if that is the case)
  • It avoids the Syck vs Psych serialization problems (Psych is the default YAML engine starting with Ruby 1.9.3)
  • JSON parsing is much faster than YAML (see the benchmark sketch below)

Some of the problems caused by YAML are described in Arne Brasseur’s post.
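
If you want to check the speed difference on your own data, here is a rough benchmark sketch (the sample hash is just an illustration):

require "benchmark"
require "json"
require "yaml"

data = { "company" => { "name" => "ACME SRL", "vat" => "RO123456" }, "tags" => %w(a b c) }

n = 10_000
Benchmark.bm(5) do |x|
  x.report("yaml") { n.times { YAML.load(YAML.dump(data)) } }
  x.report("json") { n.times { JSON.parse(JSON.generate(data)) } }
end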

Updating an existing application to use JSON serialized fields

In order to make your existing application use JSON serialized fields, you need to make a few changes to your models, mainly converting:

class User < ActiveRecord::Base 
  serialize :other_data
end

To

class User < ActiveRecord::Base 
  serialize :other_data, JSON
end

Notice the JSON argument passed to the serialize method.

The other change you need to make is converting your existing data from YAML serialized strings to JSON serialized strings. To do so, add a migration (or a code snippet somewhere in your application) that performs the conversion:

class ChangeSerializationOnUser < ActiveRecord::Migration
  # Reads the column through the default (YAML) coder.
  class YamlUser < ActiveRecord::Base
    self.table_name = "users"
    serialize :other_data
  end

  # Writes the column through the JSON coder, bypassing the real model's
  # validations and callbacks.
  class JsonUser < ActiveRecord::Base
    self.table_name = "users"
    serialize :other_data, JSON
  end

  def up
    # Clean up records holding only an empty YAML document
    # (adjust the literal to whatever your empty YAML documents look like).
    YamlUser.where(other_data: "--- \n").update_all(other_data: nil)

    YamlUser.find_each do |yaml_user|
      next unless yaml_user.other_data.present?
      next unless yaml_user.other_data.respond_to?(:to_hash)
      hash = yaml_user.other_data.to_hash

      # Blank the raw column first so JsonUser can be instantiated without
      # trying (and failing) to parse the old YAML payload as JSON.
      JsonUser.where(id: yaml_user.id).update_all(other_data: nil)
      json_user = JsonUser.find(yaml_user.id)
      json_user.other_data = hash || {}
      json_user.save!
    end
  end

  def down
    raise ActiveRecord::IrreversibleMigration
  end
end

The migration above does the following:

  • It defines a YamlUser class that handles the YAML side of the migration. Assuming you already added the JSON argument to your real model, YamlUser performs the simple task of deserializing the stored string back into whatever data you had serialized.
  • It defines a JsonUser class that handles the JSON side of the migration. This class does one single thing: it converts and saves the serialized field, without validations and without the real model's ActiveRecord callbacks.
  • It cleans up all the empty serialized objects. Depending on your data, you might also add an update for empty serialized arrays ("--- []\n"); see the one-liner after this list.
  • Sometimes the information you saved comes back as a HashWithIndifferentAccess, which would otherwise need manual deserialization for this operation. That is why I call .to_hash on it.
  • Before instantiating a JsonUser object, we need to blank the record with update_all, in order to avoid errors during object hydration (the old YAML payload cannot be parsed as JSON).
  • Of course, I consider this to be an ActiveRecord::IrreversibleMigration.
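
For the empty-array case mentioned in the list above, the extra cleanup is a one-liner placed next to the existing update_all in the up method (the exact literal depends on how your YAML engine dumps an empty array):

YamlUser.where(other_data: "--- []\n").update_all(other_data: nil)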

I consider this to be a good first step towards migrating to multiple backend applications.


Exposing serialised fields – the metaprogramming way

Posted on Mar 30, 2015 in Programming, Ruby On Rails

class SomeClass < ActiveRecord::Base
  belongs_to :user

  # Extends ActiveRecord's serialize with a third argument: a list of keys
  # inside the serialized hash that should be exposed as plain accessors.
  def self.serialize(attr_name, class_name = Object, exposed_fields = [])
    super(attr_name, class_name)
    serialized_attr_accessor attr_name, exposed_fields
  end

  # Defines a getter and a setter for each exposed field, reading from and
  # writing into the serialized hash stored in attr_name.
  def self.serialized_attr_accessor(attr_name, *args)
    args.first.each do |method_name|
      eval "
        def #{method_name}
          (self[:#{attr_name}] || {})[:#{method_name}]
        end
        def #{method_name}=(value)
          self[:#{attr_name}] ||= {}
          self[:#{attr_name}][:#{method_name}] = value
        end
        # attr_accessible is Rails 3.x (or the protected_attributes gem);
        # drop this line on Rails 4+ with strong parameters.
        attr_accessible :#{method_name}
      "
    end
  end

  serialize :other_data,
    Hash,
    %w(some other values you want to store in serialized field)
end
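
In practice, every name from the exposed fields list becomes a plain accessor backed by the serialized hash. A quick sketch of how it reads and writes, using the field names declared above:

record = SomeClass.new
record.some   = "a value"  # stored under :some in the serialized hash
record.values = 42         # stored under :values

record.other_data # => {:some=>"a value", :values=>42}
record.some       # => "a value"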

Cleaning up a big MongoDB collection

Posted on Dec 28, 2014 in MongoDB

Recently I came across a small problem I needed to fix: I had many records in a database that I did not need. I could not drop the entire collection, as some of the records had to be left alone. I came up with this script, which lets me delete records selectively.

// Only touch documents created in the given date range.
query = {created_at: {"$gte": new ISODate("2012-11-01T00:00:00Z"), "$lt": new ISODate("2012-12-01T00:00:00Z")}};

items = db.<COLLECTION>.find(query).count();
count = 0;                          // documents kept so far, skipped on the next pass
batches = Math.ceil(items / 1000);  // round up so the last partial batch is processed too

for (var i = 0; i < batches; i++) {
  print("Remaining: " + (batches - i));
  db.<COLLECTION>.find(query).skip(count).limit(1000).forEach(function(p) {
    if (p.has_transaction && p.has_transaction == 1) {
      // Keep documents that have a transaction; skip past them next time.
      count++;
    } else {
      // Copy the document to a backup collection, then remove it.
      db.<MY BACKUP COLLECTION>.insert(p);
      db.<COLLECTION>.remove(p, 1);
    }
  });
}
// How many matching documents are left after the cleanup.
print(db.<COLLECTION>.find(query).count());

Romanian Phone Number validator

Posted on Jul 29, 2014 in Javascript - Client Side

Recently I had to implement a Romanian phone number validator, and I managed to implement it as a custom method for the jQuery Validation plugin.

Here is the whole method:

$.validator.addMethod("phoneRO", function(phone_number, element) {
  phone_number = phone_number.replace(/\(|\)|\s+|-/g, "");
  return this.optional(element) || (phone_number.length > 9 &&
    phone_number.match(/^(?:(?:(?:00\s?|\+)40\s?|0)(?:7\d{2}\s?\d{3}\s?\d{3}|(21|31)\d{1}\s?\d{3}\s?\d{3}|((2|3)[3-7]\d{1})\s?\d{3}\s?\d{3}|(8|9)0\d{1}\s?\d{3}\s?\d{3}))$/));
}, "Please specify a valid Romanian phone number");

The regex of interest is:

/^(?:(?:(?:00\s?|\+)40\s?|0)(?:7\d{2}\s?\d{3}\s?\d{3}|(21|31)\d{1}\s?\d{3}\s?\d{3}|((2|3)[3-7]\d{1})\s?\d{3}\s?\d{3}|(8|9)0\d{1}\s?\d{3}\s?\d{3}))$/

Some of the formats this regex is able to recognise are:

00 40 722 000 000
00 40 218 032 329
00 40 243 253 398
00 40 343 254 398
00 40 800 801 227
00 40 318 032 329
0722 000 000
0800 801 227
0800 801227
0318 032 329

Have a try: http://rubular.com/r/2ufyprKWGz
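
Since Rubular is a Ruby regex tester, you can exercise the same expression from Ruby too. A small sketch that mimics the cleanup done by the jQuery method above (the last sample number is made up to show a failure):

RO_PHONE = /^(?:(?:(?:00\s?|\+)40\s?|0)(?:7\d{2}\s?\d{3}\s?\d{3}|(21|31)\d{1}\s?\d{3}\s?\d{3}|((2|3)[3-7]\d{1})\s?\d{3}\s?\d{3}|(8|9)0\d{1}\s?\d{3}\s?\d{3}))$/

["00 40 722 000 000", "0722 000 000", "0800 801227", "12345"].each do |number|
  normalized = number.gsub(/\(|\)|\s+|-/, "")  # same cleanup as the jQuery method
  puts "#{number} => #{(normalized =~ RO_PHONE) ? 'valid' : 'invalid'}"
end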

Sphinx MySQL command line

Posted on May 14, 2014 in Mysql, Programming, Sphinx Search

I had been searching for a way to work with Sphinx from the console; I needed to access the interface in order to see what is indexed and how it is stored, and also to test my searches.

After some Google searches I ended up with this:

mysql -h 127.0.0.1 --prompt 'SphinxQL>' --port 9312
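
The same SphinxQL interface can also be reached from Ruby through the mysql2 gem, since searchd speaks the MySQL protocol. A sketch, assuming searchd exposes a MySQL-protocol (mysql41) listener on the port used above and that my_index is one of your configured indexes:

require "mysql2"

# Connect to searchd's MySQL-protocol listener.
client = Mysql2::Client.new(host: "127.0.0.1", port: 9312)

# "my_index" and the search term are placeholders.
client.query("SELECT * FROM my_index WHERE MATCH('test') LIMIT 5").each do |row|
  puts row.inspect
end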

Horizontal scaling using Db Charmer

Posted on Jan 29, 2014 in Optimisations, Programming, Ruby On Rails, Server

I was looking for a way to scale a Ruby on Rails application horizontally, and I tried several methods. One option would be a MySQL cluster, but that requires some serious database administrator skills, which unfortunately I don't have.

Mainly, I have an application that is read intensive (80% reads vs 20% writes), so I considered a MySQL master–slave configuration. The problem is that there is nothing about it in the Rails documentation; however, after a short look at ruby-toolbox.com I discovered that I am not the only one who has run into this problem.

I tried Octopus as my first choice, but I soon discovered it was not a good fit for my application. For some reason, not all of my read queries were routed to the slave connection. I tried to figure out why, but since I was pressed for time I dismissed this gem, even though I love how simple it keeps the models.

After dismissing Octopus, I tried the DbCharmer gem, which is pretty actively maintained. This is yet another ActiveRecord sharding gem that lets you split database reads and writes.

The approach I chose for my first try was to single out the controller actions that are 100% reads and push them to a slave. That was pretty simple using a before filter in my Rails controllers:

class ProfilesController < Application
  force_slave_reads :only =>  [ :show, :index ]
end

Forcing slave reads on these actions allowed me to scale the application while keeping the same number of servers, but the main effect was a drop in the application's response time.
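
For force_slave_reads (and the on_slave call below) to have any effect, the models touched by those actions need a slave connection configured through DbCharmer's db_magic. A minimal sketch, assuming a slave01 connection entry exists in database.yml and that the profile pages mainly read the Profile model:

class Profile < ActiveRecord::Base
  # :slave01 must match a connection entry in database.yml
  db_magic :slave => :slave01
end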

The second step was to take the heavy queries, such as counts, off the MySQL master server and move them to the slave:

class User < ActiveRecord::Base
  # Run the heavy aggregation on the slave instead of the master.
  def self.some_heavy_query
    on_slave.joins(:profile, :messages).count(:group => ['messages.thread_id'])
  end
end

In my enthusiasm about having a MySQL slave, I thought it would be nice to have three slave connections "ready" in my config. I later realised that this "optimisation" caused problems, because those 3 connections, multiplied by the max_child setting in my Apache configuration and again by the number of application servers, exceeded max_connections on my MySQL slave server (for example, 3 connections × 50 Apache children × 4 servers would mean 600 connections).

After a small fix in my database.yml files, I was back online with a better-performing application.
