Suppose I have the following table:
create_table :my_table, :id => false do |t|
t.string :key_column
t.string :value_column
end
How would I ensure that the rows are optimally stored for binary search on :key_column?
And how would I make sure binary search is used?
For any interesting number of rows, the optimal way (for most definitions of “optimal”) to access a single random record by key is to create an index. In SQL:

CREATE INDEX index_my_table_on_key_column ON my_table (key_column)

or in an ActiveRecord migration:

add_index :my_table, :key_column
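Wrapped in a complete migration it might look like the following sketch (the class name is invented for the example; up/down in the classic Rails 2-era style):

```ruby
# Hypothetical migration adding an index on key_column.
# Run with `rake db:migrate`; `rake db:rollback` invokes self.down.
class AddKeyColumnIndexToMyTable < ActiveRecord::Migration
  def self.up
    add_index :my_table, :key_column
  end

  def self.down
    remove_index :my_table, :key_column
  end
end
```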
Database indices are typically implemented as B-trees or similar structures, which give you binary-search-style lookups while offering a good balance between storage cost and time for retrieval and update.
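Conceptually, an index lookup behaves like a binary search over the sorted keys. A plain-Ruby illustration of that idea (data and method name invented for the example), using Array#bsearch:

```ruby
# A sorted key/value list, analogous to the ordered entries in an index.
# It MUST stay sorted by key for binary search to be valid.
ROWS = [
  ["alpha",   "first value"],
  ["bravo",   "second value"],
  ["charlie", "third value"],
  ["delta",   "fourth value"],
].freeze

# Binary search over the sorted rows: O(log n) comparisons, which is
# roughly what the database index buys you over a full table scan.
def find_by_key(key)
  row = ROWS.bsearch { |k, _| k >= key }
  row && row[0] == key ? row[1] : nil
end

find_by_key("charlie")  # => "third value"
find_by_key("zebra")    # => nil
```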
Ensuring the index is used should be relatively straightforward for single-table operations:

MyTable.find_by_key_column('some key')

for example, should generate something like this (check development.log):

SELECT * FROM my_table WHERE (my_table.key_column = 'some key') LIMIT 1

which even MySQL’s relatively unimpressive optimiser should have no problem running optimally.
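If you want to verify rather than trust the optimiser, you can ask MySQL for its query plan. A sketch only, assuming an established ActiveRecord connection to your MySQL database (e.g. from script/console):

```ruby
# Illustrative: EXPLAIN reports which index (if any) MySQL chose.
plan = ActiveRecord::Base.connection.select_all(
  "EXPLAIN SELECT * FROM my_table WHERE key_column = 'some key'"
)
# Once the index exists, the 'key' column of the plan should name the
# index on key_column, and 'type' should be 'ref' rather than 'ALL'
# (which would indicate a full table scan).
```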
Row storage should not be a concern for individual row retrieval, which is fortunate as there isn’t much you can do to control it anyway. For MySQL performance you should probably choose MyISAM over InnoDB as the storage engine, provided your definition of “optimal” doesn’t include “most reliable”.