I’m working on a website backed by a database with a table of organizations, one row per organization. Each organization can have an unlimited number of attached keywords, stored in a separate table where each row holds a primary key, the keyword, and the primary key of the organization it is attached to. Eventually this table could have many thousands of entries. Will that make pulling records from this table, or listing the unique keywords in it, too time consuming?
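For concreteness, here is a minimal sketch of the two-table layout described above, using SQLite via Python; the table and column names (`organizations`, `keywords`, `org_id`) are my own assumptions, not anything fixed by the question:

```python
import sqlite3

# Hypothetical schema matching the description: one row per organization,
# and a keywords table where each row points at one organization.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE organizations (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE keywords (
        id      INTEGER PRIMARY KEY,
        keyword TEXT NOT NULL,
        org_id  INTEGER NOT NULL REFERENCES organizations(id)
    );
    -- Index the two lookup columns so both common queries stay fast.
    CREATE INDEX idx_keywords_org ON keywords(org_id);
    CREATE INDEX idx_keywords_kw  ON keywords(keyword);
""")

conn.execute("INSERT INTO organizations (id, name) VALUES (1, 'Acme')")
conn.executemany(
    "INSERT INTO keywords (keyword, org_id) VALUES (?, ?)",
    [("nonprofit", 1), ("education", 1), ("nonprofit", 1)],
)

# Keywords attached to one organization (can use idx_keywords_org).
per_org = [r[0] for r in conn.execute(
    "SELECT keyword FROM keywords WHERE org_id = ?", (1,))]

# Unique keywords across the whole table (can use idx_keywords_kw).
unique = [r[0] for r in conn.execute(
    "SELECT DISTINCT keyword FROM keywords ORDER BY keyword")]

print(per_org)  # ['nonprofit', 'education', 'nonprofit']
print(unique)   # ['education', 'nonprofit']
```

Both queries hit indexed columns, which is what keeps them cheap as the keywords table grows.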
Having a couple of hundred thousand rows is perfectly fine, as long as your tables are properly indexed and your queries actually use those indexes.
I’m working on an application that runs lots of queries on several tables with a couple of hundred thousand records each, with joins and non-trivial WHERE clauses, and that application works fine; well, it has since we optimized the queries and indexes ^^
A couple of million rows under those conditions is OK too, I’d say; it depends on what kind of queries you run, and how many of them ^^
In every case, there’s only one way to know for sure: test with a realistic volume of data.