Let's say I have two separate databases, X and Y, on the same physical server.
All of my queries currently run out of X.
There is one table in Y that I would like to be available to X for JOINs.
So for now I keep a copy of that table in both X and Y, but the data in Y is constantly changing, so the copy soon becomes out of date. That's not critical for this application, but for another one I'm designing now, it just won't do.
X right now only needs read access to the table in Y.
Is there an efficient way to do joins across the two databases?
I'd prefer a MySQL solution as opposed to opening multiple database connections in PHP.
Thank you.
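(For reference, on the same MySQL server a cross-database join only needs database-qualified table names; the table and column names below are hypothetical, and the connected user needs at least SELECT privilege on the table in Y:)

```sql
-- Hypothetical tables X.orders and Y.customers on the same MySQL server.
-- Qualifying a table as database_name.table_name lets one connection
-- join across both databases.
SELECT o.id, o.total, c.name
FROM X.orders AS o
JOIN Y.customers AS c ON c.id = o.customer_id;
```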
EDIT: So what becomes of these lines?
$conn = mysql_connect('localhost', 'username', 'password');
@mysql_select_db('database_name', $conn);
And yes.. this is an old app, hence the mysql instead of mysqli. Suppose I could convert it.
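(For what it's worth, here is a sketch of how those lines could look after converting to mysqli; the database and table names are hypothetical, carried over from the question's X/Y setup:)

```php
<?php
// Sketch only: connect once, selecting X as the default database.
// Tables in Y stay reachable by qualifying them as Y.table_name.
$conn = mysqli_connect('localhost', 'username', 'password', 'X');

// Unqualified names resolve against the selected database (X);
// qualified names reach the other database (Y) on the same connection.
$result = mysqli_query(
    $conn,
    'SELECT o.id, c.name
       FROM orders AS o
       JOIN Y.customers AS c ON c.id = o.customer_id'
);
```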
EDIT2: So where I don't specify database_name.table_name and just use table_name, the name resolves against the currently selected database, and where I do specify database_name.table_name I get exactly what I asked for, right? Yep. Works.
EDIT3: Is there any significant performance hit by doing cross database joins versus joins within the same database? Nope. Looks just as fast.
Thank you all for your responses.
mysql_select_db() is optional; it just selects the default database for subsequent queries, and you can always qualify table names explicitly instead.
https://www.php.net/manual/en/function.mysql-select-db.php
IMHO, using the old mysql extension has nothing to do with being slow; from my own research, there is not much speed difference between mysqli and mysql, and real-world tests don't show it being a big performance driver. I've honestly stuck with mysql for all my apps. (Do note, though, that the mysql extension is deprecated and was removed in PHP 7, so mysqli or PDO is required on modern PHP.)