2018/05/08
Finding which WordPress requests are hardest on the database
Everyone says WordPress can handle north of 100,000, even a million posts, no problem!
BS.
Okay, if you have a dedicated database server with enough RAM to hold the entire dataset, then A) that’s cheating, since RDBMSs were created precisely for the case where one doesn’t have the requisite memory; and B) I don’t have that luxury right now. Even on a well-tuned LAMP system, I routinely run into performance issues.
mysql> select count(1) from wp_posts;
+----------+
| count(1) |
+----------+
|  1153099 |
+----------+
1 row in set (0.72 sec)

mysql> select count(1) from wp_terms;
+----------+
| count(1) |
+----------+
|    65797 |
+----------+
1 row in set (0.04 sec)
Witness the awesome power of this addition to your theme’s functions.php file, inspired by this SO post — on every request it logs the request URI along with any query that took a second or longer:
add_action('shutdown', 'my_sql_logger');
function my_sql_logger() {
    // $wpdb only records queries when SAVEQUERIES is defined and truthy
    if (defined('SAVEQUERIES') && SAVEQUERIES) {
        global $wpdb;
        // ABSPATH already ends with a slash
        $log_file = fopen(ABSPATH . 'sql_log.txt', 'a');
        fwrite($log_file, "\n//////////\n" . date("F j, Y, g:i:s a\n"));
        fwrite($log_file, $_SERVER['REQUEST_URI'] . "\n");
        // Each entry is [SQL, seconds elapsed, calling function];
        // only log queries that took a full second or more.
        foreach ($wpdb->queries as $q) {
            if ((int) $q[1] >= 1) {
                fwrite($log_file, $q[0] . " - ({$q[1]} s)\n");
            }
        }
        fclose($log_file);
    }
}
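None of this fires until SAVEQUERIES is on — WordPress only populates $wpdb->queries when it is. Enabling it is a one-line change in wp-config.php (above the “That’s all, stop editing!” comment):

<?php
// wp-config.php — make $wpdb record every query, its duration,
// and the calling function, for the duration of each request.
define( 'SAVEQUERIES', true );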
Don’t forget to turn the SAVEQUERIES define back off in wp-config.php when you’re done profiling, or else your disk may fill up rather quickly.
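Once the log has collected some traffic, a quick way to rank the offenders from the shell — assuming the “QUERY - (N.NNN s)” line format written above, with sql_log.txt in the WordPress root:

```shell
# Pull out the "QUERY - (N.NNN s)" lines, move the duration to the
# front, and list the 20 slowest queries in descending order.
grep -E ' - \([0-9.]+ s\)$' sql_log.txt \
  | sed -E 's/^(.*) - \(([0-9.]+) s\)$/\2\t\1/' \
  | sort -rn \
  | head -n 20
```

Identical queries that recur across requests will show up repeatedly, which is itself a useful signal of what to cache.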
Finding which WordPress requests are hardest on the database is original content from devolve.