Hey everyone! I’m trying to figure out how to add several records to a WordPress database table in one go. I know we can use $wpdb->insert()
for single entries, but what about multiple? Is there a way to do something like this with WordPress functions:
function bulk_add_to_table($data) {
    global $wpdb;

    $table_name = $wpdb->prefix . 'my_custom_table';

    foreach ($data as $entry) {
        $wpdb->insert(
            $table_name,
            array(
                'first_field'  => $entry['first'],
                'second_field' => $entry['second'],
                'third_field'  => $entry['third'],
            )
        );
    }
}
I’m wondering if there’s a more efficient method than looping through each entry. Any ideas or best practices for this in WordPress? Thanks in advance for your help!
I’ve dealt with a similar issue before, and I can tell you that using $wpdb->query()
with a multi-insert SQL statement is indeed the way to go for bulk inserts. However, it’s crucial to be mindful of your server’s max_allowed_packet
setting when dealing with large datasets.
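If you want to check that limit before picking a chunk size, a quick sketch like this should work (SHOW VARIABLES returns Variable_name/Value columns, so get_row() is used rather than get_var(); the helper name is my own):

function get_max_allowed_packet() {
    global $wpdb;

    // Read the server's max_allowed_packet value (in bytes) at runtime.
    $row = $wpdb->get_row("SHOW VARIABLES LIKE 'max_allowed_packet'");

    return $row ? (int) $row->Value : 0;
}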
Here’s a slightly modified approach that I’ve found effective:
function bulk_insert($data, $chunk_size = 500) {
    global $wpdb;

    $table_name = $wpdb->prefix . 'my_custom_table';

    // Split the dataset into chunks so no single query exceeds max_allowed_packet.
    $chunks = array_chunk($data, $chunk_size);

    foreach ($chunks as $chunk) {
        $values = array();

        // Prepare each row individually so every value is safely escaped.
        foreach ($chunk as $entry) {
            $values[] = $wpdb->prepare('(%s, %s, %s)', $entry['first'], $entry['second'], $entry['third']);
        }

        $sql = "INSERT INTO {$table_name} (first_field, second_field, third_field) VALUES " . implode(',', $values);
        $wpdb->query($sql);
    }
}
This approach chunks the data to avoid potential issues with very large datasets. It’s been reliable in my experience, even when dealing with thousands of records.
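For reference, calling it would look something like this (the keys and sample rows are just placeholders matching the question's fields):

// Hypothetical usage: each row is an associative array with the
// keys bulk_insert() expects ('first', 'second', 'third').
$rows = array(
    array('first' => 'a1', 'second' => 'b1', 'third' => 'c1'),
    array('first' => 'a2', 'second' => 'b2', 'third' => 'c2'),
);
bulk_insert($rows);       // default 500-row chunks
bulk_insert($rows, 100);  // smaller chunks for wide rows or a low max_allowed_packet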
While the previous answers offer good solutions, I'd like to add another perspective. If you're handling frequent small batches, consider buffering rows with WordPress's Transients API and flushing them in one bulk insert. Here's how you could modify the approach:
function optimized_bulk_insert($data) {
    global $wpdb;

    $table_name = $wpdb->prefix . 'my_custom_table';

    // Pull any rows buffered by earlier calls.
    $cached_data = get_transient('bulk_insert_cache');
    if ($cached_data === false) {
        $cached_data = array();
    }

    $cached_data = array_merge($cached_data, $data);
    set_transient('bulk_insert_cache', $cached_data, 300); // Buffer for 5 minutes.

    // Flush to the database once enough rows have accumulated.
    if (count($cached_data) >= 1000) {
        $values = array();
        foreach ($cached_data as $entry) {
            $values[] = $wpdb->prepare('(%s, %s, %s)', $entry['first'], $entry['second'], $entry['third']);
        }

        $sql = "INSERT INTO {$table_name} (first_field, second_field, third_field) VALUES " . implode(',', $values);
        $wpdb->query($sql);

        delete_transient('bulk_insert_cache');
    }
}
This approach buffers rows and performs a single bulk insert once a threshold is reached, reducing database load for frequent small inserts. One caveat: if the transient expires (or gets evicted by an object cache) before the 1000-row threshold is hit, the buffered rows are silently lost, so you'll want a periodic flush as a safety net.
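A minimal safety-net sketch using WP-Cron, assuming the buffer logic above (the hook name 'flush_bulk_insert_cache' is my own invention):

// Hypothetical safety net: flush whatever is buffered on a schedule,
// so rows aren't lost when the 1000-row threshold is never reached.
add_action('flush_bulk_insert_cache', function () {
    global $wpdb;

    $cached_data = get_transient('bulk_insert_cache');
    if (empty($cached_data)) {
        return;
    }

    $table_name = $wpdb->prefix . 'my_custom_table';
    $values     = array();
    foreach ($cached_data as $entry) {
        $values[] = $wpdb->prepare('(%s, %s, %s)', $entry['first'], $entry['second'], $entry['third']);
    }

    $wpdb->query("INSERT INTO {$table_name} (first_field, second_field, third_field) VALUES " . implode(',', $values));
    delete_transient('bulk_insert_cache');
});

// Schedule the flush hook once if it isn't scheduled already.
if (!wp_next_scheduled('flush_bulk_insert_cache')) {
    wp_schedule_event(time(), 'hourly', 'flush_bulk_insert_cache');
}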
hey alexr1990, you can use $wpdb->query() with a multi-insert SQL statement. it's faster than looping. something like:

global $wpdb;
$table_name = $wpdb->prefix . 'my_custom_table';

$values = array();
foreach ($data as $entry) {
    $values[] = $wpdb->prepare('(%s, %s, %s)', $entry['first'], $entry['second'], $entry['third']);
}

$sql = "INSERT INTO {$table_name} (first_field, second_field, third_field) VALUES " . implode(',', $values);
$wpdb->query($sql);

hope this helps!