Error Code: 1153

MySQL Error 1153: Packet Size Exceeded

📦 MySQL

📋 Description

This error occurs when the MySQL server receives a network packet from a client, or tries to send one, that is larger than the max_allowed_packet system variable permits. It typically means too much data was transferred in a single client/server exchange, and the server rejects the packet. Note that the server and the client programs each have their own max_allowed_packet limit.
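
For example, an application can read the server's current limit and check a payload against it before sending. A minimal sketch, assuming mysql-connector-python; the files table, file name, and credentials are placeholders:

# Minimal sketch (assumes mysql-connector-python; table, file name, and credentials are placeholders).
import mysql.connector

conn = mysql.connector.connect(host='localhost', user='app', password='secret', database='appdb')
cursor = conn.cursor()

cursor.execute("SELECT @@global.max_allowed_packet")
(limit,) = cursor.fetchone()

payload = open('report.pdf', 'rb').read()
# Leave headroom: the packet also carries the statement text and protocol overhead.
if len(payload) >= limit:
    raise ValueError(f"payload is {len(payload)} bytes, server max_allowed_packet is {limit}")

cursor.execute("INSERT INTO files (name, body) VALUES (%s, %s)", ("report.pdf", payload))
conn.commit()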

💬 Error Message

Got a packet bigger than 'max_allowed_packet' bytes
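
In client code the failure surfaces as error number 1153 (ER_NET_PACKET_TOO_LARGE); some clients instead see error 2006 "MySQL server has gone away" when the server drops the connection after the oversized packet. A hedged handling sketch with mysql-connector-python, reusing the conn/cursor/payload names from the sketch above:

# Assumes an open `conn`/`cursor` and a `payload`, as in the sketch above.
import mysql.connector

try:
    cursor.execute("INSERT INTO files (name, body) VALUES (%s, %s)", ("big.bin", payload))
    conn.commit()
except mysql.connector.Error as err:
    if err.errno == 1153:  # ER_NET_PACKET_TOO_LARGE
        print("Packet exceeded max_allowed_packet; split the data or raise the limit")
    elif err.errno == 2006:  # "MySQL server has gone away" -- often the same underlying problem
        print("Connection dropped; check max_allowed_packet on the server")
    else:
        raise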

🔍 Known Causes

4 known causes
⚠️ Large SQL Queries
Executing SQL statements, especially INSERT or UPDATE queries with many values or long string/binary data, can create a packet exceeding the configured limit.
⚠️ Extensive BLOB/TEXT Data Transfer
Sending or retrieving substantial amounts of binary large objects (BLOBs) or long text data often results in packets larger than the default max_allowed_packet setting.
⚠️ Bulk Data Operations
Operations like LOAD DATA INFILE or large mysqldump imports/exports can generate packets that exceed the max_allowed_packet size, causing the server to reject them.
⚠️ Replication Transaction Size
During MySQL replication, a large transaction containing extensive data changes might exceed the max_allowed_packet setting on the replica server.
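
For the replication case, a quick check is to read the limit on both servers and compare; recent MySQL versions also have a separate replica_max_allowed_packet (formerly slave_max_allowed_packet) for the replication threads. A sketch with mysql-connector-python; hostnames and credentials are placeholders:

import mysql.connector

def packet_limit(host):
    # Read the server-side max_allowed_packet (in bytes) from one server.
    conn = mysql.connector.connect(host=host, user='admin', password='secret')
    cur = conn.cursor()
    cur.execute("SELECT @@global.max_allowed_packet")
    (value,) = cur.fetchone()
    conn.close()
    return value

source_limit = packet_limit('source-db.example.com')
replica_limit = packet_limit('replica-db.example.com')
print(f"source={source_limit}  replica={replica_limit}")
if replica_limit < source_limit:
    print("Replica limit is lower than the source; large transactions may break replication")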

🛠️ Solutions

4 solutions available

1. Increase max_allowed_packet (easy)

Allow larger data transfers

1. Check current setting
SHOW VARIABLES LIKE 'max_allowed_packet';
2. Raise the global value at runtime (requires SUPER or SYSTEM_VARIABLES_ADMIN; applies only to new connections)
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;  -- 64MB
3. Make permanent in my.cnf
[mysqld]
max_allowed_packet = 64M
4. Restart MySQL after config change
sudo systemctl restart mysql
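
After restarting (or after SET GLOBAL, which only affects connections opened afterwards), it is worth confirming the value from a fresh connection. A small verification sketch, assuming mysql-connector-python with placeholder credentials:

# Verification sketch: existing sessions keep the old value, so open a NEW connection.
import mysql.connector

conn = mysql.connector.connect(host='localhost', user='app', password='secret')
cur = conn.cursor()
cur.execute("SHOW VARIABLES LIKE 'max_allowed_packet'")
name, value = cur.fetchone()
print(name, int(value) // (1024 * 1024), "MB")
conn.close()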

2. Split Large INSERT into Chunks (medium)

Break data into smaller batches

1. Instead of one huge INSERT, use several smaller ones
-- Instead of:
-- INSERT INTO logs VALUES (...thousands of rows...);

-- Use batches:
INSERT INTO logs VALUES (row1), (row2), ... (row100);
INSERT INTO logs VALUES (row101), (row102), ... (row200);
-- etc.
2. In application code (Python example)
# Assumes an open connection `conn` with cursor `cursor` (e.g. mysql-connector-python)
# and `data` as a list of (col1, col2) row tuples.
batch_size = 1000
for i in range(0, len(data), batch_size):
    batch = data[i:i + batch_size]
    cursor.executemany('INSERT INTO logs VALUES (%s, %s)', batch)
    conn.commit()
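
Rather than hard-coding 1000, the batch size can be derived from the server limit. A rough sketch building on the example above; the 50% safety margin and the repr()-based row-size estimate are assumptions, not exact packet accounting:

def safe_batch_size(rows, packet_limit, safety=0.5):
    # Estimate bytes per row from a small sample and keep each batch well under the limit.
    sample = rows[:100] or [("", "")]
    avg_row_bytes = max(1, sum(len(repr(r)) for r in sample) // len(sample))
    return max(1, int(packet_limit * safety) // avg_row_bytes)

# Reuses `data`, `cursor`, and `conn` from the example above; 64 MB mirrors the server setting.
batch_size = safe_batch_size(data, 64 * 1024 * 1024)
for i in range(0, len(data), batch_size):
    cursor.executemany('INSERT INTO logs VALUES (%s, %s)', data[i:i + batch_size])
    conn.commit()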

3. Use LOAD DATA for Large Files (medium)

More efficient than INSERT for bulk data

1. Export to CSV and use LOAD DATA
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
2. Enable local_infile on the server if needed (the client must also allow LOCAL, e.g. mysql --local-infile=1; see the sketch after these steps)
SET GLOBAL local_infile = 1;
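
When the file is loaded from application code, the client library must also opt in to LOCAL INFILE. A sketch with mysql-connector-python and its allow_local_infile connection flag; path, table, and credentials are placeholders:

import mysql.connector

conn = mysql.connector.connect(
    host='localhost', user='app', password='secret', database='appdb',
    allow_local_infile=True,  # client-side opt-in for LOAD DATA LOCAL
)
cur = conn.cursor()
cur.execute("""
    LOAD DATA LOCAL INFILE '/path/to/data.csv'
    INTO TABLE your_table
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\\n'
    IGNORE 1 ROWS
""")
conn.commit()
conn.close()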

4. Increase Client-Side Packet Size (medium)

Configure client to send larger packets

1. MySQL CLI
mysql --max_allowed_packet=64M -u user -p
2. mysqldump
mysqldump --max_allowed_packet=64M -u user -p database > backup.sql
3. PHP PDO
// Note: MYSQL_ATTR_MAX_BUFFER_SIZE is only available when PDO_MySQL is built
// against libmysqlclient; it is not supported with the (default) mysqlnd driver.
$pdo = new PDO($dsn, $user, $pass, [
    PDO::MYSQL_ATTR_MAX_BUFFER_SIZE => 64 * 1024 * 1024
]);
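
Other connectors expose similar knobs; for example, PyMySQL accepts a max_allowed_packet connection argument. This only caps what the client will send and does not change the server limit; a sketch under that assumption, with placeholder credentials:

import pymysql

conn = pymysql.connect(
    host='localhost', user='app', password='secret', database='appdb',
    max_allowed_packet=64 * 1024 * 1024,  # client-side cap only; the server limit must still be raised
)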

🔗 Related Errors

5 related errors