
program stuck in __memcpy_ssse3_back , cpu 100% #403

Open

nkr-404-NotFound opened this issue Jun 27, 2023 · 16 comments
@nkr-404-NotFound

Hi, I'm using this project to replicate MySQL binlog events into Redis.

But in some cases my CPU goes up to 100% and the sync process seems to hang.

I did some digging and found the stack always stops at:

Program received signal SIGINT, Interrupt.
0x00007fe57a06e698 in __memcpy_ssse3_back () from /lib64/libc.so.6
(gdb) py-bt
Traceback (most recent call first):
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 148, in read
self.__data_buffer = self.__data_buffer[size:]
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 266, in read_variable_length_string
byte = byte2int(self.read(1))
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 361, in read_binary_json_type
return self.read_variable_length_string()
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 463, in _read
return self.read_binary_json_type(x[0], length)
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 465, in
return [_read(x) for x in values_type_offset_inline]
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 465, in read_binary_json_array
return [_read(x) for x in values_type_offset_inline]
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 359, in read_binary_json_type
return self.read_binary_json_array(length - 1, large)
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py", line 352, in read_binary_json
return self.read_binary_json_type(t, length)
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/row_event.py", line 181, in _read_column_data
values[name] = self.packet.read_binary_json(column.length_size)
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/row_event.py", line 447, in _fetch_one_row
row["values"] = self._read_column_data(self.columns_present_bitmap)
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/row_event.py", line 422, in _fetch_rows
self.__rows.append(self._fetch_one_row())
File "/usr/local/lib/python3.7/site-packages/pymysqlreplication/row_event.py", line 427, in rows
self._fetch_rows()
File "/data/release/nebula_cmdb/server/exec/synccache.py", line 395, in del_cache
for row in binlogevent.rows:

Does anyone have the same issue?
Any help is appreciated ~

@nkr-404-NotFound

version : 0.21

@nkr-404-NotFound

In the database there are lots of JSON fields, and the field data is quite long, maybe 500–1000 bytes?

@nkr-404-NotFound

(gdb)
149 def read(self, size):
150 size = int(size)
151 self.read_bytes += size
152 if len(self.__data_buffer) > 0:
153 data = self.__data_buffer[:size]
154 self.__data_buffer = self.__data_buffer[size:]
155 if len(data) == size:
156 return data
157 else:
158 return data + self.packet.read(size - len(data))
159 return self.packet.read(size)
(gdb) c
Continuing.
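The gdb listing above points at the hot spot: `self.__data_buffer = self.__data_buffer[size:]` copies the entire remaining buffer on every call, so draining a large event one byte at a time is quadratic in the buffer size. A minimal standalone demonstration of the cost (this is not the library's code; `drain_by_slicing` and `drain_by_offset` are made-up names for illustration):

```python
import time

def drain_by_slicing(buf: bytes) -> int:
    """Read buf one byte at a time, re-slicing the remainder each step
    (the same pattern as packet.py's `self.__data_buffer = self.__data_buffer[size:]`)."""
    reads = 0
    while buf:
        _byte, buf = buf[:1], buf[1:]  # copies len(buf) - 1 bytes every iteration
        reads += 1
    return reads

def drain_by_offset(buf: bytes) -> int:
    """Same sequence of reads, but advance a cursor instead of copying."""
    pos = 0
    reads = 0
    while pos < len(buf):
        _byte = buf[pos:pos + 1]
        pos += 1
        reads += 1
    return reads

data = bytes(100_000)  # ~100 KB; the reporter's event was ~1700 KB

t0 = time.perf_counter()
drain_by_slicing(data)
slicing_s = time.perf_counter() - t0

t0 = time.perf_counter()
drain_by_offset(data)
offset_s = time.perf_counter() - t0

print(f"slicing: {slicing_s:.3f}s  offset: {offset_s:.3f}s")
```

On a large buffer the slicing variant copies on the order of len(buf)²/2 bytes in total, which is where the time spent in `__memcpy_ssse3_back` comes from.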


@nkr-404-NotFound

In this case, the DeleteRowsEvent buffer is very large, nearly 1700 KB.

@nkr-404-NotFound

I've tried version 0.42; same issue.

@nkr-404-NotFound

nkr-404-NotFound commented Jun 27, 2023

After hanging for a long time, the current process gets aborted and the binlog stream is reset to 30 minutes earlier, then everything is consumed all over again (until it hits the big JSON event and gets stuck again).

@nkr-404-NotFound

as here

@nkr-404-NotFound

2023-06-27 14:29:18,854 [INFO] server_num_total : 49650
2023-06-27 14:29:18,854 [INFO] current time : 2023-06-27T14:29:18.854472binlog time : 2023-06-27T14:17:48
2023-06-27 14:34:10,927 [INFO] server_num_total : 49660
2023-06-27 14:34:10,927 [INFO] current time : 2023-06-27T14:34:10.927750binlog time : 2023-06-27T14:17:54
2023-06-27 14:34:11,477 [INFO] server_num_total : 49670
2023-06-27 14:34:11,477 [INFO] current time : 2023-06-27T14:34:11.477635binlog time : 2023-06-27T13:45:35
2023-06-27 14:34:11,550 [INFO] server_num_total : 49680
2023-06-27 14:34:11,551 [INFO] current time : 2023-06-27T14:34:11.551199binlog time : 2023-06-27T13:45:35

A super big JSON event arrived at 14:17; after hanging for 5 minutes, the next binlog time jumped back to 13:45...

Very strange.

@nkr-404-NotFound

Currently, the only way I can recover from this situation is restarting the whole thing.

Afterwards the binlog offset moves up next to the current timestamp, and I hope not to encounter another big JSON event again ...

@nkr-404-NotFound

If the data buffer is very long,

byte = byte2int(self.read(1))

advances the buffer by only 1 byte per call, which causes a lot of performance loss here.
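One possible mitigation, sketched here under the assumption that the buffer is only ever consumed front to back (this is not the library's actual fix; `BufferedReader` and `fallback_read` are hypothetical names), is to track a cursor into the buffer instead of re-slicing it, so each read copies only the bytes it returns:

```python
class BufferedReader:
    """Sketch of an offset-based reader that mirrors the shape of the
    packet.read() shown above, but never copies the remaining buffer.
    `fallback_read` stands in for the underlying network-packet read."""

    def __init__(self, data: bytes, fallback_read=lambda n: b""):
        self._buf = data
        self._pos = 0          # cursor replaces the `buf = buf[size:]` re-slice
        self._fallback_read = fallback_read

    def read(self, size: int) -> bytes:
        size = int(size)
        end = self._pos + size
        data = self._buf[self._pos:end]   # copies only `size` bytes
        self._pos = min(end, len(self._buf))
        if len(data) == size:
            return data
        # buffer exhausted: fetch the remainder from the underlying packet
        return data + self._fallback_read(size - len(data))

reader = BufferedReader(b"\x01\x02\x03")
assert reader.read(1) == b"\x01"
assert reader.read(2) == b"\x02\x03"
```

With a cursor, N one-byte reads over an M-byte buffer cost O(N) instead of O(N·M).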

@nkr-404-NotFound

Found a workaround: I shrank the upstream SQL batch size from 5000 values down to 50.

This reduces the size of each binlog event (at the cost of more binlog records).

Seems to work fine (for a few hours now).
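The workaround above can be sketched as follows, with the stdlib sqlite3 module standing in for MySQL (table name `t` and the row shape are purely illustrative): chunk the upstream write into batches of 50 values so that no single statement, and hence no single row event on a real MySQL server, grows huge.

```python
import sqlite3

BATCH = 50  # down from 5000, per the workaround

def chunked(seq, n):
    """Yield successive n-sized slices of seq."""
    for i in range(0, len(seq), n):
        yield seq[i:i + n]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, payload TEXT)")

rows = [(i, "x" * 100) for i in range(5000)]
for batch in chunked(rows, BATCH):
    with conn:  # one transaction (-> one small event group) per batch
        conn.executemany("INSERT INTO t VALUES (?, ?)", batch)

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 5000 rows written in 100 small batches
```

The same data gets written either way; the difference is that the replication stream sees 100 small events instead of one or two enormous ones, which keeps the consumer's per-event buffer small.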

@nkr-404-NotFound

Maybe, in cases where data_buffer is very long, it should be emptied after the current operation.

Otherwise, the next read operation will take an extremely long time, with no way to recover.

@nkr-404-NotFound

Maybe there is some logic error in the read function and the loop always spins here, around

/usr/local/lib/python3.7/site-packages/pymysqlreplication/packet.py


@sean-k1

sean-k1 commented Sep 22, 2023

@nankingrider hello?
Can you run the code in this PR?
