Hi there. This one is extremely hard to reduce to a short reproducible example, so I'll do my best to describe the situation. Hopefully this makes sense.
I'm trying to insert a data frame with 35 rows into MySQL using RMySQL.
Each row contains the serialized content of a Shiny input element (e.g. rawToChar(serialize(input[[inputs[10, 1]]], NULL, ascii = TRUE))) plus some metadata (field type, field name, etc.).
The last two rows contain the serialized content of a large data frame object (rawToChar(serialize(df_object, NULL, ascii = TRUE))), an approach that has worked well for most data frame objects we've had to store.
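For reference, here's a minimal sketch of how one of these rows is assembled. The column names (field_name, field_type, content) and the "stored_data" label are assumptions for illustration; the serialization calls are exactly as above.

```r
# Sketch only: column names are placeholders, serialization is as in the post.
input_row <- data.frame(
  field_name = inputs[10, 1],
  field_type = "shiny_input",
  content    = rawToChar(serialize(input[[inputs[10, 1]]], NULL, ascii = TRUE)),
  stringsAsFactors = FALSE
)

data_row <- data.frame(
  field_name = "stored_data",
  field_type = "data_frame",
  content    = rawToChar(serialize(df_object, NULL, ascii = TRUE)),
  stringsAsFactors = FALSE
)

df <- rbind(input_row, data_row)  # the real df has 35 such rows
```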
This week the data objects in those last two rows seem to have hit some threshold, and the rows containing them are no longer written to MySQL. The strange thing is that no errors or warnings are thrown, and the other rows are written fine. The whole data frame is pushed through RMySQL at once.
These rows contain serialized objects of around 300 MB, so I'm assuming the size is the issue, but I have no idea how to resolve it.
Things I've already tried:

- Tweaked the MySQL setup to raise max_allowed_packet to the maximum allowed in AWS (enormous).
- Changed the ulimit in Ubuntu to 16MB.
- Confirmed MySQL is using the Barracuda InnoDB file format.
- Played around with my R code to split out the large rows and insert them separately.

Nothing has worked yet.
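In case it helps, here's roughly how I've been comparing the payload sizes against the server limit (the content column name is an assumption, as in the sketch above):

```r
library(DBI)

# What the server actually allows per packet, in bytes
dbGetQuery(con, "SHOW VARIABLES LIKE 'max_allowed_packet'")

# Size in bytes of each serialized value about to be written
nchar(df$content, type = "bytes")
```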
The code doing the insertion is pretty straightforward:
dbWriteTable(con, value = df, name = table, append = TRUE)
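And here's roughly what the "split out the large rows and insert them separately" attempt looked like, plus how I check what actually lands (variable names as above; a sketch, not my exact code):

```r
before <- dbGetQuery(con, paste0("SELECT COUNT(*) AS n FROM ", table))$n

# Insert one row at a time instead of the whole data frame at once
for (i in seq_len(nrow(df))) {
  dbWriteTable(con, value = df[i, , drop = FALSE], name = table, append = TRUE)
}

after <- dbGetQuery(con, paste0("SELECT COUNT(*) AS n FROM ", table))$n
after - before  # expect 35, but the two large rows go missing silently
```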
Any ideas would be appreciated.