MEDIUMTEXT

Developer
May 5, 2013 at 6:53 AM
Edited May 5, 2013 at 6:55 AM
"Cannot handle multiple MEDIUMTEXT and MEDIUMBLOB data types (somewhere more than 18 MB in length in a single SQL query)."
That part is a little unclear. Is it because of running out of memory, or because of max_allowed_packet? I tried this a few minutes ago:
Imports MySql.Data.MySqlClient

' Build an ~18 MB string.
Dim a As New System.Text.StringBuilder
For i As Integer = 0 To 1000000
    a.Append("123456789012345678")
Next

Dim myConn As New MySqlConnection("host=127.0.0.1;username=root;password=#####;database=#####")
myConn.Open()

' Send the whole string as a single LONGTEXT parameter.
Dim myCommand As New MySqlCommand("INSERT INTO test(`data`) VALUES(@DATA)", myConn)
myCommand.Parameters.Add("@DATA", MySqlDbType.LongText).Value = a.ToString()
myCommand.ExecuteNonQuery()

myConn.Close()
It seems to work fine if I set max_allowed_packet high enough.
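For reference, a minimal sketch of checking and raising the server-side limit from Connector/NET (assuming an account with the SUPER privilege; the new value only applies to connections opened afterwards, and the module and method names here are illustrative):

Imports MySql.Data.MySqlClient

Module MaxPacketSketch
    Sub ShowAndRaiseMaxPacket(connStr As String)
        Using conn As New MySqlConnection(connStr)
            conn.Open()

            ' Read the current server-side limit.
            Using cmd As New MySqlCommand("SHOW VARIABLES LIKE 'max_allowed_packet'", conn)
                Using reader As MySqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        Console.WriteLine("{0} = {1}", reader.GetString(0), reader.GetString(1))
                    End While
                End Using
            End Using

            ' Raise the limit to 64 MB; only new connections see the new value.
            Using cmd As New MySqlCommand("SET GLOBAL max_allowed_packet = 64 * 1024 * 1024", conn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module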
Coordinator
May 5, 2013 at 12:17 PM
Edited May 5, 2013 at 12:22 PM
Yeah, that works.

A MEDIUMTEXT column can hold 16,777,215 bytes, which is roughly 16 MB. A query of that length succeeds for me too.

But try this:
Imports MySql.Data.MySqlClient

' Build a 16,777,215-character string (the MEDIUMTEXT limit).
Dim a As New System.Text.StringBuilder
For i As Long = 0 To 16777214
    a.Append("A")
Next

Dim myConn As New MySqlConnection("host=127.0.0.1;username=root;password=#####;database=#####")
myConn.Open()

' The same ~16 MB parameter is used three times, so the single statement
' carries roughly 48 MB and exceeds the default max_allowed_packet.
Dim myCommand As New MySqlCommand("INSERT INTO test(`data1`,`data2`,`data3`) VALUES(@DATA,@DATA,@DATA)", myConn)
myCommand.Parameters.Add("@DATA", MySqlDbType.LongText).Value = a.ToString()
myCommand.ExecuteNonQuery()

myConn.Close()
Coordinator
May 5, 2013 at 12:21 PM
EmptyNull (one of the developers of this project) has addressed this issue before.
You can have a look or join the discussion here: https://mysqlbackupnet.codeplex.com/discussions/406746
Developer
May 5, 2013 at 2:02 PM
Well, it still works fine on my computer, even with the code you gave me.
[Screenshot attached]
Developer
May 5, 2013 at 2:58 PM
There is another way to insert a large value without needing a large max_allowed_packet: break the single INSERT into an INSERT followed by multiple UPDATEs. For example:

INSERT INTO table(data) VALUES('........ 4 MB chunk of data ......');
UPDATE table SET data = CONCAT(data, '..... next 4 MB chunk .....') WHERE id = LAST_ID;
---- Repeat the UPDATE until there is no more data to concatenate ----

Since we can assume that most tables have at least one unique field (a field that does not allow duplicate values), we can use it in the WHERE condition.
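A minimal VB.NET sketch of that approach, assuming a table `test` with an AUTO_INCREMENT primary key `id` and a MEDIUMTEXT/LONGTEXT column `data`; the 4 MB chunk size, the method name, and the column names are illustrative only:

Imports System
Imports MySql.Data.MySqlClient

Module ChunkedInsertSketch
    Sub InsertInChunks(connStr As String, value As String)
        Const chunkSize As Integer = 4 * 1024 * 1024 ' 4 MB per statement, well under max_allowed_packet

        Using conn As New MySqlConnection(connStr)
            conn.Open()

            ' The first chunk goes in with a plain INSERT.
            Dim firstChunk As String = value.Substring(0, Math.Min(chunkSize, value.Length))
            Dim lastId As Long = 0
            Using cmd As New MySqlCommand("INSERT INTO test(`data`) VALUES(@chunk)", conn)
                cmd.Parameters.Add("@chunk", MySqlDbType.LongText).Value = firstChunk
                cmd.ExecuteNonQuery()
                lastId = cmd.LastInsertedId ' the unique key used in the WHERE condition
            End Using

            ' Remaining chunks are appended with UPDATE ... CONCAT(...).
            Dim offset As Integer = firstChunk.Length
            While offset < value.Length
                Dim chunk As String = value.Substring(offset, Math.Min(chunkSize, value.Length - offset))
                Using cmd As New MySqlCommand("UPDATE test SET `data` = CONCAT(`data`, @chunk) WHERE id = @id", conn)
                    cmd.Parameters.Add("@chunk", MySqlDbType.LongText).Value = chunk
                    cmd.Parameters.AddWithValue("@id", lastId)
                    cmd.ExecuteNonQuery()
                End Using
                offset += chunk.Length
            End While
        End Using
    End Sub
End Module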
Coordinator
Dec 24, 2013 at 2:02 PM
I get the concept of your idea.
However, I'm still figuring out how to apply it to the Export and Import mechanisms in this project.
Thanks for the idea anyway :)
A new version (V2.0) has been released; you may want to have a look at it.