ELMIT
September 29th, 2020, 03:07 AM
I have a large database (data.idb is 24 GB).
On this machine, I still use 18.04
I want to prepare to set up a new machine with 20.04.
At the same time I want to upgrade to MySQL 8.0.
The system is constantly collecting data. If I use mysqldump, it would take some time, and I would miss the data collected in between by the time I start the database on the new machine.
What is the best way to move the data? I am thinking of something like this: use mysqldump for the first xxxx records and then delete them, to keep the database small on the day of the move. Then I start collecting on the new machine and import the data I had exported and deleted. Finally I would merge the data collected in the meantime on the old machine into the new one, discarding duplicates on insert. Is that an approach I should look at more closely, or is there a better way?
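The "discard duplicates when inserting" step maps to MySQL's `INSERT IGNORE` (or `INSERT ... ON DUPLICATE KEY UPDATE`), which relies on the table having a primary or unique key covering the overlap. A minimal sketch of the idea, using SQLite's equivalent `INSERT OR IGNORE` as a stand-in so it runs anywhere; the table and column names here are made up for illustration:

```python
import sqlite3

# In-memory stand-in for the real table. Assumes the collected data has a
# natural unique key (here: a timestamp column used as the primary key).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT PRIMARY KEY, value REAL)")

# Rows collected on the old machine, and rows collected on the new machine
# after the cutover -- the two sets overlap at 10:01.
old_machine = [("2020-09-28 10:00", 1.0), ("2020-09-28 10:01", 2.0)]
new_machine = [("2020-09-28 10:01", 2.0), ("2020-09-28 10:02", 3.0)]

conn.executemany("INSERT OR IGNORE INTO readings VALUES (?, ?)", old_machine)
# Duplicate keys from the overlap window are silently discarded on insert:
conn.executemany("INSERT OR IGNORE INTO readings VALUES (?, ?)", new_machine)

rows = conn.execute("SELECT ts, value FROM readings ORDER BY ts").fetchall()
print(rows)  # three rows: the 10:01 duplicate was inserted only once
```

In MySQL the merge dump can even be generated ready to load this way: mysqldump's `--insert-ignore` option emits `INSERT IGNORE` statements, and `--single-transaction` gives a consistent snapshot of InnoDB tables without locking out the collector while the dump runs.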