Overview
Loading CSV data is often done in conjunction with the Blackhole engine, as we will see later. For now, let's simply load some CSV data into our database. First, we have to prepare the CSV data, so let's create a file with the following content:
Create CSV file
[root@mysqlslave newDB]# cat testSource.csv
"1","record one"
"2","record two"
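If you prefer to build the file directly from the shell, a minimal sketch (assuming the /mysqldata/newDB directory already exists and is writable) could look like this:

[root@mysqlslave newDB]# cat > testSource.csv << 'EOF'
"1","record one"
"2","record two"
EOF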
Create DB Table
Once the CSV file is prepared, create the table that will receive the data. If your file has a header line, feel free to delete it or leave it there; in either case we can decide what to do with the first line during the load (see the variant shown after the load output below).
mysql> create table testData(i INT NOT NULL, c CHAR(10) NOT NULL) ENGINE = CSV;
Query OK, 0 rows affected (0.00 sec)
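Because the table uses the CSV storage engine, MySQL keeps its rows in a plain-text file named testData.CSV inside the database directory (assumed here to be /mysqldata/newDB, matching the prompt above), so you can inspect the loaded data straight from the shell. A minimal sketch:

[root@mysqlslave newDB]# ls testData.*       # table definition plus the plain-text data file testData.CSV
[root@mysqlslave newDB]# cat testData.CSV    # after the load below, this prints the two records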
Load Data
Once the table is created, we can safely load the data as follows:
mysql> LOAD DATA LOCAL INFILE '/mysqldata/newDB/testSource.csv' INTO TABLE testData FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
Query OK, 2 rows affected (0.02 sec)
Records: 2  Deleted: 0  Skipped: 0  Warnings: 0

mysql> select * from testData;
+---+------------+
| i | c          |
+---+------------+
| 1 | record one |
| 2 | record two |
+---+------------+
2 rows in set (0.00 sec)
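As mentioned above, if testSource.csv had kept a header line, the load can simply skip it. A hedged variant of the same statement (assuming a one-line header) would be:

mysql> LOAD DATA LOCAL INFILE '/mysqldata/newDB/testSource.csv' INTO TABLE testData FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES;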
After this, we can create a table using the Blackhole engine (for example with CREATE TABLE ... ENGINE = BLACKHOLE AS SELECT * FROM testData) and use it to replicate a large CSV file to a production server, or vice versa.
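A minimal sketch of that setup, assuming statement-based binary logging and a hypothetical table name testData_bh (rows inserted into a Blackhole table are discarded locally, but the statements are still written to the binary log, so a replica that defines the same table with a real engine ends up with the data):

-- On the server where the CSV was loaded: the Blackhole copy stores nothing
-- locally, but the INSERT is recorded in the binary log.
mysql> CREATE TABLE testData_bh (i INT NOT NULL, c CHAR(10) NOT NULL) ENGINE = BLACKHOLE;
mysql> INSERT INTO testData_bh SELECT * FROM testData;

-- On the production replica, testData_bh would be defined beforehand with a
-- real engine (e.g. InnoDB), so the replicated INSERT actually stores the rows.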