John Davidson

Parsing JSON files with 600k to 2 million lines in PHP and inserting them into a MySQL DB takes a lot of time



The JSON file looks sort of like this:

```
{"id":3924,"name":"xx","xxx":3.657}
{"id":6124,"name":"xx","xxx":0.841}
...
```


There are 600k lines, each containing one JSON object.


There is also a second file that contains 2 million of those lines; if my calculations are right, it would take me the whole day to parse it.


The code is below. I'm kinda new to this; is there a faster way to parse this than 400k rows per 6 hours?


```php
<?php

$dbhost = 'xx';
$dbuser = 'xx';
$dbpass = 'xx';
$DBname = 'xx';
$DB_List_Name = 'xx';
$JSONFileLocation = fopen(__DIR__ . '\json\thefile.json', 'r');
$conn = new mysqli($dbhost, $dbuser, $dbpass, $DBname);

ini_set('max_execution_time', '40000'); // i think its pretty slow for 600000 rows

if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

$file = $JSONFileLocation;
while (!feof($file)) {
    $line = fgets($file);
    $obj = json_decode($line);

    $nameElem = addslashes($obj->name);

    $sql = "INSERT INTO $DB_List_Name (
        id,
        name,
        xxx
    )
    VALUES (
        '$obj->id',
        '$nameElem',
        '$obj->xxx'
    )";

    // that was for testing on low numbers
    if ($conn->query($sql) === TRUE) {
        //printf("Table $DB_List_Name successfully updated.<br />");
        //echo '<br>';
    } else {
        //echo $obj->id.' - ';
        //printf("Error updating the table : %s<br />", $conn->error);
        //echo '<br>';
    }
}
echo 'End of the file: Everything is now in the database';
?>
```


My Answer:
To parse large JSON files efficiently in PHP and insert the data into a MySQL database, you can follow these steps:

1. Read the file line by line with `fgets()` and decode each line with `json_decode()`, rather than loading the entire file into memory at once. This keeps memory usage flat no matter how large the file is.

2. Open a connection to the MySQL database using PDO or MySQLi.

3. Use prepared statements to insert the data. The query is parsed once and reused for every row, which is faster than building a fresh SQL string per insert, and it removes the injection risk that `addslashes()` in the original code does not reliably prevent.

Here is an example code snippet to demonstrate how to parse a large JSON file and insert the data into a MySQL database:

```php
<?php

// Open the JSON file for reading
$filename = 'large_json_file.json';
$file = fopen($filename, 'r');

// Open a connection to the MySQL database and make PDO throw on errors
$pdo = new PDO('mysql:host=localhost;dbname=database_name', 'username', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepare the insert statement once; it is reused for every row
$stmt = $pdo->prepare("INSERT INTO table_name (column1, column2) VALUES (:value1, :value2)");

// Loop through each line of the JSON file
while (($line = fgets($file)) !== false) {
    // Decode one JSON object per line
    $data = json_decode($line, true);

    // Insert the decoded values through the prepared statement
    $stmt->execute([
        ':value1' => $data['key1'],
        ':value2' => $data['key2'],
    ]);
}

// Close the file and release the database connection
fclose($file);
$pdo = null;
?>
```


This code reads the JSON file line by line, decodes each line, and inserts the data through a single prepared statement, so memory use stays constant and the SQL is not re-parsed for every row.
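Prepared statements alone may not fix the multi-hour runtime, though. MySQL runs with autocommit on by default, so each `execute()` is committed and flushed to disk as its own transaction, and that per-row commit is usually the real bottleneck. Wrapping the inserts in larger transactions and committing every few thousand rows typically speeds up bulk loads dramatically. Here is a minimal sketch of that idea; it reuses the placeholder file, table, and column names from the example above, and the batch size of 5000 is just a starting point to tune:

```php
<?php

$file = fopen('large_json_file.json', 'r');

$pdo = new PDO('mysql:host=localhost;dbname=database_name', 'username', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare("INSERT INTO table_name (column1, column2) VALUES (:value1, :value2)");

$batchSize = 5000; // rows per commit; tune for your hardware
$count = 0;

$pdo->beginTransaction();
while (($line = fgets($file)) !== false) {
    $data = json_decode($line, true);
    if ($data === null) {
        continue; // skip blank or malformed lines
    }

    $stmt->execute([
        ':value1' => $data['key1'],
        ':value2' => $data['key2'],
    ]);

    // Commit every $batchSize rows so MySQL is not flushing to disk per insert
    if (++$count % $batchSize === 0) {
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit(); // commit the final partial batch

fclose($file);
$pdo = null;
?>
```

If batched commits are still not fast enough, the usual next steps are building multi-row `INSERT ... VALUES (...), (...)` statements to cut down round trips, or bypassing the PHP loop entirely with MySQL's `LOAD DATA INFILE`.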
