php - Handling large result set from mysql with limited memory

I have a large database that contains results of an experiment for 1500 individuals. Each individual has 96 data points. I wrote the following script to summarize and then format the data so it can be used by the analysis software. At first all was good until I had more than 500 individuals. Now I am running out of memory.

I was wondering if anyone has a suggestion on how to overcome the memory limit problem without sacrificing speed.

This is how the table looks in the database:

fishId  assayId  allele1  allele2
14_1_1  1        A        T
14_1_1  2        A        A

$mysql = new PDO('mysql:host=localhost; dbname=aquatech_DB', $db_user, $db_pass);
$query = $mysql->prepare("SELECT genotyped.fishid, genotyped.assayid, genotyped.allele1, genotyped.allele2, fishId.sex, " .
"fishId.role FROM `fishId` INNER JOIN genotyped ON genotyped.fishid=fishId.catId WHERE fishId.projectid=:project");
$query->bindParam(':project', $project, PDO::PARAM_INT);
$query->execute();  

So this is the call to the database. It is joining information from two tables to build the file I need.
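(Aside, not part of the original post: pdo_mysql buffers the entire result set on the client by default. If memory stays tight even after the changes discussed below, buffering can be switched off on the connection; this is only a sketch using the standard PDO MySQL attribute.)

// Optional sketch: disable client-side result buffering so rows are streamed
// from the server as they are fetched, instead of being held in memory at once.
// The statement must then be read to the end before another query can run on
// the same connection.
$mysql->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);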

if(!$query){
    $error = $query->errorInfo();
    print_r($error);
} else {
    $data = array();
    $rows = array();
    $role = array();
    $body = "";
    if($results = $query->fetchAll()){
        // Index every row by individual (fishid) and assay id.
        foreach($results as $row)
        {
            $rows[] = $row[0];
            $role[$row[0]] = $row[5];
            $data[$row[0]][$row[1]]['alelleY'] = $row[2];
            $data[$row[0]][$row[1]]['alelleX'] = $row[3];
        }
        $rows = array_unique($rows);
        // Build one output line per individual, concatenating all assays.
        foreach($rows as $ids)
        {
            $col2 = $role[$ids];
            $alelleX = $alelleY = $content = "";
            foreach($snp as $loci)
            {
                $alelleY = convertAllele($data[$ids][$loci]['alelleY']);
                $alelleX = convertAllele($data[$ids][$loci]['alelleX']);
                $content .= "$alelleY$alelleX";
            }
            $body .= "$ids$col2" . substr($content, 0, -1) . "\n";
        }
    }
}

This parses the data. In the file I need there has to be one row per individual rather than 96 rows per individual; that is why the data has to be reformatted. At the end of the script I just write $body to a file.
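(Editor's sketch, not in the original post: since $body is only written to a file at the end, each individual's line could instead be flushed as soon as it is built, so the full output never sits in memory. This assumes the destination file from the snippet further down is opened before the formatting loop.)

// Sketch: open the destination before the per-individual loop ...
$fh = fopen($location, 'w') or die("Could not create destination file");
foreach($rows as $ids)
{
    // ... build $col2 and $content exactly as above, then write immediately:
    fwrite($fh, "$ids$col2" . substr($content, 0, -1) . "\n");
}
fclose($fh);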

I need the output file to be:

FishId  Assay 1  Assay 2
14_1_1  A T      A A

$location = "results/" . $filename . "_result.txt";
$fh = fopen($location, 'w') or die("Could not create destination file");
if(fwrite($fh, $body) === false) {
    die("Could not write to destination file");
}
fclose($fh);


1 Answer


Instead of reading the whole result from your database query into a variable with fetchAll(), fetch it row by row:

while($row = $query->fetch()) { ... }
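Applied to the loop in the question, a minimal sketch (keeping the original numeric column indexes) would look roughly like this:

$data = array();
$role = array();
$rows = array();
// Pull one row at a time instead of materialising the whole result set with fetchAll().
while($row = $query->fetch(PDO::FETCH_NUM)) {
    $rows[] = $row[0];
    $role[$row[0]] = $row[5];
    $data[$row[0]][$row[1]]['alelleY'] = $row[2];
    $data[$row[0]][$row[1]]['alelleX'] = $row[3];
}
$rows = array_unique($rows);
// ... the per-individual formatting loop stays the same ...

Note that $data still grows with the number of individuals; if the query were ordered by fishid, each individual's line could be written out and its entry discarded as soon as its 96 rows have been read, keeping peak memory roughly proportional to a single individual.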
