I need to update a table that contains more than 10k records from a CSV file on the server. The problem is that it shows a "Server Timeout" or an error after a few minutes.
I have added these lines
ini_set('memory_limit', '512M');
ini_set('max_execution_time', '180');
before
$this->db->where('part_no',$insert_csv['part_no']);
$this->db->update('mst_parts', $data4);
I am getting this error:
"This page isn’t working
'xxxxxxxx.com' took too long to respond.
HTTP ERROR 504"
You can read the CSV source in chunks, for example reading only 1000 rows at a time, like:
$startLine = 1;                          // first line of this chunk (1-based)
$file = new SplFileObject('source.csv');
$file->seek($startLine - 1);             // seek() is zero-based
echo $file->current();                   // the line at that position
And on the next run you should increment $startLine accordingly.
Documentation: SplFileObject::seek()
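Building on that, a minimal sketch of one chunked run, assuming a chunk size of 1000 rows (the chunk size and how $startLine is persisted between runs are assumptions, and the per-row update is only a placeholder):
$startLine = 1;    // persist this value between runs (e.g. in a small state file)
$chunkSize = 1000; // assumed chunk size

$file = new SplFileObject('source.csv');
$file->setFlags(SplFileObject::READ_CSV);
$file->seek($startLine - 1); // seek() is zero-based

for ($i = 0; $i < $chunkSize && !$file->eof(); $i++) {
    $row = $file->current(); // array of CSV fields for this line
    // ... run the part_no lookup / update from the question for $row here ...
    $file->next();
}

$startLine += $chunkSize; // store for the next run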
I have an ICS file which is 2 GB in size and I want to parse the ICS data from that file, but PHP is not able to read such a big file and I am getting a fatal "Out of Memory" error even though I have set ini_set('memory_limit', '-1').
So I want to somehow break or split the big ICS file into smaller files, or is there any way to read the data from such a big ICS file?
I have some smaller files and they all work fine; I can extract data from those, but the 2 GB ICS file is the most important one for me to extract/parse.
Thanks in advance
The usual way to handle an out-of-memory error is to allocate more RAM to PHP in php.ini, but with a 2 GB file that is not a realistic option unless you have a lot of memory on your system. Basically, you are reading the file the wrong way. Rather than reading the whole file into a variable, which costs as much RAM as the file is large, you can parse it line by line or byte by byte, depending on the format you are working with. Here is a basic example you can build on:
<?php
$handle = fopen("fileName", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the current line here
    }
    fclose($handle);
} else {
    // handle the file read error
}
?>
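For an ICS file specifically, a rough sketch of the same idea would hold only one VEVENT block in memory at a time (BEGIN:VEVENT/END:VEVENT are the standard iCalendar markers; processEvent() is a hypothetical callback, and iCalendar line folding is ignored for brevity):
<?php
$handle = fopen("calendar.ics", "r");
if ($handle) {
    $event = array();
    $inEvent = false;
    while (($line = fgets($handle)) !== false) {
        $line = rtrim($line, "\r\n");
        if ($line === 'BEGIN:VEVENT') {
            $inEvent = true;
            $event = array();
        } elseif ($line === 'END:VEVENT') {
            $inEvent = false;
            processEvent($event); // hypothetical callback: handle one event, then forget it
        } elseif ($inEvent) {
            $event[] = $line;     // only the current event is ever held in memory
        }
    }
    fclose($handle);
}
?>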
Hope this helps you.
Is there a way in PHP to take some action (a MySQL insert, for example) if there are no new requests for, say, 1 second?
What I am trying to achieve is to determine the beginning and the end of an image sequence sent from an IP camera. The camera sends a series of images on detected movement and stops sending when the movement stops. I know that the camera takes 5 images per second (one every 200 ms). When there are no new images for more than 1 second, I want to flag the last image as the end of the sequence, insert a record into MySQL, place the image in the appropriate folder (where all other images from the same sequence have already been written) and instruct an app to make an MJPEG clip of the images in that folder.
Right now I am able to determine the first image in the sequence using the Alternative PHP Cache (APC) to save the reference time from the previous request, but the problem is that the next image sequence can happen hours later, and I cannot instruct PHP to close the sequence when there are NO requests for some time, only when the first request of the new sequence arrives.
I really need help on this. My PHP is almost as bad as my English... :)
Pseudocode for my problem:
<?php
if (isset($headers["Content-Disposition"]))
{
    $frame_time = microtime(true);
    if (preg_match('/.*filename=[\'\"]([^\'\"]+)/', $headers["Content-Disposition"], $matches))
    { $filename = $matches[1]; }
    else if (preg_match("/.*filename=([^ ]+)/", $headers["Content-Disposition"], $matches))
    { $filename = $matches[1]; }
}
preg_match("/(anpr[1-9])/", $filename, $anprs);
$anpr = $anprs[1];
$apc_key = $anpr."_last_time"; // there are several cameras, so I have to distinguish them
$last = apc_fetch($apc_key);
if (($frame_time - $last) > 1)
{
    $stream = "START"; // new sequence starts
} else {
    $stream = "-->";   // streaming
}
$file = fopen('php://input', 'r');
$temp = fopen("test/".$filename, 'w');
$imageSize = stream_copy_to_stream($file, $temp); // save the image file on the file system
fclose($temp);
fclose($file);
apc_store($apc_key, $frame_time); // replace the cached time with this frame's time in APC
// here goes the mysql stuff...
/* Now... if there are no new requests for 1 second, $stream = "END";
   call an app with exec() or similar to grab all images in the particular folder and make
   an MJPEG... if a new request arrives, cancel the timer or whatever and execute the script again. */
?>
Could you make each request usleep for 1.5 seconds before exiting and, as a last step, check whether the sequence timestamp was updated? If yes, exit and do nothing. If no, save the sequence to MySQL. (This will require mutexes, since each HTTP request will be checking and trying to save the sequence, but only one must be allowed to.)
This approach would merge the sub-file/script into the PHP code (single codebase, easier to maintain), but it can possibly balloon memory use (each request will stay in memory for 1.5 seconds, which is a long time on a busy server).
Another approach is to make the sub-file/script a loopback request to the localhost HTTP server, with presumably a much smaller memory footprint. Each frame would fire off a request to finalize the sequence (similarly again, with mutexes).
Or maybe create a separate service call that checks and saves all sequences, and have a cron job ping it every few seconds. Or have each frame ping it; if a second request can detect that the service is already running, it can exit. (Share the state in the APC cache.)
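A rough sketch of the first approach, assuming the same per-camera APC key as in the question and using apc_add() as a crude mutex (the "_closing" key is purely illustrative):
<?php
// ... store the frame and apc_store($apc_key, $frame_time) exactly as in the question ...

usleep(1500000); // wait 1.5 seconds (usleep() takes microseconds)

if (apc_fetch($apc_key) == $frame_time) {
    // no newer frame arrived while we slept, so this was the last frame of the sequence
    if (apc_add($apc_key."_closing", 1, 5)) { // apc_add() succeeds for only one process
        // insert the "sequence ended" record into MySQL and trigger the MJPEG build here
        apc_delete($apc_key."_closing");
    }
}
?>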
Edit: I think I just suggested what bytesized said above.
What if you just keep the script running for 1 second after it stores the frame, to check for newly added frames? I imagine you may want to close the connection before that 1 second expires, but tomlgold2003 and arr1 have the answer for you: http://php.net/manual/en/features.connection-handling.php#93441
I think this would work for you:
<?php
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true); // optional
ob_start();
// Do your stuff to store the frame
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean();
// The connection should now be closed
sleep(1);
// Check to see if more frames have been added.
?>
If your server is expected to see a high load, this may not be the answer for you, since at 5 frames per second there will be 5 scripts every second each checking whether it submitted the last frame.
Store each request from the camera, with all its data and a timestamp, in a file (in PHP serialized form). In a cron job, run (every 10 seconds or so) a script that reads that file and finds requests that are followed by a gap of more than one second before the next request. Save the data from such requests and delete all the others.
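A minimal sketch of that cron script, assuming a hypothetical requests.log file where every line is one serialized array containing a 'time' key:
<?php
$lines = file('requests.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$requests = array_map('unserialize', $lines);
$keep = array();

foreach ($requests as $i => $request) {
    // time of the following request, or "now" for the last one in the log
    $next = isset($requests[$i + 1]) ? $requests[$i + 1]['time'] : microtime(true);
    if ($next - $request['time'] > 1) {
        // gap of more than one second: this request ended a sequence
        // insert the record into MySQL / trigger the MJPEG build here
    } elseif (!isset($requests[$i + 1])) {
        $keep[] = $lines[$i]; // the sequence may still be running, check again next run
    }
}

file_put_contents('requests.log', implode("\n", $keep) . ($keep ? "\n" : ''));
?>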
Please help me with a problem exporting a large amount of data to Excel (xlsx) format.
I am exporting almost 45,000 records at a time to a single file and it times out without saving the file.
The MySQL SELECT query takes 21 seconds to execute for that much data. Below is my code to export the data into an Excel file using the PHPExcel library.
$rowscounter = 0;
$sql2 = "SELECT * FROM Surveys";
$result2 = mysql_query($sql2);
while ($row2 = mysql_fetch_assoc($result2))
{
    $j = $rowscounter + 2; // start writing at row 2 (row 1 holds the headers)
    $sheet->setCellValue("A$j", $row2['brandname']);
    $sheet->setCellValue("B$j", $row2['productname']);
    $sheet->setCellValue("C$j", $row2['sname']);
    $sheet->setCellValue("D$j", $row2['smobile']);
    $sheet->setCellValue("E$j", $row2['semail']);
    $sheet->setCellValue("F$j", $row2['country']);
    $sheet->setCellValue("G$j", $row2['city']);
    $sheet->setCellValue("H$j", $row2['sdob']);
    $sheet->setCellValue("I$j", $row2['age']);
    $sheet->setCellValue("J$j", $row2['comment']);
    $sheet->setCellValue("K$j", $row2['outletcode']);
    $sheet->setCellValue("L$j", $row2['username']);
    $sheet->setCellValue("M$j", $row2['datetime']);
    $sheet->setCellValue("N$j", $row2['duration']);
    $rowscounter++;
}
// Rename worksheet
$sheet->setTitle('Survey-Report');
$objPHPExcel->setActiveSheetIndex(0);
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
unlink("Survey-Report.xlsx");
$objWriter->save('Survey-Report.xlsx');
echo "ok";
UPDATE:
I forgot to mention that I have already tried set_time_limit etc. and wrote the code below in my PHP file.
set_time_limit(0);
ini_set('memory_limit','2500M');
You could add this at the top of your script:
set_time_limit (0);
That will disable the default 30-second PHP timeout.
Or you could add a custom number of seconds; see set_time_limit().
I had the same issue where I kept getting a 504 Gateway Timeout when exporting records using PHPExcel. I also tried set_time_limit(0) with no success. What I ended up doing was changing the Apache timeout.
You can find this timeout (the Timeout directive) in your httpd.conf file.
Hope it helps!
I have a page with a large MySQL query; the results can be more than 2 MB. Because I don't have enough memory on my server (I need to reserve the memory for MySQL to speed up query time), I have given up on memcached and use a txt file instead.
How do I lock a file in this situation? If the file is locked, only read; if the file isn't locked, first write then read. Thanks.
if ((time() - filemtime($filename)) > 60) {
    // mysql_query; query time is less than 1.5 seconds
    if (ex_lock) { // if the file is locked, only read
        file_get_contents($filename);
    } else {       // if the file isn't locked, first write then read
        file_put_contents($filename, $query_contents); // some files could be 2 MB
        file_get_contents($filename);
    }
}
Use flock(), but if you're creating result sets with 2MB of data I'd suggest you cache them. Does the result set really need to be updated every other second?
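A minimal sketch of how flock() could fit the read/write pattern from the question (the cache file name and the 60-second age check mirror the pseudocode above; the query itself is only a placeholder):
<?php
$filename = 'query_cache.txt'; // assumed cache file name

$fp = fopen($filename, 'c+');  // open for read/write, create the file if missing

if ((time() - @filemtime($filename)) > 60 && flock($fp, LOCK_EX | LOCK_NB)) {
    // we got the exclusive lock: refresh the cache, then use the fresh contents
    $query_contents = '...';   // placeholder for the MySQL query result
    ftruncate($fp, 0);
    fwrite($fp, $query_contents);
    fflush($fp);
    flock($fp, LOCK_UN);
    $contents = $query_contents;
} else {
    // someone else is writing (or the cache is fresh): take a shared lock and just read
    flock($fp, LOCK_SH);
    $contents = stream_get_contents($fp);
    flock($fp, LOCK_UN);
}

fclose($fp);
?>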
I am using PHPExcel for Excel generation.
for (...)
{
    $objPHPExcel->getActiveSheet()->setCellValueByColumnAndRow($col, $ex_row, $value);
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
I have a huge data result of more than 60,000 records with 60 columns.
I think PHPExcel sets the values, keeps everything in an object array, and only writes to the file at the end. As PHP is not good with huge arrays and the data is large, I am getting a request timeout error.
To avoid that, I am planning to write row by row. Is it possible to write row by row to the Excel file and save it at the end?
If the speed of the creation isn't the real problem, and the issue is just that PHP is giving you a timeout error, you could always place this at the top of your script:
set_time_limit(0);
The 0 will allow the script to run and run and run...