PHP having issues saving "large" files

I've got a program that takes 3 arrays (all the same length, around 500 items or so) and writes them to a text file.
However, I'm running into an issue when writing larger files. The arrays are coordinates and timestamps from a canvas drawing app, so I can control the length. I've found that once files get larger than 2 MB, the file doesn't save; the largest file I've managed to save is 2.18 MB. From a related question, PHP: Having trouble uploading large files, I've determined that the cause is most likely that this is hosted on a free hosting server. I've looked at phpinfo() and here are the 4 relevant values:
memory_limit 16M
max_execution_time 30
upload_max_filesize 5M
post_max_size 5M
Here is the relevant writing code:
// Retrieve data from the JS
$x_s      = $_GET['x_coords'];
$y_s      = $_GET['y_coords'];
$new_line = $_GET['new_lines'];
$times    = $_GET['time_stamps'];
print_r($_GET);

$randInt = rand(1, 1000);

// Open the output file
$file_name    = "test_logs/data_test_" . $randInt . ".txt";
$file_handler = fopen($file_name, 'w') or die("Couldn't connect");

// Loop over the points and write them out
for ($i = 0; $i < count($x_s); $i++) {
    // Start of a new line: close the previous one (unless this is the first point)
    if (!$new_line[$i]) {
        if ($i != 0) {
            fwrite($file_handler, "LINE_END\n");
        }
        fwrite($file_handler, "LINE_START\n");
    }
    // Write the x coord, y coord, timestamp
    fwrite($file_handler, $x_s[$i] . ", " . $y_s[$i] . ", " . $times[$i] . "\n");
    // After the last point, write the final LINE_END
    if ($i == (count($x_s) - 1)) {
        fwrite($file_handler, "LINE_END\n");
    }
}
fclose($file_handler);
I've set up a PHP server on my localhost and have access to the error log. This is what I am getting:
[Fri Mar 23 20:03:02 2012] [error] [client ::1] request failed: URI too long (longer than 8190)
PROBLEM RESOLVED: The issue was that I was using GET to send large amounts of data, which was appended to the URI. Once the URI exceeded 8190 characters, the request failed. Using POST solves this.
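For reference, the fix amounts to moving the payload out of the query string and into the request body, then reading it with $_POST instead of $_GET. A minimal sketch of the server side is below; the parameter names are the ones from the question, and the client is assumed to send them as POST fields (e.g. an XMLHttpRequest/fetch POST with form data):
<?php
// Read the drawing data from the POST body instead of the URI.
if (!isset($_POST['x_coords'], $_POST['y_coords'], $_POST['new_lines'], $_POST['time_stamps'])) {
    die("Missing drawing data in POST body");
}
$x_s      = $_POST['x_coords'];
$y_s      = $_POST['y_coords'];
$new_line = $_POST['new_lines'];
$times    = $_POST['time_stamps'];
// The writing loop stays exactly the same as in the GET version.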

upload_max_filesize and post_max_size determine the maximum size of data that can be posted. But this is probably not your problem, since some of the data is written (if you hit the data limit, the script does not execute at all).
Your script has two restrictions: max_execution_time and memory_limit. Have a look at your Apache error log file to see if you are getting an error message saying which limit is reached.
You can also try logging inside the for loop to see the progression of time and memory usage:
if (($i % 100) == 0) { // log every 100 entries
    error_log(date("H:i:s ") . memory_get_usage(true) . " bytes used\n", 3, 'test.log');
}
It may also be that the Suhosin patch is preventing you from sending too many data points:
http://www.adityamooley.net/blogs/2012/01/09/php-suhosin-and-post-data/
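If Suhosin is in play, the directives worth checking are the request/POST limits; the names below come from the Suhosin documentation and the values are only the usual defaults, so compare them against what phpinfo() reports for your build:
; Illustrative defaults - verify against phpinfo()
suhosin.get.max_value_length = 512   ; maximum length of a single GET value
suhosin.post.max_vars = 1000         ; maximum number of POST variables
suhosin.request.max_vars = 1000      ; maximum number of request variables overall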

Maybe the script exceeds the max execution time.
Add this:
set_time_limit(0);
at the beginning of your code.

1) Check max_input_time:
ini_set('max_input_time', 50);
2) Check in phpinfo() - do you have the Suhosin patch?
You should look at the Apache error_log - it should tell you which limit is being reached.
Try
ini_set('error_reporting', E_ALL);
error_reporting(E_ALL);
ini_set('log_errors', true);
ini_set('html_errors', false);
ini_set('error_log', dirname(__FILE__) . '/script_error.log');
ini_set('display_errors', true);

PHP (and hence the web server) is protecting itself. Perhaps use a different mechanism to upload large files - I would imagine they come from known (and trusted) sources. Use a different mechanism, for example SFTP.
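If SFTP is an option, one way to push a file from PHP is the ssh2 extension; this is only a sketch, the host, credentials and paths are placeholders, and the extension has to be installed (pecl install ssh2):
<?php
// Upload a local file over SFTP using the ssh2 extension. All values are placeholders.
$connection = ssh2_connect('example.com', 22);
if (!$connection || !ssh2_auth_password($connection, 'username', 'password')) {
    die("SSH connection/authentication failed");
}
$sftp = ssh2_sftp($connection);
// Stream-copy the local file to the remote path via the ssh2.sftp:// wrapper
$remote = fopen("ssh2.sftp://" . intval($sftp) . "/remote/path/data.txt", 'w');
$local  = fopen('/local/path/data.txt', 'r');
stream_copy_to_stream($local, $remote);
fclose($local);
fclose($remote);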

Related

Is it possible to force PHP code to continue running if an uploaded file exceeds a php.ini directive such as upload_max_filesize?

I have a single PHP page, with the PHP (v7.4.7) script at the top of the page. The page functions as a file uploader, allows up to 10 files per upload, and the max file size is set in the script.
The site works perfectly, generating a table of the result of each file for the user.
That is, unless someone uploads a file larger than the upload_max_filesize directive in the php.ini file, at which point the script stops dead and therefore cannot return the necessary results to the user. No results are returned, the results table is empty, and the user might wrongly think all went well.
I have tried adding try/catch blocks, but the script still fails to complete. Is this by design, or is there a way to coerce the script to run to completion if this directive is exceeded? The HTML/PHP code is all pretty standard, as demonstrated by various tutorials etc.
Many thanks.
Violating upload_max_filesize doesn't abort the script execution. It'll just cause the error key in $_FILES to become UPLOAD_ERR_INI_SIZE.
Most likely, you're hitting an entirely different limit. For example, Apache has LimitRequestBody, and PHP itself has post_max_size or even max_file_uploads. In general, those directives don't abort the script either, but simply discard the excess data.
My advice is to locate all the directives that may affect file uploads, set them one by one to an artificially low limit, and verify what effect they have on your data. A check I typically do is to verify whether $_SERVER['CONTENT_LENGTH'] is greater than zero for a request where $_SERVER['REQUEST_METHOD'] is POST but $_POST is empty. Aside from that, checking error in $_FILES should have you fairly covered.
; Allow enough time for the upload to complete
max_input_time = 1200
; Max file size
upload_max_filesize = 50M
; Greater than upload_max_filesize to make slightly oversized files easier to diagnose
post_max_size = 100M
if (
    empty($_FILES) &&
    empty($_POST) &&
    isset($_SERVER['REQUEST_METHOD']) &&
    strtolower($_SERVER['REQUEST_METHOD']) == 'post'
) {
    // Error: post body was too large
    // Bytes sent (and discarded): filter_input(INPUT_SERVER, 'CONTENT_LENGTH')
} elseif (
    isset($_FILES['foo']) &&
    $_FILES['foo']['error'] != UPLOAD_ERR_NO_FILE
) {
    // Upload
    if ($_FILES['foo']['error'] === UPLOAD_ERR_OK) {
        // Successful upload
    } else {
        // Failed upload
    }
} else {
    // No upload
}

PHP filesize() showing old filesize with a file inside a windows shared (network) folder

I have the following script that runs to read new content from a file:
<?php
clearstatcache();
$fileURL = "\\\\saturn\extern\seq_ws.csv";
$fileAvailable = file_exists($fileURL);
$bytesRead = file_get_contents("bytes.txt");

if ($fileAvailable) {
    $fileSize = filesize($fileURL);
    // Statuses: 1 = partial read, 2 = complete read, 0 = no read, -1 = file not found. Followed by !!
    if ($bytesRead < $fileSize) {
        // $bytesRead till $fileSize bytes read from file.
        $content = file_get_contents($fileURL, NULL, NULL, $bytesRead);
        file_put_contents("bytes.txt", ((int)$bytesRead + strlen($content)));
        echo "1!!$content";
    } else if ($bytesRead > $fileSize) {
        // File edit or delete detected, whole file read again.
        $content = file_get_contents($fileURL);
        file_put_contents("bytes.txt", strlen($content));
        echo "2!!$content";
    } else if ($bytesRead == $fileSize) {
        // No new data found, no action taken.
        echo "0!!";
    }
} else {
    // File delete detected, reading whole file when available.
    echo "-1!!";
    file_put_contents("bytes.txt", "0");
}
?>
It works perfectly when I run it and does what is expected.
When I edit the file from the same PC my server runs on, it works instantly and returns the correct values.
However, when I edit the file from another PC, my script takes about 4-6 seconds to read the correct filesize of the file.
I added clearstatcache(); at the top of my script because I think it's a caching issue. But the strange thing is that when I change the file from the server PC it responds instantly, but from another PC it doesn't.
On top of that, as soon as the other PC changes the file, I see the file change in Windows, with the new filesize and content, but for some reason it takes Apache about 4-6 seconds to detect the change. During those 4-6 seconds it receives the old filesize from before the change.
So I have the following questions:
Is the filesize information cached anywhere, either on the Apache server or inside Windows?
If question 1 applies, is there any way to remove or disable this caching?
Is it possible this isn't a caching problem?
I think that PHP on your local PC has development settings.
So I suggest checking php.ini for this parameter: realpath_cache_ttl
Which is:
realpath_cache_ttl integer
Duration of time (in seconds) for which to cache realpath information for a given file or directory. For systems with rarely changing files, consider increasing the value.
To test it, run phpinfo() both locally and on the server to check that value:
<?php phpinfo();
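If the realpath/stat cache does turn out to be the culprit, one thing worth trying in the polling script is clearing the cache for that specific path right before the size check; a minimal sketch, using the UNC path from the question (note that realpath_cache_ttl itself can only be changed in php.ini, not at runtime):
<?php
// Clear both the stat cache and the realpath cache for this one file
// before asking for its size again.
$fileURL = '\\\\saturn\extern\seq_ws.csv'; // single quotes so backslashes are not treated as escapes
clearstatcache(true, $fileURL);
$fileSize = filesize($fileURL);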

How to put large records in a CSV file

I am creating a CSV file, but while there are 9700 records in the database, only around 6800 rows end up in the CSV file. Please advise how I can do this. I have heard about saving the data in a temp file and then putting it in the CSV, but I am not sure how to do that.
$valArr = $fieldArr['field_value'];
if (!empty($arrvalH_VH)) {
    $valArr = array_merge($fieldArr['field_value'], $arrvalH_VH);
}
if (!empty($field_value_arr)) {
    $valArr = array_merge($valArr, $field_value_arr);
}
fputcsv($file, $valArr);
Thanks
With long processes like data import/export, there is a good probability that PHP's maximum execution timeout is terminating the script before completion. The execution timeout can be increased with the following setting in php.ini:
max_execution_time = 360 ; timeout value is in seconds
Alternatively, the same setting can be changed from PHP code as below:
ini_set('max_execution_time', '360');
NOTE: If restricted by server admin, you will not be able to set max_execution_time. This is the case with most shared hosting providers.
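The question also mentions buffering the data in a temporary file before producing the final CSV. A minimal sketch of that idea using a php://temp stream is below; $rows and the output filename are placeholders, not the asker's actual data layout:
<?php
// Write all rows to a temp stream first, then copy it to the final CSV in one go.
$tmp = fopen('php://temp', 'r+');
foreach ($rows as $valArr) {
    fputcsv($tmp, $valArr);
}
rewind($tmp);
$out = fopen('export.csv', 'w');
stream_copy_to_stream($tmp, $out);
fclose($out);
fclose($tmp);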

Can I set max execution time in php.ini to be 30,000 in my case?

The scenario is that I want to save 4046 images to a folder (coded in PHP). I guess it would take a maximum of 5 hours. Initially the max execution time in php.ini was set to 30 seconds. After 650 images got saved, the browser froze, and none of the images got saved, but the process was still running, with no error either. Can anybody give me an idea of the max execution time I should set in this case?
P.S. If my approach is wrong, do guide me.
Thanks
I'm not sure if your problem isn't caused simply by the wrong tool - PHP isn't meant for such long-running tasks.
If those images are on some server, better use an FTP client.
If you have a list of files saved in a text file, use cURL to download them (see the sketch below).
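A minimal sketch of the cURL approach; the urls.txt filename and the images/ target directory are placeholders:
<?php
// Download every URL listed (one per line) in urls.txt into ./images/.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($urls as $url) {
    $dest = 'images/' . basename(parse_url($url, PHP_URL_PATH));
    $fp = fopen($dest, 'w');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body straight to the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}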
I'd highly suggest modifying your script to do the job incrementally. So basically break the job up into smaller parts and provide a break in between. The basic logic flow would be like this.
<?php
$start = $_GET['start'];  // where to start the job at
$end   = $start + 250;    // queue this and the next 250 positions

for ($i = $start; $i <= $end; $i++) {
    // do the operations needed for position $i
}

// redirect to the same script, moving on to the next batch
header("Location: /urlToScript?start=" . ($end + 1));
exit;
?>
This will do small parts of the total job and avoid any issues with the interpreter. Add any INI modifications to increase memory usage and time as needed and you'll be fine.
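The INI tweaks mentioned above could look something like this at the top of the batch script; the values are purely illustrative and subject to whatever the host allows:
<?php
// Illustrative per-batch limits; shared hosts may override or ignore these.
ini_set('memory_limit', '256M'); // room for the images handled in this batch
set_time_limit(120);             // seconds allowed for one batch of 250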
You can extend the time using this line in the script that saves the images:
ini_set('max_execution_time', 30000);
A second approach is to use .htaccess:
php_value max_execution_time 30000

Get MD5 Checksum for Very Large Files

I've written a script that reads through all files in a directory and returns the md5 hash for each file. However, it renders nothing for a rather large file. I assume that the interpreter has some value set for maximum processing time, and since it takes too long to get this value, it just skips along to other files. Is there any way to get an md5 checksum for large files through PHP? If not, could it be done through a cron job with cPanel? I gave it a shot there, but it doesn't seem that my md5sum command was ever processed: I never get an email with the hash. Here's the PHP I've already written. It's very simple code and works fine for files of a reasonable size:
function md5_dir($dir) {
    if (is_dir($dir)) {
        if ($dh = opendir($dir)) {
            while (($file = readdir($dh)) !== false) {
                echo nl2br($file . "\n" . md5_file($file) . "\n\n");
            }
            closedir($dh);
        }
    }
}
Make sure to use escapeshellarg ( http://us3.php.net/manual/en/function.escapeshellarg.php ) if you decide to use a shell_exec() or system() call. I.e.,
shell_exec('md5sum -b ' . escapeshellarg($filename));
While I couldn't reproduce it with PHP 5.2 or 5.3 on a 2 GB file, the issue seems to come up on 32-bit PHP builds.
Even though it's not a really nice solution, you could try letting the system do the hashing:
echo system("md5sum test.txt");
46d6a7bcbcf7ae0501da341cb3bae27c test.txt
If you're hitting a memory or execution time limit, PHP should be throwing an error message to that effect. Check your error logs. If you are hitting a limit, you can raise the maximum values for PHP memory usage and execution time in your php.ini file:
memory_limit = 16M
will set max memory usage to 16 megs. For maximum execution time:
max_execution_time = 30
will set maximum execution time to 30 seconds.
You could achieve it with the command line:
shell_exec('md5sum -b '. $fileName);
FYI, in case someone needs a fast md5() checksum: PHP is pretty fast even with larger files. This returns the checksum of a Linux Mint .iso (880 MB) in 3 seconds.
<?php
// checksum
$path = $_SERVER['DOCUMENT_ROOT']; // document root path
$file = $path."/somefolder/linux-mint.iso"; // any file
echo md5_file($file);
?>
