PHP ZipArchive Timeout

I'm currently trying to find a workaround to prevent a PHP script from timing out while creating a .zip with the ZipArchive class.
I've managed to do this without a problem by overriding the maximum script execution time set in php.ini using set_time_limit(). However, it's not guaranteed that safe mode will be turned off on the servers the script will be running on, and I don't have access to the php.ini file. Is there any other way to stop the script timing out? Can the ZipArchive work be run as a background task?

I don't think the timeout is the real problem; that can be solved with ini_set('max_execution_time', ...).
The memory limit is the bigger problem.
I'm not hitting a timeout when zipping a 5.8 GB directory, but I am hitting the memory limit.
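If memory rather than time is the bottleneck, one hedged workaround (assuming the archive is built from files already on disk, and that the host allows raising memory_limit at runtime) is to add files by path with addFile() instead of loading their contents with addFromString():
ini_set('memory_limit', '512M');                  // may be blocked on shared hosts
$zip = new ZipArchive();
if ($zip->open('/tmp/backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create archive');
}
foreach (glob('/path/to/data/*') as $file) {      // hypothetical source directory
    if (is_file($file)) {
        $zip->addFile($file, basename($file));    // references the file on disk, no addFromString()
    }
}
$zip->close();                                    // the compression work actually happens here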

Try
ini_set('max_execution_time', 60); //60 seconds = 1 minute
ini_set('max_execution_time', 0); //0=NOLIMIT
But there may be restrictions in place on a shared host, so if these functions don't work as they should, ask the server administrator.
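If set_time_limit() is allowed, a related pattern (a sketch only, and it will not help when safe mode or a host-level process killer is in play) is to reset the limit inside the loop, since each call restarts the timeout counter:
$zip = new ZipArchive();
$zip->open('/tmp/archive.zip', ZipArchive::CREATE);   // placeholder path
foreach ($files as $file) {                           // $files assumed to hold the paths to add
    set_time_limit(30);                               // restart the 30-second clock for every file
    $zip->addFile($file, basename($file));
}
$zip->close();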

I'm trying to solve this as well; set_time_limit is no good on shared hosting when safe mode is on or the function is disabled on the server.
I'm using Ajax calls from the client side to have PHP do the archiving in steps, but that opened up a few new issues of its own: script timeouts and the need for some kind of checkpointing so a "continue" operation can resume the work.
So my recommendation is an Ajax/client-side implementation that hits PHP to do the work, knowing the script may not finish in one call but could take N calls. Even though a single operation could take long enough to time the script out, you can still add logic to check the elapsed time and save state before a timeout, and handle both the timeout and a manual kick-out on the client. A rough sketch of the server side follows.
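A sketch of what that server side could look like, under the assumption that the file list is prepared up front and a small batch is added per request (the file names, batch size, and JSON checkpoint are all illustrative):
// zip_step.php - hypothetical endpoint called repeatedly from the client.
$files  = json_decode(file_get_contents('files.json'), true);   // full list, built beforehand
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batch  = 20;                                                    // files per request

$zip   = new ZipArchive();
$flags = $offset === 0 ? ZipArchive::CREATE | ZipArchive::OVERWRITE : ZipArchive::CREATE;
$zip->open('archive.zip', $flags);
foreach (array_slice($files, $offset, $batch) as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

header('Content-Type: application/json');
echo json_encode([
    'offset' => $offset + $batch,
    'done'   => $offset + $batch >= count($files),
]);
The client keeps requesting zip_step.php?offset=N with the returned offset until done is true.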

Related

Zipping 400 files [php]

I wrote code to zip 400 files from the website, but when I open it, it takes a lot of time (and this is OK); if it takes too long, though, the PHP script stops working.
How am I supposed to zip 4000 files without my website crashing? Maybe I need to create a progress bar?
Hmm... help? :)
Long work (like zipping 4000 files, sending emails, ...) should not be done in PHP scripts that keep the user's browser waiting.
Your user may cancel loading the page, and even if they don't, it's not great to have an Apache thread locked up for a long time.
Setting up a pool of workers outside of Apache to do this kind of work asynchronously is usually the way to go. Have a look at tools like RabbitMQ and Celery. A simplified illustration of the idea follows.
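As a very small stand-in for that pattern in plain PHP (a real broker such as RabbitMQ is still preferable; the job directory and paths here are assumptions), the web request only records the job, and a separate CLI worker started by cron picks it up:
// enqueue.php - the web request only records the job and returns immediately.
$job = ['dir' => '/path/to/files', 'zip' => '/path/to/output.zip'];        // assumed paths
file_put_contents('jobs/' . uniqid('job_', true) . '.json', json_encode($job));
echo 'Your archive is being prepared.';

// worker.php - run by cron or from a shell, outside the web server's limits.
foreach (glob('jobs/*.json') as $jobFile) {
    $job = json_decode(file_get_contents($jobFile), true);
    $zip = new ZipArchive();
    if ($zip->open($job['zip'], ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
        foreach (glob($job['dir'] . '/*') as $file) {
            $zip->addFile($file, basename($file));
        }
        $zip->close();
    }
    unlink($jobFile);                                                      // remove the finished job
}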
In a PHP installation you have the max_execution_time directive, defined in your php.ini file. This directive sets the maximum execution time of any of your PHP scripts, so you might want to increase it or set it to 0 (no time limit). There are two ways of doing that: you can modify your php.ini, but that's not always possible, or you can use the set_time_limit() or ini_set() functions. Note that depending on your hosting service, you may not be able to do any of this.
I think you should look at PHP's set_time_limit() or the max_execution_time setting:
ini_set('max_execution_time', 300);
// or if safe_mode is off in your php.ini
set_time_limit(0);
Try to find reasonable settings for zipping your whole bundle, and set the values accordingly.
The PHP interpreter limits its own execution time, by default to a certain value; that's why it stops suddenly. If you change that setting at the beginning of your PHP script, it will work better; try it!
You could also invoke the php executable in CLI mode to handle that process, with functions like shell_exec().
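For example (assuming shell_exec() is not disabled and a CLI php binary is available on the host; zip_job.php is a hypothetical script containing the ZipArchive work):
// Start the zip job in a detached CLI process and let the web request return immediately.
shell_exec('php /path/to/zip_job.php > /dev/null 2>&1 &');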

Very long script keeps failing

I have a script that updates my database with listings from eBay. The number of sellers it grabs items from is always different, and some sellers have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script but it still keeps failing at about the 45 second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script to find particularly long operations (> 45 seconds) and try to break them into smaller chunks. A rough timing sketch follows.
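A crude way to do that (the loop structure and helper function are assumptions about the script) is to time each seller and log anything that gets close to the 45-second mark:
foreach ($sellers as $seller) {                  // assumed loop over sellers from the DB
    $start = microtime(true);
    update_seller_listings($seller);             // hypothetical function doing the eBay pull
    $elapsed = microtime(true) - $start;
    if ($elapsed > 30) {                         // flag anything approaching the limit
        error_log(sprintf('Seller %s took %.1f s', $seller['id'], $elapsed));
    }
}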
Well, as it turns out, I overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process via mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from shell and CRON works just fine.
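For reference, the cron entry only needs to call the CLI binary directly; the paths below are placeholders:
# Run the update every night at 03:00 via the PHP CLI, bypassing Apache/mod_fcgid entirely.
0 3 * * * /usr/bin/php /path/to/update_listings.php >> /var/log/ebay_update.log 2>&1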

PHP script stops running after some time run by browser

I have made a PHP script that would probably take about 3 hours to complete. I run it from the browser, and after about 45 minutes it stops doing anything. I know this because it polls certain web addresses and then saves some data to the database, and it simply stops putting any data into the database, which led me to the conclusion that it has stopped. The browser still looks as if it is loading the page, but it never finishes.
There aren't any errors, so it probably is some kind of timeout... but where it occurs is a mystery, as is how to prevent it. In my case I can't use the CLI; I must use a browser client to initiate the script.
I have tried to put
set_time_limit(0);
But it had no apparent effect. Any suggestions as to what could cause the timeout, and how to fix it?
Try this:
set_time_limit(0);
ignore_user_abort(true);
ini_set('max_execution_time', 0);
Most webhosts kill processes that run for a certain length of time. This is intended as a failsafe against infinite loops.
Ask your host about this and see if there's any way it can be disabled for this particular script of yours. In some cases, the killer doesn't apply to Cron tasks, or processes run by SSH. However, this varies from host to host.
It might be the browser that's timing out. I'm not sure whether browsers do that or not, but then I've also never had a page take that much time.
Suggestion: I'm assuming you are running a loop. Load a page, then run each iteration of the loop in an Ajax call to another page, not firing the next iteration until the previous one returns.
There's a setting in PHP to kill processes after some time. This is especially important on shared servers (you don't want one process slowing down the whole server).
You need to ask your host whether you can make modifications to php.ini (or override it through .htaccess), in particular the max_execution_time setting; an example override is shown below.
If you are using sessions, you may also need to look at session.cookie_lifetime, not just set_time_limit. If you are accumulating results in an array, it might also grow until it hits the memory limit.
Without more info on how your script handles the task, it's difficult to identify the cause.
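For reference, an .htaccess override looks like the lines below, but it only works when PHP runs as an Apache module (mod_php); under CGI/FastCGI these directives cause a 500 error instead, so check with your host first:
php_value max_execution_time 300
php_value memory_limit 256M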

PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can the file being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server and then fetched by the user once the download has completed?
I just want to make sure I know my options and limitations before I get much further.
Thank you for your time.
Yes, you can use set_time_limit(0) and the max_execution_time directive to cancel the time limit imposed by PHP.
You can open a stream of the file, and transfer it to the user seamlessly.
Read about fopen()
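A minimal streaming sketch along those lines (the URL and filename are placeholders, allow_url_fopen must be enabled, and the time limit call assumes the host permits it) that forwards the remote file to the user in chunks instead of buffering the whole 200 MB:
set_time_limit(0);                                          // lift PHP's own limit, if allowed

$remote = fopen('https://example.com/big-file.bin', 'rb');  // placeholder URL
if ($remote === false) {
    die('Could not open remote file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.bin"');

while (!feof($remote)) {
    echo fread($remote, 8192);                              // pass each 8 KB chunk straight through
    flush();                                                // push it to the client immediately
}                                                           // if output buffering is on, ob_flush() may also be needed
fclose($remote);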
If not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust a lot of these settings at runtime through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

PHP Timeouts and FTP function

In implementing the backup script I described in this serverfault question, I ran into some timeout issues that have prompted optimizations to the code (namely, backing up one file per execution of the script and doing everything I can to minimize the number of file-hashes I am calculating over the very large data files).
So far, that seems to have solved the timeout issue, but given the size of the files, there is certainly room for the transfer to take longer than the standard 30s allotted before a script times out. If that happens, I assume the file will simply be cut off as partially transferred. Is there any way to protect against this?
Note that I am operating on a shared-hosting environment, so editing the php.ini file is not an option.
If it's enabled, you can call set_time_limit(). Alternatively, if you run PHP from the command line (via cron or similar), the max execution time does not apply there (the CLI defaults to no limit).
Can you try running the ftp job via the shell? Might work on a shared host...
shell_exec('nohup ftp my-ftp-command 2> /dev/null');
According to the set_time_limit() documentation, this should never be an issue, because time spent on activity outside the script (such as the transfer itself) is not counted towards the script's execution time (except on Windows, where the measured time is real).
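If the transfer uses PHP's FTP extension rather than a shell ftp client, the network timeout can also be raised explicitly; the host, credentials and paths below are placeholders:
$conn = ftp_connect('ftp.example.com', 21, 600);                 // 600-second connect timeout
ftp_login($conn, 'user', 'password');                            // placeholder credentials
ftp_set_option($conn, FTP_TIMEOUT_SEC, 600);                     // allow slow transfers
ftp_pasv($conn, true);
if (!ftp_put($conn, 'backups/backup.tar.gz', '/local/backup.tar.gz', FTP_BINARY)) {
    error_log('FTP upload failed');
}
ftp_close($conn);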
