PHP Timeouts and FTP function - php

In implementing the backup script I described in this serverfault question, I ran into some timeout issues that have prompted optimizations to the code (namely, backing up one file per execution of the script and doing everything I can to minimize the number of file-hashes I am calculating over the very large data files).
So far, that seems to have solved the timeout issue, but given the size of the files, there is certainly room for the transfer to take longer than the standard 30s allotted before a script times out. If that happens, I assume the file will simply be cut off as partially transferred. Is there any way to protect against this?
Note that I am operating on a shared-hosting environment, so editing the php.ini file is not an option.

If it's enabled, you can call set_time_limit(). Alternatively, if you run PHP from the command line (via cron or similar), the max execution time does not apply.
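For example, a minimal sketch (the connection details and paths here are placeholders, not from the question) that lifts the limit before starting the transfer:
// Lift the 30-second cap for this request, if the host allows it.
set_time_limit(0);

// Placeholder connection details for illustration only.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

// One large backup file per run, as described in the question.
ftp_put($conn, '/remote/backup.tar.gz', '/local/backup.tar.gz', FTP_BINARY);
ftp_close($conn);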

Can you try running the ftp job via the shell? Might work on a shared host...
shell_exec('nohup ftp my-ftp-command 2> /dev/null');

According to the set_time_limit() documentation, this should never be an issue, because time spent on activity outside the script (such as system calls) is not included when calculating the script's execution time for timeout purposes.
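If you want shell_exec() to return immediately instead of waiting for the transfer to finish, a common variation (assuming the host allows shell access and backgrounding) is to redirect the command's output and put it in the background:
// Backgrounding the command so shell_exec() returns right away;
// "my-ftp-command" is still just a placeholder for the real ftp invocation.
shell_exec('nohup ftp my-ftp-command > /dev/null 2>&1 &');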

Related

PHP script executes infinitely when called using Cron

I faced a very strange problem on my hosting.
I have a script that can be triggered using a URL like this
https://mywebsite.com/script.php
I need this script to be executed once every two days.
So I created a Cron job just as my hosting provider advised:
wget -O /dev/null -q 'https://mywebsite.com/script.php'
It uses wget because the script requires some extra scripts, so my hosting provider said I needed to create the task like this.
It worked fine for about a month, but for the last few weeks I have had a problem.
For some reason that neither I nor my hosting provider can understand, when I run the script by opening the URL in a browser, it executes fine (I know this from the emails that are sent at 4 different steps of execution). But when Cron runs the script, it executes endlessly, and I keep receiving emails over and over until I rename or delete the script.
Script execution time is about 2-3 minutes. So when I run it from the URL and wait until it finishes, I get an error on the screen saying the request time (60 sec) is over. But I know the script executes fine through the last step.
What is the problem?
Wget
I had the same problem at some point with a PHP-based cron job. The problem was that wget itself has a timeout; if that timeout is reached, wget will try again and again.
Use wget's options to make sure it runs the way you want it to.
Example:
wget -O /dev/null --tries=1 --timeout=600 'https://mywebsite.com/script.php'
--tries sets how many times wget will retry if a timeout occurs.
--timeout sets the network timeout in seconds, i.e. how long wget waits for the request before giving up.
Those options can be set in the cron job entry as well.
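For example, a cron entry that runs the script roughly every other day at 03:00 with those options might look like this (the schedule is only illustrative; the URL is the one from the question):
0 3 */2 * * wget -O /dev/null -q --tries=1 --timeout=600 'https://mywebsite.com/script.php'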
PHP Cronjobs
If possible, it may be a better choice to let PHP run your cron job directly. If you know the server's PHP binary path, you could create a cron job like
/usr/bin/php /srv/www/yousite/html/script.php
In this case you have no third-party program like wget to rely on. Whether this helps depends on how the cron job is built; if your script relies on $_SERVER variables, for example, this would not work.
There are some settings you will want to check before you use any PHP file as a cron job.
Keep in mind that the PHP configuration in php.ini can also cause unwanted errors for PHP cron jobs in general. In php.ini there is a value called "max_execution_time" that defines the maximum number of seconds allowed to process a PHP request.
Another setting to keep an eye on is "memory_limit", which is also defined in php.ini. It defines the maximum amount of memory a PHP request may use. Since your cron job seems to run for 2-3 minutes, a lot of data may be held in memory while it runs.
Be aware that every request is subject to those limits. If you set them too high, you may run into problems with CPU load on your server or with too many spawned PHP processes.
If you have a shared hosting service or something similar, you may not be able to change any of those settings.
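As a rough sketch, a cron-run script can also try to raise those limits itself at the top of the file; whether the calls actually take effect depends on the host:
// Only relax the limits when running from the command line (cron), not for web requests.
if (php_sapi_name() === 'cli') {
    set_time_limit(0);               // no execution time limit
    ini_set('memory_limit', '256M'); // adjust to what the job really needs
}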

Zipping 400 files [php]

I wrote some code to zip 400 files from the website, but when I open it, it takes a lot of time (and that is OK); the problem is that if it takes too long, the PHP script stops working.
How am I supposed to zip 4000 files without my website crashing? Maybe I need to create a progress bar?
Hmm.. help? :)
Long work (like zipping 4000 files, sending emails, ...) should not be done in PHP scripts that will keep the user's browser waiting.
Your user may cancel loading the page, and even if they don't, it's not great to have an Apache thread locked up for a long time.
Setting up a pool of workers outside of Apache to do this kind of work asynchronously is usually the way to go. Have a look at tools like RabbitMQ and Celery.
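Even without a full message broker, the same idea can be sketched with plain job files and a cron-run worker; all paths and file names below are made up for illustration:
// In the page the user hits: just record that a zip was requested, then return.
file_put_contents('/tmp/zip-jobs/' . uniqid('job_', true) . '.job', json_encode([
    'source' => '/srv/www/files',
    'target' => '/srv/www/archives/files.zip',
]));

// In worker.php, run by cron: pick up pending jobs and do the slow zipping there.
foreach (glob('/tmp/zip-jobs/*.job') as $jobFile) {
    $job = json_decode(file_get_contents($jobFile), true);
    $zip = new ZipArchive();
    if ($zip->open($job['target'], ZipArchive::CREATE) === true) {
        foreach (glob($job['source'] . '/*') as $file) {
            if (is_file($file)) {
                $zip->addFile($file, basename($file));
            }
        }
        $zip->close();
    }
    unlink($jobFile); // job handled
}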
In a PHP installation, there is a directive called max_execution_time, defined in your php.ini file. It sets the maximum execution time of any of your PHP scripts, so you might want to increase it or set it to infinite (no time limit). You have two ways of doing that: you can modify your php.ini, but that is not always available. You can also use the set_time_limit function or the ini_set function. Note that, depending on your hosting service, you may not be able to do any of this.
I think you should look at PHP's set_time_limit() or the max_execution_time setting:
ini_set('max_execution_time', 300);
// or if safe_mode is off in your php.ini
set_time_limit(0);
Try to find reasonable settings for zipping your whole bundle, and set the values accordingly.
The PHP interpreter limits its own execution time, by default to a certain value. That's why it stops suddenly. If you change that setting at the beginning of your PHP script, it will work better; try it!
You could also invoke the PHP executable in CLI mode to handle that process, with functions like shell_exec.
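For example (the script path is hypothetical), handing the zipping off to a CLI process from the web request could look like this:
// Start the zipping script in the background via the CLI and return immediately;
// /path/to/zip_files.php is a placeholder for your actual script.
shell_exec('php /path/to/zip_files.php > /dev/null 2>&1 &');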

Make PHP script call itself after some time

I have some limitations with my host and my scripts can't run longer than 2 or 3 seconds. But the time it will take to finish will certainly increase as the database gets larger.
So I thought about making the script stop what it is doing and call itself after 2 seconds, for example.
Firstly I tried using cURL and then I made some attempts with wget. But there is always a problem with waiting for the response and timeouts (with cURL, for example, I just need to ping the script, not wait for a response) or permissions on the server (functions that we use to run wget, such as exec, seem to be disabled on my server, or something like that).
What do you think is the best idea to make a PHP script ping/call itself?
On Unix/Linux systems I would personally recommend scheduling cron jobs to keep running the script at certain intervals.
Maybe this SO link will help you.
PHP scripts generally don't call other PHP scripts. It is possible to spawn a background process as illustrated here, but I don't think that's what you're after. If so, you'd be better off using cron as discussed above.
Calling a function every X amount of seconds with the same script is certainly possible, but this does the opposite of what you want since it would only extend the run time of the script in question.
What you seem to be asking is, contrary to your comment, somewhat paradoxical. A process that calls method() every so often is still a long running process and is subject to the same restrictions as any other process on the server, regardless of the fact that it may be sitting idle for short intervals.
As far as I can see your options are:
Extend the php max_execution_time directive, or have your sysadmin do so if they are willing
Revise your script so that it completes within the time limit (see the sketch after this list)
Move to a new server
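A minimal sketch of the chunked approach mentioned above, assuming a made-up process_batch() helper and a small offset file to remember progress between cron runs:
// Run repeatedly (e.g. every minute via cron); each run handles a small slice of work.
$offsetFile = __DIR__ . '/offset.txt';
$offset     = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;
$batchSize  = 50; // tune so a single batch stays well under the 2-3 second limit

// process_batch() is hypothetical: handle $batchSize records starting at $offset
// and return how many were actually processed.
$processed = process_batch($offset, $batchSize);

if ($processed > 0) {
    file_put_contents($offsetFile, $offset + $processed); // resume here next run
} elseif (is_file($offsetFile)) {
    unlink($offsetFile); // finished: reset so the next cycle starts from the beginning
}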

PHP ZipArchive Timeout

I'm currently trying to find a work around to prevent a PHP script from timing out whilst creating a .zip using the ZipArchive class.
I've managed to do this without a problem by overriding the maximum script execution time from php.ini using set_time_limit(). However, it's not guaranteed that safe mode will be turned off on the servers the script will be running on, and I don't have access to the php.ini file. Is there any other way to stop the script from timing out? Can ZipArchive be run as a background task?
I think the timeout is not the problem.
It can be solved with ini_set of max_execution_time.
But the memory limit is a problem.
I am not facing a timeout issue when creating a zip of a 5.8 GB directory, but I am facing a memory limit issue.
Try
ini_set('max_execution_time', 60); //60 seconds = 1 minute
ini_set('max_execution_time', 0); //0=NOLIMIT
But there may be restrictions in place on a shared host, so if these functions don't work as they should, ask the server's admin.
I'm trying to solve this as well; set_time_limit() is no good on shared hosting when safe mode is on or the function is disabled on the server.
I'm using Ajax client-side calls to PHP to process the archiving in steps; however, that opened up a few new issues: script timeouts and the need for some kind of "check-pointing" so I can resume a continued operation.
So my recommendation is to use an Ajax/client-side implementation that calls PHP to do the work, knowing the script may not finish in one call but could take N calls. Even though an operation could still run long enough to time out the script, you can add logic that checks the elapsed time and saves state before a timeout, and handle both cases on the client (timeout and manual kick-out).
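A rough sketch of that step-based approach, assuming the client passes a start index and a hypothetical list_files_to_zip() helper returns the full file list:
// Called repeatedly via Ajax: ?start=0, then ?start=200, ... until 'done' comes back.
$start   = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$files   = list_files_to_zip();  // hypothetical: full list of files to archive
$perCall = 200;                  // how many files to add per request
$began   = microtime(true);

$zip = new ZipArchive();
$zip->open('/path/to/archive.zip', ZipArchive::CREATE); // creates on the first call, reopens after

// Stop before the (assumed) 30s limit is reached; 20s leaves some headroom.
$i = $start;
while ($i < count($files) && $i < $start + $perCall && microtime(true) - $began < 20) {
    $zip->addFile($files[$i], basename($files[$i]));
    $i++;
}
$zip->close();

// Tell the client where to resume, or that the archive is complete.
echo json_encode($i < count($files) ? ['next' => $i] : ['done' => true]);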

Very long script keeps failing

I have a script that updates my database with listings from eBay. The amount of sellers it grabs items from is always different and there are some sellers who have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script but it still keeps failing at about the 45 second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script and look for particularly long operations (> 45 seconds). Try to break those operations into smaller chunks.
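A crude way to profile on a shared host, if no real profiler is available, is simply timing each phase; the function and variable names below are placeholders:
$seller = 'example-seller'; // placeholder

$t = microtime(true);
$listings = fetch_listings_from_ebay($seller);   // hypothetical eBay API call
error_log('eBay fetch took ' . round(microtime(true) - $t, 1) . 's');

$t = microtime(true);
store_listings($listings);                       // hypothetical database writes
error_log('DB store took ' . round(microtime(true) - $t, 1) . 's');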
Well, as it turns out, I overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process via mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from shell and CRON works just fine.
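For reference, a cron entry that runs such a script through the CLI (paths are examples only) looks something like this:
0 2 * * * /usr/bin/php /srv/www/yoursite/ebay_sync.php >> /srv/www/yoursite/ebay_sync.log 2>&1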
