Zipping 400 files [php]

I wrote some code to zip 400 files from my website, but when I run it, it takes a lot of time (and that is OK); the trouble is that if it takes too long, the PHP script stops working.
How am I supposed to zip 4000 files without my website crashing? Maybe I need to create a progress bar?
Hmm.. help? :)

Long work (like zipping 4000 files, sending emails, ...) should not be done in PHP scripts that keep the user's browser waiting.
Your user may cancel loading the page, and even if they don't, it's not great to have an Apache thread tied up for a long time.
Setting up a pool of workers outside of Apache to do this kind of work asynchronously is usually the way to go. Have a look at tools like RabbitMQ and Celery.
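For illustration, a minimal sketch of handing the job off to a queue, assuming the php-amqplib client library and a hypothetical zip_jobs queue that a separate worker process consumes:

require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Connect to a local RabbitMQ broker (host and credentials are placeholders).
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue so pending jobs survive a broker restart.
$channel->queue_declare('zip_jobs', false, true, false, false);

// Describe the work; the worker that consumes this message does the zipping.
$fileList = ['photo1.jpg', 'photo2.jpg'];   // placeholder list of files
$payload  = json_encode(['files' => $fileList, 'output' => '/tmp/archive.zip']);
$message  = new AMQPMessage($payload, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]);
$channel->basic_publish($message, '', 'zip_jobs');

$channel->close();
$connection->close();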

In a PHP installation, you have the max_execution_time directive, which is defined in your php.ini file. This directive sets the maximum execution time of any of your PHP scripts, so you might want to increase it or remove the limit entirely (no time limit). You have two ways of doing that: you can modify your php.ini, but that is not always possible. You can also use the set_time_limit function or the ini_set function. Note that depending on your hosting service, you may not be able to do any of this.
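Applied to the original question, a rough sketch of lifting the limit and then building the archive with ZipArchive (the paths and the glob() pattern are placeholders):

// Lift the script's time limit first; this only works if the host allows it.
set_time_limit(0);
ini_set('max_execution_time', 0);

$files = glob('/path/to/files/*.jpg');      // placeholder: the 400 files

$zip = new ZipArchive();
if ($zip->open('/path/to/archive.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create the archive');
}
foreach ($files as $file) {
    $zip->addFile($file, basename($file));  // store each file under its base name
}
$zip->close();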

I think you should look at PHP's set_time_limit() function or the max_execution_time setting:
ini_set('max_execution_time', 300);
// or if safe_mode is off in your php.ini
set_time_limit(0);
Try to find reasonable settings for zipping your whole bundle, and set the values accordingly.
The PHP interpreter limits its own execution time, by default to 30 seconds. That's why it stops suddenly. If you change that setting at the beginning of your PHP script, it will work better; try it!
You could also invoke the php executable in CLI mode to handle that process, using functions like shell_exec.
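For example, something along these lines, where zip_worker.php and the directory are hypothetical names; the trailing & detaches the process so the web request can return immediately:

$dir = escapeshellarg('/path/to/files');
shell_exec("nohup php /path/to/zip_worker.php {$dir} > /dev/null 2>&1 &");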

Related

PHP ZipArchive Timeout

I'm currently trying to find a work around to prevent a PHP script from timing out whilst creating a .zip using the ZipArchive class.
I've managed to do this no problem by overriding the max script execution time from php.ini using set_time_limit(). However, it's not guaranteed that safe mode will be turned off in php.ini on the servers the script will run on, and I don't have access to the php.ini file. Is there any other way to stop the script timing out? Can the ZipArchive be run as a background task?
I think the timeout is not the problem; it can be solved with ini_set of max_execution_time.
The memory limit is the problem.
I am not facing a timeout issue when creating a zip of a 5.8 GB directory, but I am facing a memory limit issue.
Try
ini_set('max_execution_time', 60); //60 seconds = 1 minute
ini_set('max_execution_time', 0); //0=NOLIMIT
But there may be restrictions in place on a shared host, so if these functions don't work as they should, ask the server's admin.
I'm trying to solve this as well; set_time_limit is no good on shared hosting in safe mode, or when it's disabled on the server.
I'm using Ajax calls from the client to PHP to process the archiving in steps, but that opened up a few new issues: script timeouts, and the need for some kind of checkpointing so I can resume with a continue operation.
So my recommendation is to use an Ajax/client-side implementation that hits PHP to do the work, knowing the script may not finish in one call but could take N calls. And although an operation could still take long enough in PHP to time out the script, you can add logic to check the elapsed time and save state before a timeout, and cover both cases (timeout and manual cancellation) on the client.
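A rough sketch of that idea, with a hypothetical zip_step.php endpoint that the client-side JavaScript calls repeatedly, and placeholder paths:

// zip_step.php: each call adds one small batch of files, then reports progress.
$batchSize = 20;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$files = glob('/path/to/files/*');          // placeholder: the files to archive
$zip   = new ZipArchive();
$zip->open('/path/to/archive.zip', ZipArchive::CREATE);   // reopened on every step

foreach (array_slice($files, $offset, $batchSize) as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

// The client calls again with the new offset until "done" comes back true.
header('Content-Type: application/json');
echo json_encode([
    'offset' => $offset + $batchSize,
    'total'  => count($files),
    'done'   => $offset + $batchSize >= count($files),
]);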

PHP manage web service with long execution without extending set_time_limit

I need to write a script that reads a big .csv and generates a file for each row.
Since I can't change the timeout on this server with set_time_limit, my script, which is run manually in the browser over HTTP, takes 10 rows, executes my code for each, reaches the end, and auto-reloads for the next 10 rows, and so on until it reaches the end of the file.
Now I need to convert this script into a web service, but I don't know how to stop the script from timing out.
set_time_limit(900); // does not work
From reading here, it seems there is a restriction:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Do you have safe mode enabled? If so, it seems you may have to disable it. You can also look at other settings here that do the same or similar things, though they also require safe mode to be disabled.
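One way such a chunked web service could look, as a hedged sketch with a hypothetical process_csv.php endpoint and placeholder paths; the caller keeps re-requesting with the returned offset until done is true:

// process_csv.php: handle a fixed batch of rows per request, then report back.
$batch  = 10;
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$handle = fopen('/path/to/big.csv', 'r');   // placeholder CSV location
for ($i = 0; $i < $offset && fgetcsv($handle) !== false; $i++) {
    // Skip rows already handled by earlier calls.
}

$processed = 0;
while ($processed < $batch && ($row = fgetcsv($handle)) !== false) {
    // Generate one file per row; using column 0 as the file name is a placeholder.
    file_put_contents("/path/to/out/{$row[0]}.txt", implode(',', $row));
    $processed++;
}
$done = feof($handle);
fclose($handle);

header('Content-Type: application/json');
echo json_encode(['done' => $done, 'offset' => $offset + $processed]);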

PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can the file that is being downloaded also be transferred to the user simultaneously, or does it have to be saved on the server and then fetched manually by the user once the download has completed?
I am just trying to make sure I know what my options and limitations are before I go much further.
Thank you for your time.
Yes, you can use set_time_limit(0) and the max_execution_time directive to remove the time limit imposed by PHP.
You can open a stream of the file, and transfer it to the user seamlessly.
Read about fopen()
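For example, a minimal sketch of streaming the remote file straight through to the browser; the URL and file name are placeholders, and it assumes allow_url_fopen is enabled:

set_time_limit(0);

$remote = fopen('https://example.com/big-file.bin', 'rb');
if ($remote === false) {
    http_response_code(502);
    exit('Could not open the remote file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.bin"');

// Copy in 8 KB chunks so memory use stays flat regardless of file size.
while (!feof($remote)) {
    echo fread($remote, 8192);
    flush();
}
fclose($remote);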
Even if you don't hit a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust a lot of these settings through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

What different settings affect PHP and/or Apache timeouts?

I was asked to help troubleshoot someone's website. It is written in php, on a linux box, using an apache server and mysql, and I have never worked with any of these before (except maybe linux in school).
I got most of the issues fixed (most code is really the same no matter what language it is), however there is still one page that times out when processing huge files. I'm fairly sure the problem is a timeout somewhere, but I have no idea where all the PHP timeouts would be.
I have adjusted max_execution_time, max_input_time, mysql.connect_timeout, default_socket_timeout, and realpath_cache_ttl in php.ini but it is still timing out after about 10 minutes. What other settings might exist that I could increase to try and fix this?
As a side note, I'm aware that 10 minutes is generally not desirable when processing a file, however this section of the site is only used by one person once or twice a week and she doesn't mind, provided the process finishes as expected (and I really don't want to rewrite someone else's bad code in a language I don't understand, for a process I don't understand).
EDIT: The sql process finishes in the background, its just the webpage itself that times out.
Per Frank Farmer's suggestion, I added flush() to the code and it works now. Definitely a browser timeout, thanks Frank!
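Roughly the shape of that fix, with the real per-row work replaced by a stand-in:

$rows = range(1, 100000);                   // stand-in for the huge file's rows
foreach ($rows as $i => $row) {
    usleep(100);                            // stand-in for the real processing

    if ($i % 1000 === 0) {
        echo ' ';                           // harmless whitespace keep-alive
        if (ob_get_level() > 0) {
            ob_flush();                     // flush PHP's output buffer, if any
        }
        flush();                            // push the bytes out to the client
    }
}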
You can use set_time_limit(); if you set it to zero, the script should not time out at all.
This would be placed within your script, not in any config etc...
Edit: Try changing Apache's timeout settings. In the config, look for the TimeOut directive (it should be the same for Apache 2.x and Apache 1.3.x); once changed, restart Apache and check again.
Edit 3:
Did you go to the link I provided? It lists the default there, which is 300 seconds (5 minutes). Also, if the setting IS NOT in the config file, you CAN add it.
According to the docs:
The TimeOut directive currently defines the amount of time Apache will wait for three things:
The total amount of time it takes to receive a GET request.
The amount of time between receipt of TCP packets on a POST or PUT request.
The amount of time between ACKs on transmissions of TCP packets in responses.
So it is possible it doesn't relate, but try it and see.

PHP Timeouts and FTP function

In implementing the backup script I described in this serverfault question, I ran into some timeout issues that have prompted optimizations to the code (namely, backing up one file per execution of the script and doing everything I can to minimize the number of file-hashes I am calculating over the very large data files).
So far, that seems to have solved the timeout issue, but given the size of the files, there is certainly room for a transfer to take longer than the standard 30 seconds allotted before a script times out. If that happens, I assume the file will simply be cut off, partially transferred. Is there any way to protect against this?
Note that I am operating on a shared-hosting environment, so editing the php.ini file is not an option.
If it's enabled, you can call set_time_limit(). Alternatively, if you run php from the command line (via cron or similar), max execution time does not apply.
Can you try running the ftp job via the shell? Might work on a shared host...
shell_exec('nohup ftp my-ftp-command 2> /dev/null');
According to the set_time_limit() documentation, this should never be an issue, because time spent executing activities outside the script (such as system calls via shell_exec) is not counted when calculating the script's execution time for timeout purposes.
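If the transfer is done with PHP's own FTP functions rather than a shell command, one possible guard against a partially transferred file is the non-blocking API with FTP_AUTORESUME, sketched here with placeholder host, credentials and file names:

$ftp = ftp_connect('ftp.example.com');
if ($ftp === false || !ftp_login($ftp, 'user', 'password')) {
    exit('Could not connect to the FTP server');
}
ftp_pasv($ftp, true);

// FTP_AUTORESUME continues from whatever portion of the remote file already
// exists, so an interrupted run can pick up where it left off next time.
$ret = ftp_nb_put($ftp, 'backup/archive.zip', '/tmp/archive.zip', FTP_BINARY, FTP_AUTORESUME);
while ($ret === FTP_MOREDATA) {
    @set_time_limit(30);                    // push the deadline back, if allowed
    $ret = ftp_nb_continue($ftp);
}

if ($ret !== FTP_FINISHED) {
    error_log('FTP upload did not finish cleanly');
}
ftp_close($ftp);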
