PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the target file is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can the file that is being downloaded also be transferred to a user simultaneously, or does it have to be saved on the server and then manually fetched by the user once the download has completed?
I am just trying to make sure that I know what my options and limitations are before I get too far in.
Thank you for your time.

Yes, you can use set_time_limit(0) or the max_execution_time directive to remove the time limit imposed by PHP.
You can open the file as a stream and transfer it to the user as it downloads, seamlessly.
Read about fopen()
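Something like this (a minimal sketch only: the URL and filename are placeholders, and allow_url_fopen must be enabled for fopen() to accept URLs):
<?php
// Sketch: stream a large remote file straight through to the client in chunks,
// so the whole 200 MB never has to be loaded with file_get_contents().
set_time_limit(0); // remove PHP's execution time limit for this request

$remote = 'https://example.com/big-file.zip'; // placeholder source URL
$handle = fopen($remote, 'rb');
if ($handle === false) {
    http_response_code(502);
    exit('Could not open remote file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');

while (!feof($handle)) {
    echo fread($handle, 8192); // send 8 KB at a time
    flush();                   // push the chunk out to the client
}
fclose($handle);
?>
If output buffering is enabled you may also need ob_end_clean() before the loop (or ob_flush() inside it) so each chunk actually reaches the client.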

If not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust a lot of these settings manually through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

Related

Zipping 400 files [php]

I wrote some code to zip 400 files from the website, but when I run it, it takes a lot of time (and that is OK); the problem is that if it takes too long, the PHP script stops working.
How am I supposed to zip 4000 files without my website crashing? Maybe I need to create a progress bar?
Hmm... help? :)
Long work (like zipping 4000 files, sending emails, ...) should not be done in PHP scripts that keep the user's browser waiting.
Your user may cancel loading the page, and even if they don't, it's not great to have an Apache thread locked up for a long time.
Setting up a pool of workers outside of Apache to do this kind of work asynchronously is usually the way to go. Have a look at tools like RabbitMQ and Celery; a sketch of handing the job off to a queue follows below.
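If RabbitMQ is an option, a minimal sketch of the hand-off with php-amqplib might look like this (queue name, credentials and payload shape are all assumptions for illustration; a separate long-running worker consumes the message and builds the archive):
<?php
// Sketch: publish a "zip these files" job to RabbitMQ and return immediately.
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('zip_jobs', false, true, false, false); // durable queue

$payload = json_encode(['directory' => '/var/www/uploads', 'archive' => 'bundle.zip']);
$message = new AMQPMessage($payload, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]);
$channel->basic_publish($message, '', 'zip_jobs');

$channel->close();
$connection->close();
?>
The web request finishes in milliseconds; the worker can take as long as it needs, and the page can poll for a "done" flag (or a progress value for your progress bar).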
In a PHP installation you have the max_execution_time directive, which is defined in your php.ini file. This directive sets the maximum execution time of any of your PHP scripts, so you might want to increase it or remove the limit entirely. There are two ways of doing that: you can modify php.ini, but that's not always available, or you can use the set_time_limit() or ini_set() functions from within the script. Note that depending on your hosting service, you may not be able to do any of this.
I think you should look at PHP's set_time_limit() or the max_execution_time setting:
ini_set('max_execution_time', 300);
// or if safe_mode is off in your php.ini
set_time_limit(0);
Try to find reasonable settings for zipping your whole bundle, and set the values accordingly.
The PHP interpreter's own execution time is limited to a certain value by default; that's why it stops suddenly. If you change that setting at the beginning of your PHP script, it will work better. Try it!
You could also invoke the php executable in CLI mode to handle that process, with functions like shell_exec(); see the sketch below.
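For example (sketch only; the worker path and log file are placeholders):
<?php
// Sketch: launch a CLI PHP worker in the background and return right away.
shell_exec('php /var/www/workers/zip_worker.php > /var/log/zip_worker.log 2>&1 &');
echo 'Zipping started; check back later for the finished archive.';
?>
The trailing & detaches the worker from the web request, so the CLI process keeps running after the page has been sent.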

PHP ZipArchive Timeout

I'm currently trying to find a work around to prevent a PHP script from timing out whilst creating a .zip using the ZipArchive class.
I've managed to do this no problem by overriding the max script execution time from php.ini using set_time_limit(). However, it's not guaranteed that safe mode will be turned off in php.ini on the servers the script will be running on, and I don't have access to the php.ini file. Is there any other way to stop the script timing out? Can the ZipArchive work be run as a background task?
I don't think the timeout is the problem; that can be solved with ini_set() and max_execution_time. The memory limit is the real problem.
I'm not facing a timeout issue when creating a zip of a 5.8 GB directory, but I am hitting the memory limit.
Try
ini_set('max_execution_time', 60); //60 seconds = 1 minute
ini_set('max_execution_time', 0); //0=NOLIMIT
But there may be restrictions in place on a shared host, so if these functions don't work as they should, ask the server admin.
I'm trying to solve this as well; set_time_limit() is no good on shared hosting in safe mode, or when it's disabled on the server.
I'm using Ajax calls from the client side to a PHP script that processes the archive in steps, but that opened up a few new issues: script timeouts and the need for some kind of checkpointing so I can resume with a continue operation.
So my recommendation is to use an Ajax/client-side implementation that hits PHP to do the work, knowing the script may not finish in one call but could take N calls. Even though an operation could still run long enough to time the script out, you can put in logic to check the elapsed time and save state before a timeout, and cover both cases in the client (timeout and manual kick-out). A sketch of that step-wise approach follows below.
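Here is a minimal sketch of such a step-wise endpoint (the checkpoint file, paths and batch size are assumptions; the client keeps calling it until it reports "done"):
<?php
// Sketch: zip a directory a few files per request so no single call
// runs long enough to hit the execution limit.
$stateFile = '/tmp/zip_job_state.json';       // checkpoint between calls
$archive   = '/tmp/bundle.zip';
$files     = glob('/var/www/uploads/*');      // files to archive
$batchSize = 50;

$state  = is_file($stateFile) ? json_decode(file_get_contents($stateFile), true) : ['offset' => 0];
$offset = $state['offset'];

$zip = new ZipArchive();
$zip->open($archive, $offset === 0 ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0);
foreach (array_slice($files, $offset, $batchSize) as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

$offset += $batchSize;
if ($offset >= count($files)) {
    @unlink($stateFile);
    echo json_encode(['status' => 'done']);
} else {
    file_put_contents($stateFile, json_encode(['offset' => $offset]));
    echo json_encode(['status' => 'continue', 'offset' => $offset]);
}
?>
Each Ajax call does one batch and returns, so even a strict 30- or 60-second limit is never hit, and a crash mid-way can resume from the saved offset.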

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than 2 large files at a time, and I was wondering if any of you have a solution to this. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads 1 file that is less than or equal to 1 GB it works fine, and they can open a new tab and upload another large file; but when they open a 3rd tab of that form and try to upload, I get an IO error from the form. So I assume either PHP or Apache is limiting the max number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload there as well, so it seems like a maximum number of concurrent uploads per client/IP.
I think increasing the memory assigned to the PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but as far as I know, it's worth trying to increase the memory.
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by the memory limit.
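For example (testing only, never in production):
<?php
// Testing only: lift the memory limit entirely to see whether memory is the culprit.
ini_set('memory_limit', '-1');
?>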

PHP server timeout on script

I have a function for a WordPress plugin I'm developing that takes a lot of time.
It connects to TMDb (the movie database), retrieves every movie one by one by id (from 0 to 8000), and creates an XML document that is saved on the local server.
Of course it takes a bunch of time, and PHP says "504 Gateway Time-out: The server didn't respond in time."
What can I do? Any suggestions?
Assuming this is a one-time execution and it's bombing out on you, you can set_time_limit() to 0 and allow it to execute.
<?php
set_time_limit(0); // impose no limit
?>
However, I would make sure this is not in production and that it will only be run when you want it to (otherwise it will place, and continue to place, a large load on the server); see the sketch below.
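A minimal sketch of such a one-off import, assuming a hypothetical fetch_movie_xml() helper in place of the real TMDb call (running it from the CLI also sidesteps the gateway timeout entirely):
<?php
// Sketch: one-off import script, meant to be run manually (ideally from the CLI).
// fetch_movie_xml() is a hypothetical helper standing in for the real TMDb request.
set_time_limit(0);                 // no execution time limit for this run
ini_set('memory_limit', '512M');

$doc  = new DOMDocument('1.0', 'UTF-8');
$root = $doc->appendChild($doc->createElement('movies'));

for ($id = 0; $id <= 8000; $id++) {
    $fragment = fetch_movie_xml($id);     // hypothetical TMDb lookup
    if ($fragment === null) {
        continue;                         // skip missing ids
    }
    $movie = $doc->createElement('movie');
    $movie->setAttribute('id', (string) $id);
    $movie->appendChild($doc->createCDATASection($fragment));
    $root->appendChild($movie);
}

$doc->save('/var/www/data/movies.xml');   // placeholder output path
?>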
Try setting:
set_time_limit(0);
at the top of the script. But I think it's the server's problem: the fetch simply takes too long. Try doing the reads in the background or in smaller batches instead.
I think this is not related to the script timeout.
A 504 Gateway Timeout is caused by slow communication between back-end machines, possibly including the web server itself.
Fix:
Either use proxies or increase your cache size (search for "cache" in your php.ini and play with that limit).

PHP POST requests timeout

I'm currently working on an upload script supporting larger uploads (~50 MB) and I have very rapidly run into a problem! I'm using a traditional POST request with a form, uploading the file to a temp location and later moving it with PHP. Naturally, I've updated my php.ini file to support files slightly larger than the default, and files around 15 MB upload really well!
The main problem is my hosting company. They let scripts time out after 60 seconds, meaning that POST requests taking longer than 60 seconds to complete will die before the temp file ever reaches the PHP script, which naturally yields an internal server error.
Not being able to crank up the timeout on the server (after heated debates), I'm considering my options. Is there a way to bump the request or somehow refresh it to notify the server and reset the timer? Or are there alternative upload methods that don't time out?
There are a few things you could consider. Each has a cost, and you'll need to determine which one is least costly.
Get a new hosting company. This may be your best solution.
Design a rather complex client-side system that breaks the upload into multiple chunks and submits them via AJAX. This is ugly, especially since it is only useful for getting around a host rule; a rough server-side sketch follows below.
I'd really research #1.
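A rough server-side sketch of option 2, assuming the client posts each chunk with hypothetical upload_id and chunk_index fields (each request then finishes well inside the 60-second limit):
<?php
// Sketch: receive one chunk per request and append it to the assembled file.
// Field names and paths are assumptions, not an established API.
$uploadId   = preg_replace('/[^a-zA-Z0-9_-]/', '', $_POST['upload_id'] ?? '');
$chunkIndex = (int) ($_POST['chunk_index'] ?? 0);
$target     = sys_get_temp_dir() . "/upload_{$uploadId}.part";

if ($uploadId === '' || !isset($_FILES['chunk']) || $_FILES['chunk']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit(json_encode(['status' => 'error', 'chunk' => $chunkIndex]));
}

$out = fopen($target, 'ab');                       // append this chunk
$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

echo json_encode(['status' => 'ok', 'chunk' => $chunkIndex]);
?>
This assumes the client sends chunks strictly in order; a more robust version would write each chunk to its own file and stitch them together once all have arrived.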
With great difficulty. By far your easiest option is to dump the hard-headed host and pick one that actually lets you be productive. I personally use TSOHost - been with them for over a year and a half and have so far had absolutely no reason to complain (not even a slight annoyance).
Are you really sure it's a timeout issue? My first idea...
The transfer may have failed due to a configuration limit set in the web server's php.ini file. You need to change it there, or set it locally in your script:
# find it in php.ini used by your configuration
memory_limit = 96M
post_max_size = 64M
upload_max_filesize = 64M
Or directly in your script:
ini_set('memory_limit', '96M');
ini_set('post_max_size', '64M');
ini_set('upload_max_filesize', '64M');
(Note, though, that post_max_size and upload_max_filesize are PHP_INI_PERDIR settings, so ini_set() will silently fail for them at runtime; they have to go in php.ini, .htaccess, or the server configuration.)
