I'm trying to write a long-running script, but it hangs after a set number of seconds, and I want to avoid this with some kind of workaround.
The workflow:
The user sends an AJAX request at the press of a button. This initiates a process (a function) which, for example, polls multiple websites for info or sends POST data using cURL. Ideally, it should report some progress once in a while, but it would be even better if it ran in the background.
The no-no's:
The following functions cannot be used in the code: set_time_limit, exec, fork, or anything pcntl-related.
Possible solution:
I searched through many posts, and one possible workaround would be to split the code into multiple parts (e.g. send one cURL request at a time) and have jQuery re-initiate the connection until a given condition is met.
But is there a way to avoid the timeout on the server side? I also tried using Symfony's Process component, Ratchet, sockets...
Many thanks!
Edit: Fixed formatting. I forgot to mention that the code has to be reusable on any server, so editing any config files is not an option either.
Call set_time_limit(0) at the start of your script.
Related
I am running an external script in PHP using its exec() function. I have been looking at various options for creating a progress bar. I am able to create a plain rotating loader through AJAX, but I couldn't get a percentage progress bar working. Is there any way to do that?
Depending on the program you want to execute, you could use proc_open() instead of exec(), so you can process the output, calculate a percentage and send it back to your AJAX script using ob_flush() and flush().
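For illustration, a minimal sketch of that approach, assuming a hypothetical command at /usr/local/bin/long-task that prints one line per processed item, a known total of 500 items, and that nothing between PHP and the browser is buffering the output:

<?php
// progress.php - streams a rough percentage while an external command runs
header('Content-Type: text/plain');

$descriptors = [1 => ['pipe', 'w'], 2 => ['pipe', 'w']];
$process = proc_open('/usr/local/bin/long-task', $descriptors, $pipes);   // hypothetical command

$total = 500;   // assumed to be known up front
$done  = 0;

while (($line = fgets($pipes[1])) !== false) {   // one output line per processed item
    $done++;
    echo round($done / $total * 100) . "%\n";
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}

fclose($pipes[1]);
fclose($pipes[2]);
proc_close($process);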
It's not true that you need to wait for the execution to finish before sending any output, but there are some caveats with server cache configuration and browser rendering engines, so it is not entirely reliable.
If you're not using WebSockets (the clean and modern option, which can be achieved in PHP using Ratchet or in Node.js using various options), the most reliable way of doing what you want is polling.
Briefly, instead of calling your script once, you make a first AJAX request to start the process and then keep poking the server again and again to ask for the execution status of your script.
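A bare-bones sketch of the polling side, assuming the long-running worker writes its progress to a per-job file (the file name scheme and JSON format here are invented for the example):

<?php
// status.php - polled by the page every few seconds
header('Content-Type: application/json');

$id   = basename($_GET['id'] ?? '');             // basename() to avoid path tricks
$file = sys_get_temp_dir() . "/job-$id.json";

echo is_file($file) ? file_get_contents($file) : json_encode(['status' => 'unknown']);

// The worker, meanwhile, updates that file after each unit of work:
// file_put_contents($file, json_encode(['status' => 'running', 'done' => $done, 'total' => $total]));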
For more information, take a look at those answers:
Run process with realtime output in PHP
PHP - Flushing While Loop Data with Ajax
Grab results from a php exec() while the command is still running?
PHP runs on the server, so it cannot achieve this on its own (to my knowledge); here are some answered questions that might be able to help you.
How to show loading status in percentage for ajax response?
Jquery:: Ajax powered progress bar?
I'm creating a plugin for a CMS and need one or more periodic tasks running in the background. As it is a plugin for an open-source CMS, a cron job is not a perfect solution, because users may not have access to cron on their server.
I'm going to start an infinite loop via an AJAX request and then abort the XHR request, so the HTTP connection is closed but the script continues running.
Is this a good solution in general? What about server resources? Are there any shutdown or limitation policies in servers (such as Apache) for long-running processes?
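Roughly, what I have in mind is something like this sketch (do_periodic_task() and the stop-flag file are placeholders; whether the script actually survives the aborted XHR depends on ignore_user_abort(), max_execution_time and how the web server treats disconnected clients):

<?php
ignore_user_abort(true);        // keep running after the XHR is aborted
set_time_limit(0);              // may be ignored or forbidden on some hosts
if (session_status() === PHP_SESSION_ACTIVE) {
    session_write_close();      // don't hold the session lock for the whole run
}

while (true) {
    do_periodic_task();         // placeholder for the periodic work
    if (file_exists(__DIR__ . '/stop.flag')) {
        break;                  // give yourself some way to shut it down
    }
    sleep(60);
}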
Long-running PHP scripts are not a good idea. If your script uses session variables, your user won't be able to load any other pages until the session-holding script finishes.
If you really need long-running scripts, make sure they don't use the session and keep them under the maximum execution time. Do not let them run outside your control; that can cause various problems. I remember building something like that and my server crashed several times.
Know what you want to do and make sure it's well tested on different servers.
Also search for similar modules and check what methods they use for problems like this. Learn from the pros. :)
So here is my dilemma. I need to pull several hundred API calls' worth of data, parse them one at a time, and log matching data. My problem is that this takes a while, I'm on shared hosting, and my FastCGI busy timeout can't be altered (the web host won't do it because of the shared hosting, I believe), so I'm completely stumped on how to get around this. I can't use the CLI because it's a user-facing tool where they input a list of data and that's what I match against. So once the input is received, I need the PHP to run by itself until it completes (probably a while, like a couple of hours).
I tried everything and nothing works. At this point, to try and trick the system, I have the file calling itself instead of looping, but that doesn't seem to work. I think that's my only option (unless someone has a better idea), and I'm trying to figure out how to make every call back onto itself count as a "restart" in the eyes of FastCGI. HELP!!
If you have access to exec(), you can always create another PHP script (or some other program) to actually do the work, and call it with exec() so it runs on the machine directly instead of through FastCGI. You'd then want some sort of progress tracking in that script to record how far it's gotten, or when it's done, plus a page to check on the progress of a request :)
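A rough sketch of that approach (worker.php and the progress file are hypothetical; the output redirection and trailing & are what stop exec() from blocking on most Unix-like hosts):

<?php
// start.php - kicked off by the user-facing request; returns immediately
$jobId = uniqid('job', true);

$cmd = sprintf(
    'php %s %s > /dev/null 2>&1 &',
    escapeshellarg(__DIR__ . '/worker.php'),   // hypothetical worker script
    escapeshellarg($jobId)
);
exec($cmd);

echo json_encode(['job' => $jobId]);

// worker.php would write its progress somewhere shared as it goes, e.g.
//   file_put_contents(sys_get_temp_dir() . "/$jobId.progress", "$done/$total");
// and a small progress.php, polled via AJAX, would simply read that back.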
Note: This really isn't a great idea for a production solution, but it will work better than figuring out a recursive curl call :)
Why, in Chrome, when I hit the "STOP" button while a PHP script is executing, does it not stop execution? Even closing the tab doesn't stop it. ignore_user_abort() is false. Any ideas on how to force PHP to stop? I've got a large script that generates some big files and runs for about 10 minutes...
There is no way for a user to stop a PHP script. Since PHP runs on the server when a page is requested, you cannot stop it unless you have a condition in your script that exits when you want it to.
As I mentioned in a comment, this answer to a previous question has a lot of info on this.
But one takeaway that may be your issue: PHP may not know the client has disconnected until it tries to send data and is refused; see this bit in the docs:
the next time your script tries to output something PHP will detect that the connection has been aborted
So depending on your required output, you may be able to send some sort of "heartbeat" data to the browser that will trigger the abort if the user disconnects. If it doesn't seem to be sending, try doing some explicit buffer-clearing with flush(). If you (or your framework) are using output buffering, you may have to work around it.
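A minimal sketch of that heartbeat idea (process() and $workItems are placeholders; it only works if nothing between PHP and the browser buffers the output):

<?php
ignore_user_abort(false);       // abort on disconnect - the default, but be explicit

foreach ($workItems as $item) {
    process($item);             // placeholder for a unit of real work
    echo ' ';                   // the "heartbeat": the write is what lets PHP notice the abort
    flush();
    if (connection_aborted()) { // belt and braces
        exit;
    }
}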
If you can't do that (if you're generating an output file or some such), you'll likely have to rearrange things, since there's no way PHP can know the connection is closed. My suggestion is to use a queueing system to offload the generation to a separate script that you can then cancel or kill manually. Here's a good overview of queueing systems; I personally use beanstalkd with PHP - it's simple, easy, works splendidly, and has some good PHP libraries (I've used pheanstalk and davidpersson's beanstalk). Any time you're generating large files like that, you should probably be using a queueing system anyway.
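To make that concrete, here's a rough pheanstalk sketch (written against the older v3-style API; the constructor and some details differ in newer releases, and the tube name and generate_file() are placeholders, so treat this as a shape rather than copy-paste):

<?php
use Pheanstalk\Pheanstalk;

// Web request: just queue the job and return immediately
$queue = new Pheanstalk('127.0.0.1');
$queue->useTube('file-generation')->put(json_encode(['user' => 42, 'type' => 'report']));

// Worker (long-running CLI script started outside the web server):
$job  = $queue->watch('file-generation')->ignore('default')->reserve();
$data = json_decode($job->getData(), true);
generate_file($data);               // placeholder for the slow work
$queue->delete($job);               // or release() it to retry later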
I'm currently running a Linux-based VPS with 768MB of RAM.
I have an application which collects details of domains and then connects to a service via cURL to retrieve the PageRank of those domains.
When I run a check on about 50 domains, it takes the remote page about 3 minutes to load with all the results before the script can parse the details and return them to my code. This causes a problem, as nothing else seems to function until the script has finished executing, so users on the site just get a timer / 'ball of death' while waiting for pages to load.
(The remote page retrieves the domain details and updates itself via AJAX, but the cURL request, rightfully, doesn't return the page until loading is complete.)
Can anyone tell me if I'm doing anything obviously wrong, or if there is a better way of doing it? (There can be anything between 10 and 10,000 domains queued, so I need a process that can run in the background without affecting the rest of the site.)
Thanks
A more sensible approach would be to "batch process" the domain data using a cron-triggered PHP CLI script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set as false, the background script would then:
Scan the database for domains that aren't marked as processed.
Carry out the CURL lookup, etc.
Update the database record accordingly and mark it as processed.
...
To ensure no overlap with an already executing batch-processing script, you should invoke the PHP script from cron only every five minutes and (within the PHP script itself) check at the start of the "scan" stage how long the script has been running, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically restart after reboots, etc.) and simply add domains to the database and review the results of processing via a separate web front end.
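Sketched out, that cron-driven script might look roughly like this (the table, column and credential names and the lookup URL are invented for the example):

<?php
// batch.php - crontab entry (every five minutes): */5 * * * * /usr/bin/php /path/to/batch.php
$started = time();
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // hypothetical credentials

$pending = $db->query('SELECT id, domain FROM domains WHERE processed = 0');
foreach ($pending as $row) {
    if (time() - $started > 240) {   // bail out after ~4 minutes so runs never overlap
        exit;
    }

    $ch = curl_init('https://pagerank.example.com/?domain=' . urlencode($row['domain']));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);

    $update = $db->prepare('UPDATE domains SET processed = 1, rank_data = ? WHERE id = ?');
    $update->execute([$result, $row['id']]);
}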
This isn't the ideal solution, but if you need to trigger this process based on a user request, you can add the following at the end of your script.
set_time_limit(0);
flush();
This will allow the PHP script to continue running while still returning output to the user. But seriously, you should use batch processing; it will give you much more control over what's going on.
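For reference, the fuller version of that "respond now, keep working" trick usually also sets ignore_user_abort() and an explicit Content-Length so the browser knows the response is complete. A sketch (it can still be defeated by output buffering, gzip or a front-end proxy; on PHP-FPM, fastcgi_finish_request() is the cleaner tool when it's available):

<?php
ignore_user_abort(true);    // don't die when the client disconnects
set_time_limit(0);

ob_start();
echo json_encode(['status' => 'queued']);        // whatever the user should see immediately
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// ...the long-running work continues here, after the browser already has its response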
Firstly, I'm sorry, but I'm an idiot! :)
I've loaded the site in another browser (FF) and it loads fine.
It seems Chrome puts some sort of lock on a domain when it's waiting for a server response, and I was testing the script manually through a browser.
Thanks for all your help and sorry for wasting your time.
CJ
While I agree with others that you should consider processing these tasks outside of your webserver, in a more controlled manner, I'll offer an explanation for the "server standstill".
If you're using native PHP sessions, PHP uses an exclusive locking scheme, so only a single PHP process can handle a given session ID at a time. A long-running PHP script which uses sessions can certainly cause this.
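The usual fix, when the long-running request only needs to read the session, is to release the lock as soon as you've finished writing to it:

<?php
session_start();
$userId = $_SESSION['user_id'];   // read whatever you need up front
session_write_close();            // release the lock so other requests for this session aren't blocked

// ...the slow cURL/batch work happens here. $_SESSION is still readable,
// but writes after this point won't be saved unless you call session_start() again.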
You can search for combinations of terms like:
php session concurrency lock session_write_close()
I'm sure it's been discussed many times here. I'm too lazy to search for you; maybe someone else will come along and write an answer with bulleted lists and pretty hyperlinks in exchange for Stack Overflow reputation :) But not me :)
Good luck.
I'm not sure how your code is structured, but you could try using sleep(). That's what I use when batch processing.
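For example, something along these lines inside the batch loop ($domains and check_pagerank() are placeholders) keeps the load on the box and on the remote service down:

foreach ($domains as $domain) {
    check_pagerank($domain);   // one slow cURL lookup per domain
    sleep(1);                  // pause between requests instead of hammering the CPU/API
}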