I have a cron job running a PHP script, but there's some HTML and JavaScript that needs to execute for the actual script to work.
Converting the JavaScript to PHP isn't an option.
Basically, I need it to act as though a person is viewing the page every time the cron job runs.
EDIT:
The script uses JavaScript from a different site to encrypt some passwords so that it can log into my account on that site, and the JavaScript is thousands of lines long. The way the script flows is: send data to the website > get the data it sends back > use the site's JavaScript to alter the data > set an HTML form value to the value returned by the JavaScript function > submit the HTML form to get the info back to PHP > send the data to log me in. I know the code is very shoddy, but it's the only way I could think of that doesn't require rewriting all the JavaScript they use to encrypt the password in PHP.
You can try Node.js to run JavaScript code on the server.
Install your favorite web browser, and then have the cron job run the browser with the URL as an argument.
Something like:
/usr/bin/firefox www.example.com/foo.html
You'll probably want to wait a minute or so and then kill the process, or find a better way to detect when it finishes.
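For example, a crontab entry along these lines could do both the launching and the killing (the schedule, timeout, and URL are placeholders; the --headless flag exists in modern Firefox, and xvfb-run is an alternative on older versions):

```shell
# hypothetical crontab entry: load the page headlessly once an hour,
# killing the browser after 60 seconds in case it never exits on its own
0 * * * * timeout 60 /usr/bin/firefox --headless http://www.example.com/foo.html
```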
Cron jobs always run on the server side only. When there is no client side, how can you really expect JavaScript to work?
Anyway, the solution is: use the cron job to run another PHP script, which in turn calls the PHP script you want to run using cURL.
E.g. file1.php - the file you want to execute, where you expect the JavaScript on that page to work.
file2.php - another file you create; in this file, use cURL to call file1.php (make sure you provide the full http:// path, as you would type it in a browser; you can also pass values the way GET/POST methods on HTML forms do). In your cron job, call file2.php.
Make sure cURL is available and that no firewall rule is blocking HTTP calls (i.e. port 80) to the same server. On most servers, both conditions are fulfilled.
---------- Sorry guys, Kristian Antonsen is right, so don't consider this a full answer for the moment. However, I am leaving it up, as someone might get food for thought from it. ----------
Related
I have a bash script that can take hours to finish.
I have a web frontend for it to make it easy to use.
On this main page, I want a URL that I can press to start my PHP command:
<?php exec('myscript that takes a long time'); ?>
After the exec has finished, I want it to load a cookie.
setcookie('status',"done");
This is all easily done and works as is. However, the URL that loads my exec command is a blank white page. I don't want this. I want the URL to be an action that starts my PHP script and sets the cookie when the exec command returns, all in the background.
Is this possible?
If not, how close can I get to this behavior?
EDIT:
function foo() {
    var conn = new Ext.data.Connection();
    conn.request({
        url: 'request.php',
        method: 'POST',
        success: function(responseObject) {
            alert('Hello, World!');
        },
        failure: function() {
            alert('Something failed');
        }
    });
}
I have tried the above code with no luck.
I have a bash script that can take hours to finish
Stop there. That's your first problem. The WWW is not designed to maintain open requests for more than a couple of minutes. Maybe one day (since we now have websockets), but even if you know how to configure your webserver so this is not an off-switch for anyone passing by, it's exceedingly unlikely that the network in between or your browser will be willing to wait this long.
Running this job cannot be done synchronously with a web request. It must be run asynchronously.
By all means poll the status of the job from the web page (via a meta-refresh or an Ajax call), but I'm having trouble understanding the benefit of setting a cookie when it has completed; usually for stuff like this I'd send out an email from the task when it completes. You also need a way either to separate out concurrent tasks invoked like this or to ensure that only one runs at a time.
One solution would be to pass the PHP session id as an argument to the script, then have the script write a file named with the session id on completion (or even provide partial updates via the file); then your web page can poll the status of the job using the session id. Of course, your code should check that there isn't already an instance of the job running before starting a new one.
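A rough sketch of the job-wrapper side of that idea, assuming the web page passes its session id as the first argument and polls files under a shared directory (the directory path and file-naming scheme here are invented for illustration):

```shell
# run_job: wrapper around the long task; writes status files the web page can poll
run_job() {
    sid="$1"
    status_dir="/tmp/job-status"            # assumed path shared with the PHP page
    mkdir -p "$status_dir"
    if [ -e "$status_dir/$sid.running" ]; then
        echo "job already running for $sid" >&2   # only one instance per session
        return 1
    fi
    touch "$status_dir/$sid.running"
    # ... the hours-long work goes here; it could append partial updates too ...
    echo "done" > "$status_dir/$sid.done"
    rm -f "$status_dir/$sid.running"
}

run_job "demo-session"
```

The PHP page would then check for the presence and contents of the `.done` file on each poll.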
While the other answers are correct and the WWW is not meant for open requests, we have to consider that the WWW was never "meant" to take information either.
As programmers, we made it take information, such as logins.
Furthermore, my question was simple: how do I commit action A with result B? While sending an email would be nice and dandy, as the other post by @symcbean suggests, it's not a solution but a sidestep of the problem.
Web applications often need to communicate with the webserver to update their status. This is a contradiction, because the webserver is stateless. Cookies are the solution.
Here was my solution:
$.ajax({
    url: url,
    data: data,
    success: success,
    dataType: dataType
});

And at the end of the PHP script behind url:

setcookie('status', "done");

The url points to a PHP function page with an if statement acting as a switch statement, which runs my external script that takes a really long time. That exec call is blocking, so setcookie does not run until the script has finished.
Once it does, the cookie is set and the rest of my web application can continue working.
Situation:
My PHP/HTML page retrieves the contents of another page on a different domain every 5-10 minutes or so. I use a JavaScript setInterval() and a jQuery .load() to request content from the other domain into an element on my page. Each time it retrieves content, JavaScript compares the new content with the previous content, and then I make an Ajax call to a PHP script that sends me an email of what the changes are.
Problem:
It's all working fine and dandy, except for the fact that I need a browser constantly open, requesting the updates.
Question:
Is there a way to accomplish this with some sort of 'self-executing' script on the server? Something that I would only have to start once, and that continues to run on its own, without needing a browser to be open, for as long as I want the script to run?
Thanks in advance!
P.S. I'm not a PHP/JavaScript expert by any means, but I can find my way around.
I believe the thing you are looking for is a cron job.
If your script relies on Javascript for proper execution, you will need to use a browser to accomplish your goals.
However, if you can alter your script to perform all of the functionality via PHP, perhaps using cURL to request the necessary data, you can use a cron job to execute the script at regular intervals.
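For instance, a crontab entry like this would run such a PHP script every five minutes (the script path and name are placeholders):

```shell
# hypothetical crontab line: run the fetch-and-compare script every 5 minutes
*/5 * * * * /usr/bin/php /path/to/check-updates.php
```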
If you're running a script at an interval, I would recommend using a bash script instead that runs in the background.
#!/bin/bash
while true
do
    php script.php
    sleep 300
done
Then you can run the script with nohup bash script.sh &. (300 seconds = 5 minutes.)
Problem:
I'm trying to see if I can have a back and forth between a program running on the server-side and JavaScript running on the client-side. All the outputs from the program are sent to JavaScript to be displayed to the user, and all the inputs from the user are sent from JavaScript to the program.
Having JavaScript receive the output and send the input is easily done with AJAX. The problem is that I do not know how to access an already running program on the server.
Attempt:
I tried to use PHP, but ran into some hurdles I couldn't leap over. Now, I can execute a program with PHP without any issue using proc_open. I can hook into the stdin and stdout streams, and I can get output from the program and send it input as well. But I can do this only once.
If the same PHP script is executed again, I end up running the program again. So all I ever get out of multiple executions is whatever the program writes to stdout first, multiple times.
Right now, I use proc_open in the script which is supposed to only take care of input and output because I do not know how to access the stdout and stdin streams of an already running program. The way I see it, I need to maintain the state of my program in execution over multiple executions of the same PHP script; maintain the resource returned by proc_open and the pipes hooked into the stdin and stdout streams.
$_SESSION does NOT work. I cannot use it to maintain resources.
Is there a way to have such a back and forth with a program? Any help is really appreciated.
This sounds like a job for websockets
Try something like http://socketo.me/ or http://code.google.com/p/phpwebsocket/
I've always used Node for this type of thing, but from the above two links and a few others, it looks like there's options for PHP as well.
There may be a more efficient way to do it, but you could get the program to write its output to a text file, and read the contents of that text file in with PHP. That way you'd have access to the full stream of data from the running program. There are issues with managing the size of the file and handling requests from multiple clients, but it's a simple approach that might be good enough for your needs.
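A minimal sketch of that approach, using a shell function as a stand-in for the real long-running program (the function, file path, and output are invented):

```shell
# stand-in for the long-running program; the real one would run for hours
long_program() {
    echo "output line 1"
    echo "output line 2"
}

out=/tmp/program-output.log
: > "$out"                      # truncate on a fresh start
long_program >> "$out" 2>&1 &   # run in the background, appending as it goes
wait                            # demo only: a web request would just read the file
tail -n 1 "$out"                # the PHP side can read this same file at any time
```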
You are running the same program again because that's the way PHP works. In your case, the client makes an HTTP request that runs the script; a second request will run the script again. I'm not sure continuous interaction is possible, so I would suggest making your script able to handle discrete transactions.
In order to tie together different steps of the same "interaction", you will have to save data about the previous ones in a database. Basically, you need to give a unique hash to every client to identify them in your script; then it will know who is making the request and will be able to tell consecutive requests from one user apart from requests by different users.
If your script is heavy and runs for a long time, consider making two scripts: one heavy and one for interaction (AJAX will query the second one). In this case, the second script will fill data into the database and the heavy script will simply fetch it from there.
Is there a way to make PHP request where user doesn't have to wait for response? Some sort of "php request in background"?
For example, if application needs to send 100 emails because the user had submitted something, I don't want to show "sending... please wait" for this user, but I want some other script to do the job independent from that user...
Options:
Stick it in a db (or a file) and use a cron job to poll it (easiest, as you probably already use a db).
Use something like RabbitMQ or ØMQ (my favourite)
Spawn a separate process to do it using fork/exec (would not recommend that).
As others have suggested - fake it by using an Ajax request. Viable - but I find it ugly.
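As a sketch of the first option, using a file spool instead of a db (the directory name and job format are invented; a cron entry would run this worker every minute or so):

```shell
# worker: process and delete queued jobs; a web request drops one file per job
SPOOL=/tmp/mail-spool
mkdir -p "$SPOOL"
echo "to=alice@example.com" > "$SPOOL/1.job"   # pretend a web request queued this

for job in "$SPOOL"/*.job; do
    [ -e "$job" ] || continue                  # glob matched nothing; skip
    echo "processed $(basename "$job")"        # a real worker would send the mail here
    rm -f "$job"                               # remove the job once handled
done
```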
You could maybe send the request via Ajax; that way the UI won't freeze, and the task will be executed on the server.
You could send the request via ajax and then redirect the user elsewhere upon success. The server script will still process, but no confirmation will be given to the user.
exec('php script.php > /dev/null & echo $!', $o);
You may want to use php-cli instead of php as well.
The command above returns the process id of the background script in $o[0], so you can set something up to poll it with Ajax or similar if you want to show the user its progress.
Create a php routine to do the actual work of sending the emails and call it via http GET using the technique described here.
I am looking for a way to start a function on form submit that would not leave the browser window waiting for the result.
Example:
The user fills in the form and presses submit; the data from the form goes via JavaScript to the database, and a PHP function that will take several seconds starts, but I don't want the user to be left waiting for the end of that function. I would like to be able to take them to another page and leave the function doing its thing server-side.
Any thoughts?
Thanks
Thanks for all the replies...
I got the Ajax part, but I cannot call Ajax and then have the browser move to another page.
This is what I wanted.
-User fills form and submits
-Result from the form passed to database
-long annoying process function starts
-user carries on visiting the rest of the site, independent of the status on the "long annoying process function"
By the way, and before someone suggests it: no, it cannot be done by a cron job.
Use AJAX to call the PHP script, and at the top of the script turn on ignore_user_abort:
ignore_user_abort(true);
That way, if they navigate away from the page, the script will continue running in the background. You can also use
set_time_limit(0);
to remove the execution time limit entirely, which is useful if you know your script will take a while to complete.
The most common method is:
exec("$COMMAND > /dev/null 2>&1 &");
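The `> /dev/null 2>&1 &` part discards the command's output and detaches it from the request. A self-contained illustration of the idiom, with a short sleep standing in for the real command:

```shell
# run a stand-in command in the background, discarding its output
( sleep 1; echo "finished" > /tmp/bg-demo.txt ) > /dev/null 2>&1 &
bg_pid=$!          # PHP's exec() can capture this pid by appending `echo $!`
wait "$bg_pid"     # demo only: the PHP request would return immediately instead
cat /tmp/bg-demo.txt
```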
Ah, OK. You're essentially asking, then, whether PHP supports threading, and the general answer is no... however...
there are some tricks you can perform to mimic this behaviour, one of which is highlighted above and involves forking out to a separate process on the server. This can be achieved in a number of ways, including the
exec()
method. You may also want to look here:
PHP threading
I have also seen people try to force a flush of the output buffer halfway through the script, attempting to push the response back to the client. I don't know how successful this approach is, but maybe someone else will have some information on it.
This is exactly what AJAX (shorthand for asynchronous JavaScript + XML) is for;
AJAX Information
It allows you to write client-side code that sends asynchronous requests to your server, such that the user's browser is not interrupted by an entire page request.
There is a lot of information relating to AJAX out there on the web, so take a deep breath and get googling!
Sounds like you want to use some of the features AJAX (Asynchronous Javascript and XML - google) have to offer.
Basically, you would have a page with content. When a user clicks a button, JavaScript is used to POST data to the server and begin processing. Simultaneously, that JavaScript might load a page from the server and then display it (e.g. load data, then replace the contents of a DIV with that new page).
This kind of thing is the premise behind AJAX, which you see everywhere when you have a web page doing multiple things simultaneously.
Worth noting: this doesn't mean that the script is running "in the background on the server." Your web browser is still maintaining a connection with the web server, which means that the code is running in the "background" on the client's side. And by "background" we really mean "processing the HTTP request in parallel with other HTTP requests to give the feel of a 'background' running process."