I have searched and found some suggestions for my question, but almost all of them execute PHP files, so I don't know if that has something to do with it not working for me.
My goal is for my webpage to load completely without waiting for my script (which takes x amount of time) to finish, but it won't do so with this line of code. Is there something I'm missing? I have seen this answer in many places and it seems to work for others.
<?php
exec("sudo ./EscalonVel 50 2 100 10 20 &> /dev/null &");
echo "Hello";
?>
If you have to show, on the page, data calculated by the C program, then "someone" has to wait until it has finished before showing it. If the data comes directly from the executed command (its output/stdout), then you cannot background the command with &: the output might arrive after the request has been dispatched, and backgrounding the process disconnects its output from the actual execution. So you have 2 options:
Show the page "template" completely and prepare it to accept the contents at the very end of your HTML (suboptimal but possible)
Make it a 2-request page: the first request shows the template (the full page) and the second (AJAX) fills it in with data from the command execution. Depending on how you write it, the AJAX can make several requests until the command has terminated, which is preferable if the script runs for more than 20 seconds. Then you'll need some kind of process checking and a backend to save (written by the command) and read (by the AJAX request) the data, as a file or in a database (see the sketch below).
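As a rough sketch of option 2 (the script names start.php and check.php and the file /tmp/result.txt are assumptions for this illustration, not something from your setup): the command is backgrounded with its output redirected into a file, and a separate endpoint is polled by AJAX until that file has content.

<?php
// start.php -- launch the long-running command detached from the request;
// redirecting its output lets exec() return immediately
exec("sudo ./EscalonVel 50 2 100 10 20 > /tmp/result.txt 2>&1 &");
echo "Hello";
?>

<?php
// check.php -- polled by AJAX every few seconds
clearstatcache();
$file = '/tmp/result.txt';
if (is_file($file) && filesize($file) > 0) {
    header('Content-Type: text/plain');
    readfile($file);            // send whatever the command wrote
    // (a real setup would also verify that the process has actually exited)
} else {
    http_response_code(204);    // nothing yet -- the AJAX caller should retry later
}
?>

The AJAX side simply keeps requesting check.php every few seconds until it gets a non-empty response.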
Hope it helps!
I have created a script where I get user reviews from Yelp, Google, etc. using their APIs.
I am using these reviews for part-of-speech tagging, kind of a natural language processing thing.
I run a loop which continuously keeps fetching reviews and extracts the nouns and adjectives from each.
I am using this tutorial:
http://phpir.com/part-of-speech-tagging
When I execute the script, it abruptly stops after processing 40-50 reviews. It does not show any error. Is it because
"php has run out of space in memory"
as per one of the comments on the above link, or some other issue? When I tried my script with a limited number of reviews, it worked fine.
Here is the link where I execute my script:
http://ec2-54-186-110-98.us-west-2.compute.amazonaws.com/scrap/getreview.php
Try adding this to your script -
ini_set("memory_limit", "-1");
set_time_limit(0);
This happened to me when I fetched some MEDIUMTEXT columns from MySQL. Are you doing something like that?
Yesterday I created a script which works fine, but only with an open web browser, which isn't what I wanted. What I want is for the script to run all the time, even with the browser closed.
I could not upload a picture, so here is a short sketch:
(lookup.php) -> pass var data1 -> (run_code.php) -> pass var data1 ->
(check.php) = {{refreshes every 5 seconds till var data2 exists in
MySQl.}} -> goto -> lookup.php.....
The only problem is that I have no idea how to send a value from one .php file to another without GET, POST, a cookie or a session. A session would be the best option, but "lookup.php" is looking for a value, and if I start a session I get the error "header is already set".
So, how should I pass these values from "lookup.php" to "run_code.php"?
The second problem is that "check.php" is code that checks whether a value exists in MySQL. This code refreshes and executes itself after 5 seconds using a META REFRESH, but this also does not work without a browser.
How should I write the script so that it executes itself after a given time?
If I understand correctly, you want to write a script (and chose PHP probably because you're familiar with its syntax) and you need it to run every X minutes.
The problem is that when you write a web site with PHP you can pass information using HTTP methods (GET/POST) and using the session,
but when running a script directly on the machine there is no HTTP traffic and no session (this can be simulated, but it's a bit complicated).
My suggestion is:
1) Combine all the PHP files into 1 long PHP file that will be run (this way you can work with variables in the script with no problem) - you can copy-paste your code into 1 PHP file, or include the needed files in your script so you can still keep the original files.
2) Add a cron job (if it's a Linux system) - http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/ - this way your script will run every X minutes (your choice); a sketch of such an entry follows below.
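As a sketch, a crontab entry for option 2 could look like the line below; the path /var/www/myscript.php, the 5-minute interval and the log file are placeholders, and the PHP CLI binary location may differ on your system:

# run the combined script every 5 minutes via the PHP command-line interpreter
*/5 * * * * /usr/bin/php /var/www/myscript.php >> /var/log/myscript.log 2>&1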
I am using PHP and AJAX requests to get the output of a program that is always running and print it on a webpage at 5 second intervals. Sometimes this log file can get up to 2mb in size. It doesn't seem practical to me for the AJAX request to fetch the whole contents of this file every 5 seconds if the user has already gotten the full contents at least once. The request just needs to get whatever contents the user hasn't gotten in a previous request.
Problem is, I have no clue on where to begin to find what contents the user hasn't received. Any hints, tips, or suggestions?
Edit: The output from the program starts off with a time (HH:MM:SS AM/PM); everything after that has no pattern. The log file may span several days, so there might not be just one "02:00:00 PM" in the file, for example. I didn't write the program that is being logged, so there isn't a way for me to modify the format in which it prints its output.
I think using a HEAD request might get you started along the right path.
check this thread:
HTTP HEAD Request in Javascript/Ajax?
If you're using jQuery, it's a simple type change in the AJAX call:
$.ajax({url: "some url", type: "HEAD".....});
Personally, I would check the file size and date modified against the previous response, and fetch the new data if it has been updated. I'm not sure if you can fetch only parts of a file via AJAX, but I'm sure this can be accomplished via PHP pretty easily; this thread may help:
How to read only 5 last line of the text file in PHP?
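One way to do the "only the new part" idea in PHP is to have the client remember the byte offset it has already received and send it back on each poll. This is only a sketch; tail.php, the offset parameter and the log path are assumptions for the example:

<?php
// tail.php -- return only the bytes of the log the client has not seen yet
$logfile = '/var/log/myprogram.log';                 // assumed path to the program's log
$offset  = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

clearstatcache();
$size = filesize($logfile);
if ($offset > $size) {
    $offset = 0;                                     // log was rotated or truncated, start over
}

$fh = fopen($logfile, 'rb');
fseek($fh, $offset);
$new = stream_get_contents($fh);                     // everything appended since the last poll
fclose($fh);

header('Content-Type: application/json');
echo json_encode(array('offset' => $size, 'data' => $new));

The AJAX caller stores the returned offset and passes it on the next request, so each 5-second poll only transfers the newly appended contents instead of the whole 2 MB file.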
It depends on how your program is made and how it prints your data, but you can use timestamps to reduce the amount of data. If you have some kind of IDs, you should probably use them instead of timestamps.
I've got a script in PHP that continually grows an array as its results are updated. It executes for a very long time on purpose, as it needs to filter a few million strings.
As it loops through results it prints out strings and fills up the page until the scroll bar is super tiny. Instead of printing out the strings, I want to just show the number of successful results dynamically as the php script continues. I did echo(count($array)); and found the number at 1,232,907... 1,233,192 ... 1,234,874 and so forth printed out on many lines.
So, how do I display this increasing php variable as a single growing number on my webpage with Javascript?
Have your PHP script store that number somewhere, then use AJAX to retrieve it every so often.
You need to find a way to interface with the process, to get the current state out of it. Your script needs to export the status periodically, e.g. by writing it to a database.
The easiest way is to write the status to a text file every so often and poll this text file periodically using AJAX.
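A minimal sketch of the writing side (the file name status.txt and the every-1000-items interval are arbitrary choices for this example, and $strings / passes_filter() stand in for your own data and filtering logic):

<?php
// long_job.php -- the long-running filter loop, periodically dumping its progress to a file
$results    = array();
$statusFile = __DIR__ . '/status.txt';

foreach ($strings as $i => $s) {                            // $strings: your few million strings
    if (passes_filter($s)) {                                // your existing filtering logic
        $results[] = $s;
    }
    if ($i % 1000 === 0) {
        file_put_contents($statusFile, count($results));    // overwrite with the current count
    }
}
file_put_contents($statusFile, count($results));            // write the final count

A separate, tiny script can then read status.txt and return the number to the page.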
You can use the Forever Frame technique. Basically, you have a main page containing an iframe. The iframe loads gradually, intermittently adding an additional script tag. Each script tag modifies the content of the parent page.
There is a complete guide available.
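A bare-bones sketch of the idea (stream.php, the counter element id and the status.txt file written by the worker are all made-up names for this illustration): the parent page embeds a hidden iframe, and the iframe's PHP script keeps emitting script tags that update the parent as the count grows.

<!-- main page: a counter element plus a hidden iframe that streams updates into it -->
<span id="counter">0</span>
<iframe src="stream.php" style="display:none"></iframe>

<?php
// stream.php -- the "forever frame": a response that never quite finishes
while (ob_get_level()) { ob_end_flush(); }           // make sure output buffering doesn't hold data back
for ($i = 0; $i < 300; $i++) {                       // cap the stream's lifetime
    $count = (int)@file_get_contents('status.txt');  // progress written by the long-running script
    echo "<script>parent.document.getElementById('counter').innerHTML = '$count';</script>\n";
    flush();
    sleep(2);                                        // update every couple of seconds
}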
That said, there are many good reasons to consider doing more pre-computation (e.g. in a cron job) to avoid doing the actual work during the request.
This isn't what you're looking for (I'm just as interested in an answer to this...), but a solution that I've found works is to keep track of the count server-side and only print every 1000/5000/whatever number works best, rather than one by one.
I'd suggest that you have a PHP script that returns the value in JSON format. Then, in another PHP page, you can make an AJAX call to that script, fetch the JSON value and display it. Your AJAX call can be programmed to run perhaps every 5 seconds or so, depending on how fast your numbers change. The iframe, though easier, is a bit outdated.
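A minimal sketch of that suggestion (count.php, status.txt and the #counter element are placeholder names, and jQuery is assumed to be loaded): the endpoint reads the count that the long-running script writes out, and the page polls it every 5 seconds.

<?php
// count.php -- expose the current count as JSON
header('Content-Type: application/json');
echo json_encode(array('count' => (int)@file_get_contents('status.txt')));

<script>
// on the page: poll count.php every 5 seconds and display the growing number
setInterval(function () {
    $.getJSON('count.php', function (data) {
        $('#counter').text(data.count);
    });
}, 5000);
</script>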
Some operations take too much time, which lead the ajax request to time out.
How do I finish responding to the request first, then continue that operation?
The ignore_user_abort directive and ignore_user_abort() function are probably what you are looking for: they should allow you to send the response to the browser and, after that, still run some calculations on your server.
This article about it might interest you: How to Use ignore_user_abort() to Do Processing Out of Band; quoting:
EDIT 2010-03-22 : removed the link (was pointing to http:// ! waynepan.com/2007/10/11/ ! how-to-use-ignore_user_abort-to-do-process-out-of-band/ -- remove the spaces and ! if you want to try ), after seeing the comment of #Joel.
Basically, when you use ignore_user_abort(true) in your php script, the script will continue running even if the user pressed the esc or stop on his browser. How do you use this? One use would be to return content to the user and allow the connection to be closed while processing things that don’t require user interaction.
The following example sends out $response to the user, closing the connection (making the browser’s spinner/loading bar stop), and then executes do_function_that_takes_five_mins();
And the given example:
ignore_user_abort(true);                             // keep running even if the client disconnects
header("Connection: close");                         // tell the browser the connection will be closed
header("Content-Length: " . mb_strlen($response));   // length of the body (strlen() gives the byte length if the content can be multi-byte)
echo $response;
flush();                                             // push the response out to the browser now
do_function_that_takes_five_mins();                  // the long processing continues server-side
(There's more I didn't copy-paste)
Note that your PHP script still has to fit in the max_execution_time and memory_limit constraints -- which means you shouldn't use this for manipulations that take too much time.
This will also use one Apache process -- which means you should not have dozens of pages that do that at the same time.
Still, a nice trick to enhance user experience, I suppose ;-)
Spawn a background process and return background process id so user can check on it later via some secret URL.
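For example, a sketch of that approach (the job id scheme, the /tmp output path, worker.php and check_job.php are all made up for illustration):

<?php
// start_job.php -- kick off the work in the background and hand back an id the user can poll
$jobId  = uniqid('job_', true);
$output = "/tmp/$jobId.out";

// nohup + & detaches the worker; redirecting its output lets exec() return at once
exec("nohup php worker.php " . escapeshellarg($jobId) . " > " . escapeshellarg($output) . " 2>&1 &");

echo "Job started. Check progress at check_job.php?id=" . urlencode($jobId);

check_job.php would then report whether /tmp/<id>.out exists and what it contains.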
Sort of depends on what you're trying to accomplish.
In my case, I needed to do a bunch of server-side processing, with only a minimal amount of data being sent back to the browser - summary info really.
I was trying to create a message sender that sends out an email to over 250 people, but possibly many more (depending on how many have registered with the system).
The PHP mail handler is quick, but for large numbers not quick enough, so it was bound to time out. To get around that, I needed to delay the timeout on the server/PHP side and keep the browser hanging on till all the data was summarized and displayed.
My solution - a teaser.
Essentially, I gave the user a message stating some initial stats (attempting to send this many emails), created 2 DIV boxes (one for current status info, the second for final summary info), displayed the page footer, started the processing, and when finished, updated the summary info. It goes as follows:
Start by collecting the data you're going to process and get some summary info. In my case, I pulled the list of email addresses, validated them and counted them.
Then display the "attempting" info:
echo "Message contents:";
echo "<blockquote>$msgsubject<p>$msgbody</blockquote><p> </p>";
echo "Attempting <strong>" . $num_rows . "</strong> email addresses.";
Now create some DIVs for status/final info:
<div id="content" name="content">
<div id="progress" name="progress" style="border: black 1px solid; width: <?php echo $boxwidth; ?>px; height: 20px;"></div>
<br>
</div>
where $boxwidth is the width you'd like your progress bar to be (explained later)
Notice that they are essentially empty - we'll fill them later.
Finally, fill out the rest of the page by displaying the footer of the page (in my case I just "included" the appropriate file).
Now, all that is still hanging out in the page buffer, either on the server (if PHP is buffering) or the browser (because it hasn't been told we're done yet), so let's cause it to get pushed and/or displayed using the "ignore" and "flush" from above:
ignore_user_abort(true);   // keep going even if the browser disconnects
flush();                   // push everything buffered so far out to the browser
Now that the browser is starting to display stuff, we need to give the user something to see, so here's the tease - we'll create a status bar that we'll display in the inner DIV, and a final message for the outer DIV.
So, as you loop through your data, periodically (I'll leave "how often" up to you to figure out), output the following:
set_time_limit(3);   // restart the timeout counter: 3 more seconds from this point
...
(process your data here)
...
echo "<script>document.getElementById('progress').innerHTML += \"<img src='images/progress_red.gif' width='$width' height='20' border='0'>\"</script>";
flush();
This will essentially stack up the little 4x20 progress_red images next to each other, making it appear that a red bar is moving across the screen. In my case, I did this 100 times, based on the percentage of the total I had processed so far. The set_time_limit(3) call restarts the timeout counter, giving the script another 3 seconds from that point, so you know your script won't run into the PHP time-limit wall. Adjust the 3 seconds for your application as needed.
Even though the page is technically "complete", the JavaScript code will update the DIV HTML and our progress bar will "move".
When all done, I then report how many items were actually processed, add that to the outer DIV HTML, and the page is complete:
echo "<script>document.getElementById('content').innerHTML += \"Message was sent to $sentcnt people.\"</script>";
The browser is happy because it has had all its requirements met (the end of the page, syntactically), the user sees something happening right up to the end, and the server is free to finish its processing.
I tend not to use JavaScript if I can help it, but in this case there was no fancy CSS stuff being done through the JavaScript, so it's pretty straightforward, easy to see what's going on, and low overhead on the browser side. And it functions pretty smoothly.
I don't claim this is perfect, but for a simple, low overhead solution, it works for me without having to create cronjobs, extra queuing mechanisms or secondary processes.
You need the flush() call to have the response sent to the browser immediately.