"Loading Record Number [##] of [Total]" progress bar - php

I have a problem that may not have a solution, but hopefully someone out there can figure this out.
I developed a website in PHP/MySQL that uses HTML/CSS to process payroll. When the user submits the payroll for the past (2 week) period, it processes each employee's hours. For companies with <50 employees, it can process it pretty fast, but for companies with over 100 employees, it can take quite a while to process. What I would like ideally is not a generic 'Loading' bar or an estimated '35% loaded' bar since each company's payroll will vary greatly in employee numbers.
The best solution would be to pass the total record count from the PHP/MySQL processor as soon as they submit the pay period, then update the number as each employee is processed, so the user would see, for example, "Processing Employee 35 of 134", where '35' increments as each record is processed. Or, if that's not possible, I'd even be fine with a dynamic list such as:
Processing Employee 1 of 134
Processing Employee 2 of 134
Processing Employee 3 of 134
Processing Employee 4 of 134
and so on ...
Ajax or JavaScript seem to be the best options, but I can't yet figure out how to use them for this. Any help or insight would be greatly appreciated. I'll continue looking and will update this post if I find anything as well.

I've done this by calling flush() in PHP while iterating through the batch, but you could get fancier and update a hidden field, then have a JavaScript function on a setTimeout check that value and update a progress bar.
http://php.net/manual/en/function.flush.php
And progress bar:
http://docs.jquery.com/UI/Progressbar
What I would do with the dynamic list is:
$count = 0;
// run the db query that returns the records to process
$result = mysql_query("SELECT ..."); // placeholder query
// start output
echo "<ul>";
// iterate through the records and do the per-record work (e.g. the dynamic insert)
while ($row = mysql_fetch_assoc($result)) {
    // ... process this record ...
    $count++;
    echo "<li>Processed " . $count . " records.</li>";
    flush();
}
// end output
echo "</ul>";
If you only want to update every so many records, then, as you stated, get a total count first and use the modulus operator in an if clause. For example, if you had 50 records and wanted to update every 5: if ($count % 5 == 0) { echo ...; flush(); }
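A minimal sketch of that throttled version, assuming $result is the query result from the snippet above and $total was fetched beforehand (e.g. with a COUNT(*) query; 134 here is just the number from the question):
$count = 0;
$total = 134; // assumed: fetched beforehand, e.g. with a COUNT(*) query
echo "<ul>";
while ($row = mysql_fetch_assoc($result)) {
    // ... process this record ...
    $count++;
    // only print a status line every 5 records, plus the final one
    if ($count % 5 == 0 || $count == $total) {
        echo "<li>Processed " . $count . " of " . $total . " records.</li>";
        flush();
    }
}
echo "</ul>";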

You would have to combine what Mike S. suggested with quick ajax calls (say, every 500 ms). You could make an ajax call to a text file that is written to by your PHP file...
For example:
<?php
$count = 0;
mysql_connect('blahblah'); // placeholder connection details
// run the query for the records to process
$query = mysql_query("SELECT ...");
while ($rs = mysql_fetch_assoc($query)) {
    // ... process this record ...
    ++$count;
    // overwrite the progress file with the number of records done so far
    $fh = fopen('filename.txt', 'w');
    fwrite($fh, $count);
    fclose($fh);
}
?>
Then make an ajax call every 500 ms (or more often) to that filename.txt file and read its contents to see how far along the processing is. You could even write the file contents as [current_count]-[total_count] (15-155 for record 15 of 155 total records) and do results.split('-') in your JavaScript code.
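If you would rather not serve the text file directly, here is a minimal sketch of a PHP endpoint (hypothetically named progress.php) that the poll could request instead:
<?php
// hypothetical progress.php - the page your 500 ms ajax poll would request;
// it simply relays whatever the batch script last wrote to filename.txt
header('Content-Type: text/plain');
$progress = @file_get_contents('filename.txt');
echo ($progress !== false) ? $progress : '0-0';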

My approach would be to store the total number of records and the current record number in session variables. Then set up a php page that returns the text/html of "Processing employee $currentRec of $totalRec".
When you submit the request from the main page, display a div on the page to show the status message. Fire off an ajax request to process the data and have it hide the div when it is complete. The code that processes the records can update the session variables as it goes along. At the same time, fire off a periodic ajax request that gets the status message and updates the div's contents with the response, and have this continue until the div is no longer visible. The status message will update as often as you like based on how you set up the update timer.
The exact implementation would depend on whether you are using jQuery, Prototype, plain JavaScript, etc.
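A minimal sketch of that status page, assuming the processing script stores its counters in hypothetical session keys named currentRec and totalRec, and that it calls session_write_close() after each update so this page isn't blocked by the session lock:
<?php
// hypothetical status.php - returns the text the polling ajax call displays
session_start();
$currentRec = isset($_SESSION['currentRec']) ? (int)$_SESSION['currentRec'] : 0;
$totalRec   = isset($_SESSION['totalRec'])   ? (int)$_SESSION['totalRec']   : 0;
session_write_close(); // release the session lock right away; this page only reads
echo "Processing employee $currentRec of $totalRec";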

Related

run the same script but with a different variable each given period of time

Let's say I have a text file that has a list of urls, from which social media comments must be parsed regularly. I don't want to parse comments from all pages at once as that's a significant load. I need to run my script with a different $url variable corresponding to a line from that text file each 5 minutes.
So it must take the first line as $url and complete the script using this variable; after 5 minutes, $url must change to the second line from that file and the script must run with it; in another 5 minutes the same must be repeated for the third line, and so on. When it reaches the last line, it must start from the beginning.
Sorry, can't show any attempts, because I have no idea how to implement it, and I couldn't come up with an appropriate search request either.
As a first step, set up a cron job (e.g. cron.php) that will be executed every 5 minutes.
crontab
*/5 * * * * php /path_to_your_cron_php/cron.php
Let's assume that you have your urls in a file named file.txt, in this simple format.
file.txt
https://www.google.com/
https://www.alexa.com/
https://www.yourdomain.com/
Let's create a file, index.txt, where we will keep the index of the url we want to process next; it will have just one line with one value.
index.txt
0
cron.php
<?php
$fileWithUrl = '/path/to/your/file.txt';
$indexFile   = '/path/to/your/index.txt';
$index = (int)file_get_contents($indexFile);
// read the urls, dropping newlines and empty lines
$urls = file($fileWithUrl, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$maxIndex = count($urls);
if ($index >= $maxIndex) {
    $index = 0; // the list may have shrunk since the last run
}
$url = $urls[$index];
your_parse_function($url);
// store the next index, wrapping back to 0 after the last url
file_put_contents($indexFile, ($index + 1 >= $maxIndex) ? 0 : $index + 1);
As you can see, this script reads the contents of file.txt and index.txt, converts the first into an array of urls, and casts the contents of index.txt to an integer index.
After your_parse_function() has run, the script replaces the content of index.txt with the incremented index, or resets it to 0 once the end of the url list in file.txt is reached.
Since variables don't persist between runs, you'd need to keep track, outside of your code, of the urls you have already parsed and the ones that remain.
The most efficient way would be to have a semaphore table with each URL on a single row, paired with a parsed/pending flag.
Each time the cron runs, select a single row from the semaphore table which is flagged pending:
assuming it's done on mysql:
select url
from semaphore
where status='pending'
limit 1;
This will select one (whichever one) url that has yet to be parsed. Take that as input for your parser and, after parsing, update the flag to parsed so it's not selected again.
Other approaches would be to keep a counter on a text file or a database table. Each time the cron runs, check what the counter is and process the next number. After processing, update the counter to the current value + 1.
EDIT:
This may be a simple way to solve your re-iteration with a variable list of URLs
1.- Create a table with the following fields:
id, url, status (pending/parsed), last_updated (datetime)
2.- on each run of your cron:
select url from semaphore where status='pending' order by last_updated asc limit 1
3.- if a url is returned, process that. Upon completion, update the status to parsed and last_updated to the current timestamp.
if nothing is returned, update every row to status = pending (but not the last_updated field) and then re-run the above query.
By doing this, you can be sure that when starting over, you'll first process the url that has been "waiting" the longest.
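A rough sketch of that cron step in PHP, assuming the semaphore table described above and using the same mysql_* style as the rest of this page; parse_url_contents() is a hypothetical stand-in for the actual parser:
<?php
mysql_connect('blahblah'); // placeholder connection details
mysql_select_db('mydb');   // placeholder database name

// pick the pending url that has been waiting the longest
$result = mysql_query("SELECT id, url FROM semaphore WHERE status = 'pending' ORDER BY last_updated ASC LIMIT 1");

if (mysql_num_rows($result) == 0) {
    // nothing pending: flag every row as pending again and start a new pass
    mysql_query("UPDATE semaphore SET status = 'pending'");
    $result = mysql_query("SELECT id, url FROM semaphore WHERE status = 'pending' ORDER BY last_updated ASC LIMIT 1");
}

$row = mysql_fetch_assoc($result);
parse_url_contents($row['url']); // hypothetical stand-in for the actual parser

// mark this url as done and record when it was processed
mysql_query("UPDATE semaphore SET status = 'parsed', last_updated = NOW() WHERE id = " . (int)$row['id']);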
PHP is pretty stateless by default, so once a script has finished executing, everything is wiped.
What I would do: Try a for loop, and use PHP's sleep() function for a break in between URLs. You can either run that loop as a cron job (better), or put it in a while (true) loop and never let it "finish".
https://secure.php.net/manual/en/function.sleep.php
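A minimal sketch of the never-finishing variant, assuming the urls live one per line in file.txt and a hypothetical parse_url_contents() does the actual work:
<?php
set_time_limit(0); // let the script run indefinitely

while (true) {
    $urls = file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($urls as $url) {
        parse_url_contents($url); // hypothetical stand-in for the actual parser
        sleep(300);               // wait 5 minutes before the next url
    }
    // when the list is exhausted, the outer loop starts over from the top
}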
If you want to do this with only the things you currently are using (PHP and that text file), you could just remove that first line from the text file when you process it and then append it back to the end once you're done. You'd either have to open two successive file handles or seek to the end of the file using one, but you wouldn't need any additional data structures/SQL/what have you. Make the text file itself rotate while you blindly fire cron every five minutes.
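A rough sketch of that rotation (instead of juggling two file handles, this version simply rewrites the whole file, which is fine for a short list); parse_url_contents() is again a hypothetical stand-in for the actual parser:
<?php
$file = 'file.txt';
$urls = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

// take the first url off the top of the list and process it
$url = array_shift($urls);
parse_url_contents($url); // hypothetical stand-in for the actual parser

// append it back to the end so the file rotates in place
$urls[] = $url;
file_put_contents($file, implode("\n", $urls) . "\n");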

Mysql_query how to use it with a "dynamic" database?

Today I'm working on my website, trying to display the last winner of the game. Here's my code:
$lastgame = fetchinfo("value","info","name","current_game");
$winnercost = fetchinfo("cost","games","id",$lastgame-1);
$winnerpercent = round(fetchinfo("percent","games","id",$lastgame-1),1);
$winneravatar = fetchinfo("avatar","users","steamid",$lastwinner);
$winnername = fetchinfo("name","users","steamid",$lastwinner);
echo '<p>Game #'.$lastgame.'</p>';
echo '<p>'.$winnername.' won the jackpot</p>';
echo '<p> valued at '.$winnercost.'</p>';
echo '<p>with a winning chance of '.$winnerpercent.'</p>';
The point is, I only use fetchinfo, so it displays the information but not in real time; I have to refresh my page to display the latest winner. I'll need to make a mysql_query, I guess.
My problem is that I don't understand how to use mysql_query, knowing that each time someone wins it creates a new row in my table. For example:
id : 1
startime :1441330544
total price : 3.17
Winner : Foxy
steam id : 76561198042016725
Percent chances to win : 98.7381703
Number of total item : 2
module : 0.24973750
Does anyone have a solution to help me? This -1 in
,$lastgame-1),1);
gives me some difficulties :(
The result on the website atm :
Game #4
Foxy won the jackpot
valued at 3.31
with a winning chance of 94.6
Based on the information you provided, I think the simplest query you could use would be something like:
select * from <tblname> order by id desc limit 1;
For the auto refresh you could do one of two things:
1) you could add: <meta http-equiv="refresh" content="5"> in the header of your html and the page will automatically refresh every 5 seconds. (basic)
2) you could use ajax to make a call to execute the query and update the page which is a much nicer user experience (more advanced)
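For option 2, here is a rough sketch of the API page the ajax call could hit (hypothetically named latest_winner.php, keeping the mysql_* style of the question; <tblname> and the credentials are placeholders):
<?php
// hypothetical latest_winner.php - returns the newest row as JSON for the ajax poll
mysql_connect('localhost', 'user', 'pass'); // placeholder credentials
mysql_select_db('mydb');                    // placeholder database name

$result = mysql_query("SELECT * FROM <tblname> ORDER BY id DESC LIMIT 1"); // <tblname> is still a placeholder
$winner = mysql_fetch_assoc($result);

header('Content-Type: application/json');
echo json_encode($winner);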
Make an API page with PHP, or any other preferred server-side language, that when called gives you the most up-to-date query results it can.
Then create a loop in JavaScript that sends an ajax request to that API page every x seconds; using the returned data, you can dynamically update your fields with jQuery or plain JS (it really doesn't matter; you probably know how to update/change text).
Here are some resources: ajax getting started (link 1) and more ajax (link 2).

ajax sum total number of VIEWS

I have a statistic on my page which shows the TOTAL NUMBER OF ARTICLE VIEWS...
Here is the code I wrote to get the total number of views:
$db = JFactory::getDBO();
$total_views = "SELECT SUM(times_viewed) FROM #__hdflv_upload";
$db->setQuery($total_views);
$result = $db->loadResult();
What I'm trying to achieve is to UPDATE this total number of views on the page via AJAX, so if I get, for example, +10 views in the last 5 seconds, the total number of views must increase by 10... without a page reload.
I searched on Google etc. but didn't find anything close to this that can help me... can somebody PLEASE give me a hand? THANK YOU VERY MUCH.
On your statistic page, you could set a JS timer which uses .ajax to get the SQL results every 5 seconds. The script it calls can return the query result in JSON format, then update your DOM accordingly.
There's a pretty good answer at Joomla Stackexchange which should help you get a Joomla AJAX call working properly to run the SQL and return the result.
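A minimal sketch of the server-side script that the timer could poll, reusing the exact query from the question; how you expose it inside Joomla is what the linked answer covers:
<?php
// hypothetical views.php - returns the current total as JSON for the polling call
$db = JFactory::getDBO();
$db->setQuery("SELECT SUM(times_viewed) FROM #__hdflv_upload");
$result = $db->loadResult();

header('Content-Type: application/json');
echo json_encode(array('total_views' => (int)$result));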

Best way to handle a large while loop in PHP

I have a script that runs via CRON that processes each row (or user) in one of the tables in my databases, then uses cURL to pull a URL based on the username found in the row, and then adds or updates additional information into the same row. This works fine for the most part, but seems to take about 20 minutes+ to go through the whole database and it seems to go slower and slower the farther it is into the while loop. I have about 4000 rows at the moment and there will be even more in the future.
Right now a simplified version of my code is like this:
$i = 0;
while ($i < $rows) {
    $username = mysql_result($query, $i, "username");
    curl_setopt($ch, CURLOPT_URL, 'http://www.test.com/'.$username.'.php');
    $page = curl_exec($ch);
    preg_match_all('/htmlcode/', $page, $test); // placeholder pattern
    foreach ($test as $test3) {
        $test2 = $test[$test3][0];
    }
    mysql_query("UPDATE user SET info = '$test2' WHERE username = '$username'");
    ++$i;
}
I know MySQL queries shouldn't be in a while loop, and it's the last query left for me to remove from it, but what is the best way to handle a while loop that needs to run over and over for a very long time?
I was thinking the best option would be to have the script run through the rows ten at a time and then stop. For instance, since I have the script in CRON, I would like to have it run every 5 minutes, go through 10 rows, stop, and then somehow know to pick up the next 10 rows when the CRON job starts again. I have no idea how to accomplish this, however.
Any help would be appreciated!
About loading the data step by step:
You could add a column "last_updated" to your table and update it every time you process a row. Then you compare that column with the current timestamp before you fetch that user's page again.
Example:
mysql_query("UPDATE user SET info = '$test2', last_updated = ".time()." WHERE username = '$username');
And when you load your data, select only the rows that haven't been refreshed recently: "WHERE last_updated < (time() - $time_since_last_update)"
What about dropping the 'foreach' loop?
Just use the last element of the $test array.
LIMIT and OFFSET are your friends here. Keep track of where you are through a DB field as suggested by Bastian or you could even store the last offset you used somewhere (could be a flat file) and then increase that every time you run the script. When you don't get any more data back, reset it to 0.
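A rough sketch of the flat-file variant, assuming the offset lives in a hypothetical offset.txt and the cURL + UPDATE work from the question is wrapped in a hypothetical process_user() function:
<?php
$batchSize  = 10;
$offsetFile = 'offset.txt'; // hypothetical flat file holding the last offset
$offset     = (int)@file_get_contents($offsetFile);

// fetch only this run's slice of users
$query = mysql_query("SELECT username FROM user LIMIT $batchSize OFFSET $offset");

if (mysql_num_rows($query) == 0) {
    // no rows left: start over from the beginning on the next run
    file_put_contents($offsetFile, 0);
    exit;
}

while ($row = mysql_fetch_assoc($query)) {
    process_user($row['username']); // hypothetical wrapper for the cURL + UPDATE work
}

// remember where to pick up next time
file_put_contents($offsetFile, $offset + $batchSize);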

Storing and Displaying Live Stats

Say we are a site receiving massive amounts of traffic, Amazon.com size traffic. And say we wanted to display a counter on the home page displaying the total number of sales since December the first and the counter was to refresh via ajax every 10 seconds.
How would we go about doing this?
Would we have a summary database table holding the total sales, where each checkout adds +1 to the counter, and fetch that number every 10 seconds? Would we COUNT() the entire 'sales' table every 10 seconds? Is there an external API I can push the stats to and then pull from via ajax?
Hope you can help, Thanks
If your site is ecommerce-based, in that you are conducting sales, then you MUST have a sales tracking table somewhere. You could simply make the database count part of the page render when a user visits or refreshes your site.
IMO, there is no need to ajax this count as most visitors won't really care.
Also, I would recommend this query be run against a readonly (slave) database if your traffic is truly at amazon levels.
I would put triggers on the tables to maintain the counter tables. When a new sale is inserted, the sum table gets the new value added to the row for the current day. That also gives you sales per day historically without actually querying the big table.
It also allows orders to be entered manually for dates other than today, with that day's statistics updated accordingly.
As for the Ajax part, that's just a query against the sum table.
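A minimal sketch of that Ajax endpoint, assuming a hypothetical daily_sales summary table (with day and total columns) that the triggers keep up to date:
<?php
// hypothetical sales_count.php - returns the running total for the ajax counter
mysql_connect('localhost', 'user', 'pass'); // placeholder credentials
mysql_select_db('shop');                    // placeholder database name

$since  = 'YYYY-12-01'; // placeholder for "December the first" of the relevant year
$result = mysql_query("SELECT SUM(total) FROM daily_sales WHERE day >= '$since'");
$total  = (int)mysql_result($result, 0);

header('Content-Type: application/json');
echo json_encode(array('total_sales' => $total));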
Whatever you do, do not re-COUNT everything every 10 seconds. Why not have a cron job that does the counting every 10 seconds? It could take the last 10 seconds of sales and, against the slave database, add the difference to the current count.
Still, 10 seconds sounds excessive. Every minute, hmm?
