PHP: Most efficient way to make multiple fsockopen() connections?

Hey guys, I'm making a website where you submit a server for advertising. When the user goes to the index page of my website, it grabs the IPs of all the submitted servers and then tests whether each is online using fsockopen(), like so:
while ($row = mysql_fetch_assoc($rs)) {
    $ip   = $row['ip'];
    $name = $row['name']; // assuming the server's name is stored alongside its IP
    // '@' suppresses the connection warning ('#' would start a comment and break the line)
    $info = @fsockopen($ip, 25565, $errno, $errstr, 0.5);
    if ($info) {
        $status = "<div><img width='32px' height='32px'
            title='$name is online!' src='images/online.png'/></div>";
        $online = true;
        fclose($info);
    } else {
        $status = "<div><img width='32px' height='32px'
            title='$name is offline!' src='images/offline.png'/></div>";
        $online = false;
    }
}
This works fine, but the downside is that the site takes a good 2-4 seconds to start loading because of all the fsockopen() calls. I want to know if there is a better way to do this that reduces the wait before the website loads.
Any information will be appreciated, thanks.

Store the online status and the time of the last check in the database; if the last check is older than, say, 15 minutes, update it. I am pretty sure you don't need to get the status on EVERY page load? It's the time it takes to connect to each server that slows the website down.
Then again, you would probably want to move the update process to a cron job instead of relying on someone visiting your website to update the server statuses. (A minimal sketch of the caching idea follows below.)
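Here is one way that caching could look, assuming hypothetical online and last_checked columns on a servers table (the names are illustrative, and it sticks with the mysql_* functions from the question):

$maxAge = 15 * 60; // re-probe a server at most every 15 minutes
$rs = mysql_query("SELECT * FROM servers");
while ($row = mysql_fetch_assoc($rs)) {
    if (time() - $row['last_checked'] > $maxAge) {
        // cached status is stale: probe the server and store the result
        $fp = @fsockopen($row['ip'], 25565, $errno, $errstr, 0.5);
        $online = ($fp !== false);
        if ($fp) { fclose($fp); }
        mysql_query("UPDATE servers SET online = " . ($online ? 1 : 0) . ",
                     last_checked = " . time() . " WHERE id = " . (int)$row['id']);
    } else {
        // cached status is fresh: no socket I/O at all on this page load
        $online = (bool)$row['online'];
    }
    // render $status from $online as before
}

This way most page loads never open a socket at all.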

Looking at your example, I'd make all the $status bits be JavaScript calls to another PHP page that checks that individual server; see the sketch below.
However, the idea of moving the status checks to a cron job, or using some kind of status caching, is very good too. Maybe store statuses in a database and only check the ones that have expired (time limit set by you).
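A rough sketch of that per-server endpoint (the file name check.php and the id parameter are made up for illustration); the main page can then render immediately and let JavaScript fill in each icon as its request returns:

// check.php - probes one server so the main page never blocks on fsockopen()
$id  = (int)$_GET['id'];
$row = mysql_fetch_assoc(mysql_query("SELECT ip FROM servers WHERE id = $id"));
$fp  = @fsockopen($row['ip'], 25565, $errno, $errstr, 0.5);
if ($fp) {
    fclose($fp);
    echo 'online';
} else {
    echo 'offline';
}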

Related

Which Method is More Practical or Orthodox When Importing Large Data?

I have a file whose job is to import data from an API into an SQL database. A problem I encountered is that the API can only return a maximum of 1000 records per call, even though I sometimes need to retrieve large amounts of data, ranging from 10 to 200,000 records. My first thought was to create a while loop that calls the API until all of the data has been retrieved, and afterwards enter it into the database.
$moreDataToImport = true;
$lastId = null;
$query = '';
while ($moreDataToImport) {
    // decode as an associative array so $result['dataNotExported'] works
    $result = json_decode(callToApi($lastId), true);
    $query .= formatResult($result);
    $moreDataToImport = !empty($result['dataNotExported']);
    $lastId = getLastId($result['customers']);
}
mysqli_multi_query($con, $query);
The issue I encountered with this is that I quickly hit memory limits. The easy solution would be to increase the memory limit until it sufficed. How much memory I need, however, is unknown, because there is always the possibility of importing a very large dataset, so I can theoretically always run out of memory. I don't want to set an unlimited memory limit, as the problems with that are unimaginable.
My second solution was, instead of looping over all the imported data, to send each batch to my database and then do a page refresh, with a GET request specifying the last ID I left off on.
if (isset($_GET['lastId'])) {
    $lastId = $_GET['lastId'];
} else {
    $lastId = null;
}
// decode as an associative array so $result['dataNotExported'] works
$result = json_decode(callToApi($lastId), true);
$query = formatResult($result); // '=' rather than '.=': $query starts fresh on each request
mysqli_multi_query($con, $query);
if (!empty($result['dataNotExported'])) {
    header('Location: ./page.php?lastId='.getLastId($result['customers']));
    exit; // stop the script once the redirect is issued
}
This solution solves my memory-limit issue, but now I have another problem: after 20 redirects (it depends on the browser), browsers will automatically kill the page to stop a potential redirect loop, then shortly refresh it. The solution would be to kill the program yourself at the 20th redirect and allow it to do a page refresh, continuing the process.
if (isset($_GET['redirects'])) {
    $redirects = $_GET['redirects'];
    if ($redirects == '20') {
        if ($lastId == null) {
            header("Location: ./page.php?redirects=2");
        } else {
            header("Location: ./page.php?lastId=$lastId&redirects=2");
        }
        exit;
    }
} else {
    $redirects = '1';
}
Though this solves my issues, I am afraid it is more impractical than other solutions, and there must be a better way. Are these, possibly running out of memory or chaining redirects, my only two choices? And if so, is one more efficient/orthodox than the other?
Do the insert query inside the loop that fetches each page from the API, rather than concatenating all the queries.
$moreDataToImport = true;
$lastId = null;
while ($moreDataToImport) {
    $result = json_decode(callToApi($lastId), true);
    $query = formatResult($result); // only the current page's query is held in memory
    mysqli_query($con, $query);
    $moreDataToImport = !empty($result['dataNotExported']);
    $lastId = getLastId($result['customers']);
}
Page your work. Break it up into smaller chunks that will be below your memory limit.
If the API only returns 1000 at a time, then only process 1000 at a time in a loop. In each iteration of the loop you'll query the API, process the data, and store it. Then, on the next iteration, you'll be using the same variables so your memory won't skyrocket.
A couple of things to consider:
If this becomes a long-running script, you may hit the default script execution time limit, so you'll have to extend it with set_time_limit(), as sketched after this list.
Some browsers will consider scripts that run too long to have timed out and will show the appropriate error message.
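A minimal sketch of that first point (both calls are standard PHP):

set_time_limit(0);       // 0 lifts the max_execution_time cap for this run
ignore_user_abort(true); // optional: keep importing even if the browser disconnects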
For processing upwards of 200,000 pieces of data from an API, I think the best solution is to not make this work dependent on a page load. If possible, I'd put this in a cron job to be run by the server on a regular schedule.
If the dataset depends on the request (for example, if you're processing temperatures from one of thousands of weather stations, with the specific station ID set by the user), then consider creating a secondary script that does the work. Calling and forking the secondary script from your primary script lets the primary script finish execution while the secondary script runs in the background on your server. Something like:
exec('php path/to/secondary-script.php > /dev/null 2>&1 &');
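If the worker needs the request-specific value (the station ID in the example above), it could be passed as an escaped argument; $stationId and the script path here are hypothetical:

// '> /dev/null 2>&1 &' detaches the worker so exec() returns immediately
exec('php path/to/secondary-script.php ' . escapeshellarg($stationId) . ' > /dev/null 2>&1 &');
// inside secondary-script.php the value arrives as $argv[1]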

php and mysqli actions in cron jobs?

I usually update my site "by hand" by visiting a page called "enterheretoupdate.php". This page refreshes every minute to do all the work I need, so while the page is open, my site keeps updating every minute.
What does "enterheretoupdate.php" do? Things related to MySQL: it creates tables, selects from tables, adds rows to tables, etc. Apart from that, it also performs calculations in PHP and updates .json files.
I would like to create a cron job, so that it is no longer necessary for me to keep "enterheretoupdate.php" open on my computer to update my site every minute.
I am quite new to this, but I have learned how to create a cron job (I use 1and1). The example cron job I created, which sends an email every minute, works fine.
But when I saved "enterheretoupdate.php" as a cron job, it did not work. Is there a "limitation" on the things a cron job can do? How should I "translate" my PHP file to make it work as a cron job?
Any help is really welcome.
This is what my .php file looks like:
<?php
$page = $_SERVER['PHP_SELF'];
$sec  = "10";
// Change to 1 to reload, 0 not to reload
$reload   = 1;
$gamecode = 7;
$cmp      = "EL";
$year     = 2017;
if ($reload == 1) {
    echo "<head><meta http-equiv='refresh' content='".$sec.";URL=".$page."?gamecode=".$gamecode."&cmp=".$cmp."&year=".$year."'></head>";
}

include("../newcon.php");
include("../formulas.php");
include_once("funLightCreateTables.php");
include_once("funLightFirstFive.php");
include_once("funLightChanges.php");
include_once("funLightLiveJsons.php");

if ($cmp == "EC") { $l = "U"; }
if ($cmp == "EL") { $l = "E"; }

// Check whether the game has already started
// (cmp holds a string, so it must be quoted inside the query)
$q   = "SELECT * FROM LightLiveSchedule WHERE year=".$year." AND cmp='".$cmp."' AND gamecode=".$gamecode;
$res = mysqli_query($link, $q);
$started = 0; // avoids an undefined variable if the schedule row is missing
while ($r = mysqli_fetch_assoc($res)) {
    $started = $r['started'];
}
if ($started == 0) {
    LightCreateTables($cmp, $year, $gamecode);
    $q = "UPDATE LightLiveSchedule SET started=1 WHERE year=".$year." AND cmp='".$cmp."' AND gamecode=".$gamecode;
    mysqli_query($link, $q);
}

// Read the play-by-play feed
$pbp = file_get_contents("http://thesite.com/data.json?gamecode=".$gamecode."&seasoncode=".$l.$year);
$pbp = json_decode($pbp, true);

// Insert
mysqli_query($link, "TRUNCATE P_Live_Temp_".$cmp."_".$year."_".$gamecode);
$lres = 0;
$vres = 0;
$n    = 0;
for ($i = 0; $i <= 4; $i++) {
    // $qtitle presumably comes from the included files
    $nplays[$i] = count($pbp[$qtitle[$i]]);
    $ii = 0;
    for ($j = 0; $j < $nplays[$i]; $j++) { // '<' instead of '<=' avoids reading one play past the end
        // change results
        if ($pbp[$qtitle[$i]][$ii]['PUNTOS_A'] != null) {
            $lres = $pbp[$qtitle[$i]][$ii]['PUNTOS_A'];
        }
        if ($pbp[$qtitle[$i]][$ii]['PUNTOS_B'] != null) {
            $vres = $pbp[$qtitle[$i]][$ii]['PUNTOS_B'];
        }
        // clean
        if (strpos($pbp[$qtitle[$i]][$ii]['CSDESCWEB'], "(") == 0) {
            $play = $pbp[$qtitle[$i]][$ii]['CSDESCWEB'];
        }
        if (strpos($pbp[$qtitle[$i]][$ii]['CSDESCWEB'], "(") > 0) {
            $play = substr($pbp[$qtitle[$i]][$ii]['CSDESCWEB'], 0, strpos($pbp[$qtitle[$i]][$ii]['CSDESCWEB'], "(") - 1);
        }
        // count
        $points = 0;
        if ($play == "Three Pointer") { $points = 3; }
        if ($play == "Two Pointer" or $play == "Lay Up" or $play == "Dunk") { $points = 2; }
        if ($play == "Free Throw In") { $points = 1; }
        // ntconsole = 00:00 at End Game
        if ($play == "End Game") { $pbp[$qtitle[$i]][$ii]['NTCONSOLA'] = "00:00"; }
        // insert (same capitalization as the P_Live_Temp_ table truncated above)
        $q = "INSERT INTO P_Live_Temp_".$cmp."_".$year."_".$gamecode."
              (orden,shteam,shloc,shvis,quarter,minute,ntconsole,pcode,play,locres,visres,points)
              VALUES
              (".$n.",'".$pbp[$qtitle[$i]][$ii]['NTEQUIPO']."','".$pbp['ca']."','".$pbp['cb']."',".($i+1).",
              ".$pbp[$qtitle[$i]][$ii]['MINUTO'].",'".$pbp[$qtitle[$i]][$ii]['NTCONSOLA']."',
              '".str_replace(" ", "", substr($pbp[$qtitle[$i]][$ii]['NTJUGD'], 1, 10))."','".$play."',".$lres.",".$vres.",".$points.")";
        mysqli_query($link, $q);
        $ii++;
        $n++;
    }
}
Do you think it is suitable for a cron job? How should I proceed? Thanks a lot!
I had similar issues, but the following worked for me.
See the link below on how to change the default MySQL permissions:
How to allow remote connection to mysql
Then change the db_server value in your SQL connection file from
localhost to 127.0.0.1.
In your case it seems the file you need to edit is ../newcon.php.
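For completeness, a sketch of what the cron side might look like (paths are illustrative): run the script through the PHP CLI and guard the browser-only meta refresh, since under cron there is no browser to honor it (and $_SERVER['PHP_SELF'] is not set on the CLI).

# illustrative crontab entry: run the updater every minute and log its output
* * * * * /usr/bin/php /path/to/enterheretoupdate.php >> /path/to/update.log 2>&1

and in the script:

// only emit the meta refresh when viewed in a browser; cron drives the schedule otherwise
if (php_sapi_name() !== 'cli' && $reload == 1) {
    echo "<head><meta http-equiv='refresh' content='".$sec.";URL=".$page."?gamecode=".$gamecode."&cmp=".$cmp."&year=".$year."'></head>";
}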

Prevent PHP from sending multiple emails when running parallel instances

This is more of a logic question than a language question, though the approach might vary depending on the language. In this instance I'm using ActionScript and PHP.
I have a Flash graphic that gets data stored in a MySQL database, served from a PHP script. This part is working fine. It cycles through database entries every time it is fired.
The graphic is not on a website, but is being used at 5 locations, set to load and run at regular intervals (all 5 locations fire at the same time, or at least within <500ms of each other). This is real-time info, so time is of the essence; currently the script loads and parses at all 5 locations within 30ms-300ms (depending on the distance from the server).
I was originally having a pagination problem, where each of the 5 locations would pull a different database entry, since I was moving to the next entry every time the script ran. I solved this by making the script only move to the next entry after a certain amount of time had passed.
However, I also need the script to send an email every time it displays a new entry, and I only want it to send one email. I've attempted to solve this by adding a "has been emailed" boolean to the database. But since all the scripts run at the same time, this rarely works (it does sometimes); most of the time I get 5 emails. The email doesn't have to go out as fast as the graphic gets its info; a 5-10 second delay is fine.
I've been trying to come up with a solution for this. Currently I'm thinking of spawning a Python script from PHP with a random delay (between 2 and 5 seconds), hopefully alleviating the problem. However, I'm not quite sure how to run an exec() command from PHP without the script waiting for the command to finish. Or is there a better way to accomplish this?
UPDATE: here is my current logic (relevant code only):
// get the top "unread" entry from the database
// (`Read` is a reserved word in MySQL, so it is safest in backticks)
$query = "SELECT * FROM database WHERE `Read` = '0' ORDER BY Entry ASC LIMIT 1";
$result = $mysqli->query($query); // the fetch was elided in the original snippet
$row = $result->fetch_assoc();
// DATA
$emailed = $row["emailed"];
$Entry   = $row["databaseEntryID"];
if ($emailed == 0)
{
    // **CODE TO SEND EMAIL**
    $EmailSent = "UPDATE database SET emailed = '1' WHERE databaseEntryID = '$Entry'";
    $mysqli->query($EmailSent);
}
Thanks!
You need to use some kind of locking, e.g. database locking:
function send_email_sync($message)
{
    sql_query("UPDATE email_table SET email_sent=1 WHERE email_sent=0");
    $result = FALSE;
    if (number_of_affected_rows() == 1) {
        send_email_now($message);
        $result = TRUE;
    }
    return $result;
}
The functions sql_query and number_of_affected_rows need to be adapted to your particular database.
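For instance, with mysqli the same idea might look like this (email_table and email_sent come from the sketch above, not from your actual schema):

function send_email_sync($con, $message)
{
    // atomic claim: only one process can flip email_sent from 0 to 1
    mysqli_query($con, "UPDATE email_table SET email_sent = 1 WHERE email_sent = 0");
    $result = FALSE;
    if (mysqli_affected_rows($con) == 1) {
        send_email_now($message); // placeholder from the sketch above
        $result = TRUE;
    }
    return $result;
}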
Old answer:
Use file-based locking (this only works if the script runs on a single server):
function send_email_sync($message)
{
    $fd = fopen(__FILE__, "r");
    if (!$fd) {
        die("something bad happened in ".__FILE__.":".__LINE__);
    }
    $result = FALSE;
    if (flock($fd, LOCK_EX | LOCK_NB)) {
        if (!email_has_already_been_sent()) {
            actually_send_email($message);
            mark_email_as_sent();
            $result = TRUE; // email has been sent
        }
        flock($fd, LOCK_UN);
    }
    fclose($fd);
    return $result;
}
You will need to lock the row in your database by using a transaction.
Pseudo code:
START TRANSACTION
SELECT row ... FOR UPDATE
UPDATE row
COMMIT

if (mysqli_affected_rows($connection) == 1)
    send_email();
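A minimal mysqli sketch of that pseudocode, using the table and column names from the question (this requires an InnoDB table; FOR UPDATE makes the other instances block until the winner commits):

$mysqli->begin_transaction();
// lock the row: parallel instances wait here until the first one commits
$res = $mysqli->query("SELECT emailed FROM database WHERE databaseEntryID = '$Entry' FOR UPDATE");
$row = $res->fetch_assoc();
if ($row['emailed'] == 0) {
    $mysqli->query("UPDATE database SET emailed = '1' WHERE databaseEntryID = '$Entry'");
    // **CODE TO SEND EMAIL** - only the instance that saw emailed = 0 gets here
}
$mysqli->commit();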

PHP: Check mysql database every 10 seconds for any new rows

I am making a PHP chat and am starting on the part that checks the database. When a user types something into the chat, it gets recorded in the MySQL database. How would I check the database every 10 seconds, so that one user's chat updates with new messages from other users? I know that you can use an AJAX request to a page on an interval, but I want the PHP to be on the same page, instead of having to use numerous pages. This is the code for checking the database:
<?php
$con = mysqli_connect('host', 'user', 'pass', 'database');
$query = mysqli_query($con, "SELECT * FROM `messages`");
while ($row = mysqli_fetch_assoc($query)) {
    $user = $row['user'];
    $message = $row['message'];
    echo 'User: ', $user, ' Message: ', $message;
}
?>
Thanks in advance anyone!
Use the MySQL Event Scheduler.
The link below will guide you through it:
http://www.9lessons.info/2012/10/mysql-event-scheduler.html
I think it is the best option in your case.
AJAX is probably the simplest solution. You can perform an AJAX request on the same page your PHP code is executing on if you really want to.
(function check() {
    $.get('mypage.php', function(data) {
        doSomethingWith(data);
        setTimeout(check, 5000); // every 5 seconds
    });
})();
PHP doesn't have a setInterval function. While I'm sure you could use a cron task to automate it on the server, you can also achieve this with some simple JavaScript.
The concept you are trying to achieve is known as Short Polling. You want a setInterval function in JavaScript that repeatedly makes AJAX requests to your PHP file, which checks the database for new messages. Your PHP should return that information to your script, which can then populate the user's screen.
There is also Long Polling, where you keep the connection open and use a setTimeout to wait for messages to come in. You can find more information yourself, and if you have questions, you can come back here.
A good video about this:
https://www.youtube.com/watch?v=wHmSqFor1HU
Hope this helps.
This is what you need: set a time for the AJAX call to reload automatically. Don't put everything in one page, because then you must reload the whole page to refresh the data, and that is a bad solution.
Call jQuery Ajax Request Each X Minutes
Make a while loop that runs for up to 30 seconds and check the db every second; once you find a record the loop breaks, and it also ends when the 30 seconds have expired.
$sec = 1;
while ($sec <= 30) {
    if (/* query found a new record */) {
        // send it to the user
        break;
    }
    $sec++;
    sleep(1); // one second
}
Use sleep(10) instead if you want to check every 10 secs...

PHP MySQL Negative Value (Balance) Issue

I am developing a desktop application that charges the user per execution of the main action; for example, it charges 0.1$ per PDF print. The software also supports multithreading.
With a single thread it works fine :) But if the user runs multiple threads at once (say 10/20 threads), PHP keeps allowing the execution even after the balance drops below zero, even though my PHP script checks whether the balance is positive. After a user runs multiple threads, the balance can end up at something like -5.95$ or -25.75$, and that is a big security/financial issue.
Here is the code I am using:
<?php
$strSQL = "SELECT * FROM users WHERE Email = '$strUser'";
$return = mysql_query($strSQL, $strDBConn);
$strDBData = mysql_fetch_array($return, MYSQL_ASSOC);
// checking balance
$strBalance = $strDBData['Balance'];
if ($strBalance <= 0) // '<=' so a zero balance is rejected too, as the comment below intends
{
    // if balance is 0 then exit, so my software/thread will not process further
    mysql_close($strDBConn);
    exit('Balance Exceed');
}
// rest of the code related to service execution
// code that subtracts the balance
$dblCost = 0.25;
$strSQL = "UPDATE users SET Balance = Balance - '$dblCost' WHERE Email = '$strUser'";
$return = mysql_query($strSQL, $strDBConn);
// rest of the finishing code
?>
Any help/suggestion would be highly appreciated. Thanks in advance.
Best regards
I think this is a quite similar question:
What is equivalent of the C# lock statement in PHP?
First, try to switch away from the old "mysql" extension to something new, maybe some PDO-like DB access ;).
Then, to get around multi-threading issues in PHP, it can be a good idea to write a file for every user ID (!) and lock this file when there's a request. When the file is locked by another thread, wait up to x seconds for the file to be unlocked by the locking thread. If it is not unlocked within that time, something went wrong. When everything goes well in the locking thread, unlock the file after every operation that needs it.
Theoretically you will be fine with that until there's a real multi-threading solution in PHP ;)
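Another common fix, independent of file locking, is to make the check and the charge one atomic statement, so a thread can only subtract when enough balance remains. A sketch against the question's schema (the affected-rows test decides whether to proceed):

$dblCost = 0.25;
// check-and-charge in a single statement: the WHERE clause rejects the update
// when the balance is too low, so parallel threads cannot drive it negative
$strSQL = "UPDATE users SET Balance = Balance - $dblCost
           WHERE Email = '$strUser' AND Balance >= $dblCost";
mysql_query($strSQL, $strDBConn);
if (mysql_affected_rows($strDBConn) == 1) {
    // the charge succeeded: run the service
} else {
    mysql_close($strDBConn);
    exit('Balance Exceed');
}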
