I have this small bit of code (listed at the end of this message) that runs on page load. We get around 50,000 unique visitors per day (not counting repeats). It could be coincidental, but ever since implementing it, there have been random server load issues.
So what I'm asking is...
1) Can someone confirm or deny whether the code below can in fact cause these issues?
2) Can this be optimized?
Just fyi:
-- I have stuck this function in the HEADER file of a WordPress layout.
-- It is called 10+ times in the footer
-- It is a VPS server using NGINX
-- I have not checked the logs just yet
The code's purpose...
We pass the function a percentage that tells it how often to display a string (so if we pass 60, the string should show up 60% of the time). Each call in the footer generates its own random number.
The code:
function writeRndString($theString, $percent) {
    $randno = rand(1, 100);
    if ($randno <= (int)$percent) {
        echo "Random String: " . $theString;
        echo "\n\n";
    }
}
This is a very simple function and should be fast, even if you call it several times per page. Even with 50,000 visitors daily, that averages out to less than one page load per second.
If you can, simply remove it for a few minutes and check the server load. It could be called a lot more times than you assume :)
Maybe...
You forgot a $ in: echo "Random String: " . theString;
And a little better, maybe: don't create variables you don't need.
Maybe also use return:
function writeRndString($theString, $percent) {
    if (rand(1, 100) <= (int)$percent) {
        return "Random String: " . $theString . "\n\n";
    }
}
PHP:
<?php
echo "blablabla" . writeRndString($x, $y);
?>
I've been trying to validate over 1 million randomly generated values (strings) with PHP and a client-side programming language on an online form, but there are a few challenges I'm facing:
PHP
Link to the (editable) PHP code: https://3v4l.org/AtTkO
The PHP code:
<?php
function generateRandomString($length = 10) {
    $characters = '0123456789abcdefghijklmnopqrstuvwxyz-_.';
    $charactersLength = strlen($characters);
    $randomString = '';
    for ($i = 0; $i < $length; $i++) {
        $randomString .= $characters[rand(0, $charactersLength - 1)];
    }
    return $randomString;
}

$unique = array();
for ($i = 0; $i < 9000000; $i++) {
    $u = $i + 1;
    $random = generateRandomString(5);
    if (!in_array($random, $unique)) {
        echo $u . ".m" . $random . "#[server]\n";
        $unique[] = $random;
        gc_collect_cycles();
    } else {
        echo "duplicate detected";
        $i--;
    }
}
echo memory_get_peak_usage();
What should happen:
New 5 character value gets randomly generated
Value gets checked if it already exists in the array
Value gets added to array
All randomly generated values are exported to a .txt file to be used for validating. (Not in the script yet)
What actually happens:
I hit either the memory limit or the maximum execution time (a server timeout).
What I've tried
I've tried using sleep(3) during the for loop.
Setting the memory limit to -1 and the timeout to 0. The unlimited memory doesn't make a difference and is too dangerous in a working environment.
Using gc_collect_cycles() during the for loop
Using echo memory_get_peak_usage(); -> I don't really understand how I could use this for debugging.
What I need help with:
Memory management in PHP
Having pauses in the script that will reset the PHP execution timer
Client Side Programming language
This is where I have absolutely no clue which way I should go or which programming language I should use for this.
What I want to achieve
Load a webpage that has a form
Load the .txt with all randomly generated strings
fill in the form with the first string
submit the form:
If positive response from form > save string in special .txt file or array, go to the next value
If negative response from form > delete string from file, go to the next value | or just go to the next value
All values with a positive response are filtered out and easily accessible at the end.
I don't know which programming language I should use for this part. I've been thinking about JavaScript and Python, but I'm not sure how I could combine them with PHP. A nudge in the right direction would be appreciated.
I might be completely wrong for trying to achieve this with PHP, if so, please let me know what would be the better and easier option.
Thanks!
Interesting question. First of all, whenever you think about a solution like this, one of the first things you need to consider is: can it be async? If the answer is yes, your implementation will likely be simple; if not, you will likely have to pay huge server costs or serve random cached results.
NB: remove gc_collect_cycles(). It does the opposite of what you want here, and you hardly ever need to call it manually.
That being said, the approach I would recommend in your case is as follows:
Use a WebSocket that is opened only once in the client's browser, and then forward results in real time from the server to the browser. Of course, this code itself can run completely client-side via JavaScript, so if it's not just a PoC, you can convert the PHP code to JavaScript.
Change your code to yield items or forward results via the WebSocket once a generated code has been confirmed as unique.
However, if you're really just doing what the PHP code says, you can do it completely in JavaScript and save your server resources. See this answer for example code to replace your generateRandomString function.
Assuming you have the ability to edit php.ini:
Increase your memory limit as described here:
PHP MEMORY LIMIT INCREASE
For the memory limit, see here;
and for the 'timeout for the execution time', add:
set_time_limit(0);
at the top of the PHP file.
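If you can't touch php.ini, a minimal sketch of doing both at runtime (assuming your host allows overriding memory_limit via ini_set()):
<?php
// Sketch: raise the limits at runtime instead of editing php.ini.
// The 512M value is only an example; pick something that fits your server.
ini_set('memory_limit', '512M');
set_time_limit(0); // disable the execution time limit for this script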
Have you tried using sets? https://www.php.net/manual/en/class.ds-set.php
Sets are very efficient whenever you want to ensure a value isn't present twice.
Checking the presence of a value in a set is way, way faster than looping across all entries of an array.
I'm not an expert with PHP, but it would look something like this in Ruby:
require 'set'

CHARS = '0123456789abcdefghijklmnopqrstuvwxyz-_.'.split('')
unique = Set.new

def generateRandomString(l = 10)
  Array.new(l) { CHARS.sample }.join
end

while unique.length < 1_000_000
  random_string = generateRandomString
  if !unique.include?(random_string)
    unique.add(random_string)
  end
end
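For comparison, a rough PHP sketch of the same idea (assuming the ds extension from the manual page above is installed; a plain associative array checked with isset() on its keys gives a similar constant-time lookup if it isn't):
<?php
// Sketch: collect unique strings in a set instead of scanning an array with in_array().
// Assumes ext-ds is installed and reuses the question's generateRandomString().
$unique = new \Ds\Set();

while ($unique->count() < 1000000) {
    $random = generateRandomString(5);
    if (!$unique->contains($random)) { // constant-time membership check
        $unique->add($random);
    }
}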
hope it helps
I have this very simple PHP call to Alpha Vantage API to fill a table (or list) with NASDAQ stock prices:
<?php
function get_price($commodity = "")
{
    $url = 'https://www.alphavantage.co/query?function=TIME_SERIES_DAILY_ADJUSTED&symbol=' . $commodity . '&outputsize=full&apikey=myKey';
    $obj = json_decode(file_get_contents($url), true);
    $date = $obj['Meta Data']['3. Last Refreshed'];
    $result = $obj['Time Series (Daily)']['2018-03-23']['4. close'];
    $rd_result = round($result, 2);
    echo $result;
}
?>
<?php get_price("XOM");
get_price("AAPL");
get_price("MSFT");
get_price("CVX");
get_price("CAT");
get_price("BA");
?>
And it works, but it is just so freaking slow. It can take over 30 seconds to load, while the JSON file from Alpha Vantage loads in a fraction of a second.
Does anyone know where I'm going wrong?
This is what I did when the API took time to reply; my solution is written in C#, but the logic would be the same.
string[] AlphaVantageApiKey = { "RK*********", "B2***********", "4FD*********QN", "7S3Z*********FRX", "U************I3" };
int ApiKeyValue = 0;
foreach (var stock in listOfStocks)
{
    DataTable dtResult = DataRetrival.GetIntradayStockFeedForSelectedStockAs(stock.Symbol.Trim().ToUpper(), ApiKeyValue);
    ApiKeyValue = (ApiKeyValue == 4) ? 0 : ApiKeyValue + 1;
}
I use 5 to 6 different API keys when I'm querying data, and I loop through them, one per call, thereby reducing the load on any particular key.
I observed that this improved my performance a lot. It takes me less than a minute to get intraday data for 50 stocks.
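A rough PHP sketch of the same rotation idea, applied to the question's code (the key values are placeholders, and get_price() would need an extra $apikey parameter, so the two-argument call here is hypothetical):
<?php
// Sketch: cycle through several API keys, one per request.
$apiKeys = array('KEY_ONE', 'KEY_TWO', 'KEY_THREE'); // placeholders
$symbols = array('XOM', 'AAPL', 'MSFT', 'CVX', 'CAT', 'BA');

foreach ($symbols as $i => $symbol) {
    $apikey = $apiKeys[$i % count($apiKeys)]; // rotate through the keys
    get_price($symbol, $apikey);              // hypothetical two-argument version
}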
Another way you can improve your performance is to use
outputsize=compact
compact returns only the latest 100 data points in the time series.
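Applied to the question's function, that is a one-line change to the request URL (a sketch; the API key is still a placeholder):
// Sketch: same request, but with the condensed output size.
$url = 'https://www.alphavantage.co/query?function=TIME_SERIES_DAILY_ADJUSTED'
     . '&symbol=' . $commodity
     . '&outputsize=compact' // only the latest 100 data points
     . '&apikey=myKey';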
UPDATE: Batch Stock Quotes
You might want to consider using this type of query as well. Multiple stock quotes all in one call.
Also, using the full output size is grabbing data from the past 20 years, if applicable. Take that out of your query and have the API use its condensed output default.
EDIT: According to the above, you should make changes to your query. But it can also be an issue with your server. I tested this for a use case I am working on and it takes me a few seconds to get the data, albeit I am only pulling it for one stock symbol on a page at a time.
Try increasing your memory limit if things are too slow for your liking.
<?php
ini_set('memory_limit','500M'); // or your desired limit
?>
Also, if you have shared hosting, that might be the problem. However, I do not know enough about your server to answer that fully.
With this code:
for ($i = 1; $i <= $times; $i++) {
    $milliseconds = round(microtime(true) * 1000);
    $res = file_get_contents($url);
    $milliseconds2 = round(microtime(true) * 1000);
    $milisecondsCount = $milliseconds2 - $milliseconds;
    echo 'miliseconds=' . $milisecondsCount . ' ms' . "\n";
}
I get this output:
miliseconds=1048 ms
miliseconds=169 ms
miliseconds=229 ms
miliseconds=209 ms
....
But with sleep:
for ($i = 1; $i <= $times; $i++) {
    $milliseconds = round(microtime(true) * 1000);
    $res = file_get_contents($url);
    $milliseconds2 = round(microtime(true) * 1000);
    $milisecondsCount = $milliseconds2 - $milliseconds;
    echo 'miliseconds=' . $milisecondsCount . ' ms' . "\n";
    sleep(2);
}
This:
miliseconds=1172 ms
miliseconds=1157 ms
miliseconds=1638 ms
....
So what is happening here?
My questions:
1) Why don't you test this yourself by using clearstatcache and checking the time signatures with and without using it?
2) Your method of testing is terrible. As a first fix, have you tried swapping the order so that the "sleep" version runs first rather than second?
3) How many iterations of your test have you done? If it's fewer than 10,000, then I suggest you repeat your tests to be sure, firstly, of the average delay (or lack thereof), and secondly, of what makes you think this is caused specifically by caching.
4) What are the specs of your machine, your RAM, and the free/available memory on each iteration?
5) Is your server live? Are you able to remove outside services as possible causes of time disparity? (such as anti-virus, background processes loading, server traffic, etc. etc.)?
My Answer:
Short answer: no, file_get_contents does not use a cache.
However, operating system level caches may cache recently used files to make for quicker access, but I wouldn't count on those being available.
Calling a file reference multiple times will only read the file from disk a single time; subsequent calls will read the value from the static variable within the function. Note that you shouldn't count on this approach for large files or ones used only once per page, since the memory used by the static variable will persist until the end of the request. Keeping the contents of files unnecessarily in static variables is a good way to make your script a memory hog.
Quoted from This answer.
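For reference, the pattern that quote describes looks roughly like this (a sketch with a hypothetical helper name, not code from the linked answer):
// Sketch: cache file contents in a static variable for the rest of the request.
function get_file_cached($path) {
    static $cache = array();
    if (!isset($cache[$path])) {
        $cache[$path] = file_get_contents($path); // read from disk only once per path
    }
    return $cache[$path];
}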
For remote (non local filesystem) files, there are so many possible causes of variable delays that really file_get_contents caching is a long way down the list of options.
As you claim to be connecting using a localhost/ reference structure, I would hazard (though I'm not certain) that your server is applying various firewall and other checks to the incoming request, which will add a large variable component to the time taken.
Concatenate a random query string onto the URL so that each request looks distinct (which defeats any caching along the way).
For eg: $url = 'http://example.com/file.html?r=' . rand(0, 9999);
In my PHP project I'm trying to time page generation. Using the script from http://blog.alastair.pro/2013/01/18/php-page-generation-time/ I have created a function that I am calling in footer.php and other miscellaneous PHP files that don't have a footer. As far as I'm aware, this should work:
function timer($type)
{
    $totalTime = round((microtime(TRUE) - $_SERVER['REQUEST_TIME_FLOAT']), 4);
    if (is_int($type)) {
        if ($type == TYPE_TIMER_COMMENT) {
            return "<!-- Page generated in " . $totalTime . " seconds. -->";
        } elseif ($type == TYPE_TIMER_PLAINTEXT) {
            return "Page generated in " . $totalTime . " seconds.";
        }
    } else {
        header('location:./error/type.html');
        die();
    }
}
Unfortunately, it doesn't. It outputs huge numbers:
Page generated in 1428979311.8916 seconds.
Page generated in 1428979357.1691 seconds.
Page generated in 1428979346.8255 seconds.
etc etc
I have absolutely no idea what's happening here. I'm calling it like so:
if (Config::DEBUG)
    echo ('<small>' . timer(TYPE_TIMER_PLAINTEXT) . '</small>');
TYPE_TIMER_PLAINTEXT and TYPE_TIMER_COMMENT are constants declared inside functions.php, but I don't see how this could affect what the function is doing. Any ideas on what's happening here?
The numbers you're getting look a lot like numbers you would get from microtime(true) - 0. This implies $_SERVER['REQUEST_TIME_FLOAT'] = 0 (or at least something that acts like zero when you subtract it.)
Based on this note in the PHP documentation for microtime, I assume your PHP version is < 5.4, and 'REQUEST_TIME_FLOAT' is not present in $_SERVER. If your PHP version is >=5.4, then I agree it looks like it should be working.
As of PHP 5.4.0, REQUEST_TIME_FLOAT is available in the $_SERVER superglobal array. It contains the timestamp of the start of the request with microsecond precision.
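If you're stuck on a version older than 5.4, one possible workaround (a sketch, not a drop-in fix) is to record your own start time as early as possible in the request and use it as a fallback:
<?php
// Sketch: fallback for PHP < 5.4, where REQUEST_TIME_FLOAT does not exist.
// Place this as early as possible, e.g. at the top of your front controller or header include.
if (!isset($_SERVER['REQUEST_TIME_FLOAT'])) {
    $_SERVER['REQUEST_TIME_FLOAT'] = microtime(true);
}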
UPDATE: I fixed the problem by updating my PHP to something newer than 5.4.0. The $_SERVER['REQUEST_TIME_FLOAT'] was only introduced in 5.4.0, so having PHP 5.3.3 created the issue.
I'm working on a game, written in PHP and that runs in a console. Think back to old MUDs and other text-based games, even some ASCII art!
Anyway, what I'm trying to do is have things happening while also accepting user input.
For instance, let's say it's a two player game and Player 1 is waiting for Player 2 to make a move. This is easily done by just listening for a message.
But what if Player 1 wants to change some options? What if they want to view details on aspects of the game state? What about conceding the game? There are many things a Player may want to do while waiting for their opponent to make a move.
Unfortunately the best I have right now is the fact that Ctrl+C completely kills the program. The other player is then left hanging, until the connection is dropped. Oh, and the game is completely lost.
I get user input with fgets(STDIN). But this blocks execution until input has been received (which is usually a good thing).
Is it even possible for a console program like this to handle input and output simultaneously? Or should I just look at some other interface?
In short, PHP is not built for this, but you might get some help from one of these extensions. I'm not sure how thorough they are, but you probably really want a text UI library. (And really, you probably do not want to use PHP for this.)
All that said, you need to get non-blocking input from STDIN, character by character. Unfortunately, most terminals are buffered from PHP's point of view, so you won't get anything until Enter is pressed.
If you run stty -icanon (or your OS's equivalent) on your terminal to disable buffering, then the following short program basically works:
<?php
stream_set_blocking(STDIN, false);

$line = '';
$time = microtime(true);
$prompt = '> ';
echo $prompt;

while (true)
{
    if (microtime(true) - $time > 5)
    {
        echo "\nTick...\n$prompt$line";
        $time = microtime(true);
    }

    $c = fgetc(STDIN);
    if ($c !== false)
    {
        if ($c != "\n")
            $line .= $c;
        else
        {
            if ($line == 'exit' || $line == 'quit')
                break;
            else if ($line == 'help')
                echo "Type exit\n";
            else
                echo "Unrecognized command.\n";
            echo $prompt;
            $line = '';
        }
    }
}
(It relies on local echo being enabled to print the characters as they are typed.)
As you see, we are just looping around forever. If a character exists, add it to the $line. If enter is pressed, process $line. Meanwhile, we are ticking every five seconds just to show that we could be doing something else while we wait for input. (This will consume maximum CPU; you'd have to issue a sleep() to get around that.)
This isn't meant to be a practical example, per se, but perhaps will get you thinking in the proper direction.
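As a small follow-up to the CPU note above, a minimal sketch of that change (one line added at the bottom of the while (true) loop):
// Sketch: pause briefly between input polls so the loop doesn't spin at 100% CPU.
usleep(10000); // 10 ms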
It is possible to build a game like you describe using ncurses (non-blocking mode) and libevent. That way, you get close to no CPU consumption. Handling individual keys is sometimes awkward (implement Backspace yourself, it's not fun at all - and did you know various OSes send different keycodes on Backspace press?), and gets really tricky if you want to support UTF-8 properly. Still, completely viable.
In particular, it is beneficial to make extensive use of libevent, by reading both the network and keyboard (stdin) input with it. This function enables you to listen for individual keys:
http://www.php.net/manual/en/function.ncurses-cbreak.php
which you can later read using libevent API. The key to keep in mind is that you will sometimes end up reading more than 1 key at a time, and it has to be handled (so loop over everything that you have read). Otherwise, the user will be annoyed to see that not all key presses are "reaching" the application and some are lost.
Sorry Matthew, I'm going to have to un-accept your answer, because I have found a solution myself:
Use the following code to receive user input while still doing something else:
while (/* some condition that the code running is waiting on */) {
    // perform one step or iteration of that code

    exec("choice /N /C ___ /D _ /T _", $out, $ret);
    // /C is a list of letters that do something
    // /D is the default action that will be used as a no-op
    // /T is the amount of time to wait, probably best set to one second

    switch ($ret) {
        // handle cases - the "default" case should be "continue 2"
    }
}
This can then be used to interrupt the loop and enter an options menu, or trigger some other event, or could even be used to type out a command if used right.