Keep track of a global value on a website - PHP

I'm currently working on a project that involves a website which gets data from a game API.
The problem is that I am bound to a specific rate limit (currently 500 requests per 10 minutes) which I must not exceed.
How do I keep track of the current request count, other than writing/reading it to a file or database every time someone requests the data? I guess that wouldn't be the best approach and could potentially lead to problems with a few hundred people accessing the website at the same time.
The website calls a PHP script with the necessary information the user provides to get the data from the API.

You can use APC for this.
The Alternative PHP Cache (APC) is a free and open opcode cache for PHP. Its goal is to provide a free, open, and robust framework for caching and optimizing PHP intermediate code.
You don't need any external library to build this extension. Saving and fetching a variable across requests is as easy as this:
<?php
$bar = 'BAR';
apc_add('foo', $bar);
var_dump(apc_fetch('foo'));   // string(3) "BAR"
echo "\n";
$bar = 'NEVER GETS SET';
apc_add('foo', $bar);         // does nothing: 'foo' already exists
var_dump(apc_fetch('foo'));   // still string(3) "BAR"
echo "\n";
?>
Here is the documentation.
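Applied to the rate-limit question above, a minimal sketch with APC might look like this; the key names, the limit constants, and the fallback behaviour are illustrative, not prescriptive:

<?php
$limit  = 500;   // max API requests per window
$window = 600;   // window length in seconds (10 minutes)

// apc_add() only succeeds if the key does not exist yet, so the
// counter (and its TTL) is created exactly once per window.
apc_add('api_request_count', 0, $window);

// apc_inc() increments atomically, so concurrent requests are safe.
$count = apc_inc('api_request_count');

if ($count === false || $count > $limit) {
    // Limit reached (or the counter expired mid-request):
    // serve cached data or an error instead of hitting the API.
    exit('API rate limit reached, please try again later.');
}

// ...safe to call the game API here...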

Since all requests are separate, they don't know anything about each other; there is no built-in way to have a "shared" variable in plain PHP.
The best way is probably to create a database table and insert a record every time you make a request, keeping track of when each request was made with a datetime column.
That way you can quickly check how many requests were made in the last 10 minutes by counting the records from that period.
Occasionally run a delete query on the table to purge old records.
A simple query like that will not really hurt your performance unless you have a really busy site.
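A rough sketch of that approach, assuming a PDO connection in $pdo and a table api_requests with a DATETIME column requested_at (all names are invented for illustration):

<?php
// Count the requests made in the last 10 minutes.
$count = (int) $pdo->query(
    "SELECT COUNT(*) FROM api_requests
     WHERE requested_at > NOW() - INTERVAL 10 MINUTE"
)->fetchColumn();

if ($count < 500) {
    // Record this request, then call the API. (Under heavy
    // concurrency you'd want the check and insert in one transaction.)
    $pdo->exec("INSERT INTO api_requests (requested_at) VALUES (NOW())");
    // ...call the game API here...
} else {
    // Limit reached: serve cached data or ask the user to retry.
}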
Another solution might be to cache the results from the API and re-use them for every visitor, refreshing them from the API every couple of seconds (one request every 2 seconds works out to 300 per 10 minutes). But that would require the data to actually be cacheable and re-usable.

Related

PHP Memcache potential problems?

I'll most probably be using Memcache for caching some database results.
As I haven't ever written or done caching, I thought it would be a good idea to ask those of you who have already done it. The system I'm writing may have concurrently running scripts at some point in time. This is what I'm planning on doing:
1. I'm writing a banner exchange system.
2. The information about banners is stored in the database.
3. There are different sites, with different traffic, loading a PHP script that generates the code for those banners (so that the banners are displayed on the client's site).
4. When a banner is displayed for the first time, it gets cached with Memcache.
5. The banner has a cache lifetime of, for example, 1 hour.
6. Every hour the cache is renewed.
The potential problems I see in this task are at steps 4 and 6.
If we have, for example, 100 sites with big traffic, the script may have several instances running simultaneously. How can I guarantee that when the cache expires it will be regenerated only once and the data will stay intact?
The approach to caching I take is, for lack of a better word, a "lazy" implementation. That is, you don't cache something until you retrieve it once, with the hope that someone will need it again. Here's the pseudo code of what that algorithm would look like:
// returns false if there is no value or the value is expired
result = cache_check(key)
if (!result)
{
    result = fetch_from_db()
    // set it for next time, until it expires anyway
    cache_set(key, result, expiry)
}
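In concrete PHP with the memcache extension, that sketch might look like the following; fetch_from_db() stands in for your real query, and the host and key are illustrative:

<?php
$cache = new Memcache();
$cache->connect('localhost', 11211);

$key    = 'banner_42';   // illustrative key
$expiry = 3600;          // one hour, as in the question

// Memcache::get() returns false on a miss or an expired entry.
$result = $cache->get($key);
if ($result === false) {
    $result = fetch_from_db();
    // Cache it for next time (0 = no compression flags).
    $cache->set($key, $result, 0, $expiry);
}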
This works pretty well for what we want to use it for, as long as you use the cache intelligently and understand that not all information is the same. For example, in a hypothetical user comment system, you don't need an expiry time because you can simply invalidate the cache whenever a new user posts a comment on an article, so the next time comments are loaded, they're recached. Some information however (weather data comes to mind) should get a manual expiry time since you're not relying on user input to update your data.
For what it's worth, memcache works well in a clustered environment, and you should find that setting something like that up isn't hard to do, so this should scale pretty easily to whatever you need it to be.

Is it possible to give all users of a website the same session at the same time?

Is it possible to give all users of a website the same $_SESSION at the same time?
I have interpreted your question in the following way:
Is it possible to give all users of a website the same [state] at the same time?
Yes, shared state is usually stored in a database, although state may also be stored in the local file system.
Using $_SESSION is meant to save state for a single user. If you abuse that tool, you will create an insecure system.
I want to do the following:

if ($_SESSION['new_posts'] + 60 > time())
    echo "there are new posts in the forum!";

I don't want to use MySQL for that.
An easy and fast way: on every new post, do

// will write an empty file or just update its modification time
touch('new_posts.txt');

And then:

if (filemtime('new_posts.txt') + 60 > time()) { ... }
Global sessions are not possible without opening your site up for a gigantic security hole.
Instead, let's look at what you want to do, which is grabbing data while avoiding a query on every page.
1) Writing the value out to a file and then reading it on every request is an option. Sessions are stored in a file on the server by default, so it would have the same speed as a session.
2) Store it in a cache such as APC, Memcache, or Redis.
Keep in mind these are cached values - you'll still have to update them regularly, either from a cron job or by having the clients trigger the update. But what do you do when the cache expires and tons of clients try to update it at once? This is called the dogpile effect, and it's something to think about.
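One common way to soften the dogpile effect is a short-lived lock, sketched here with APC; rebuild_stats() and all key names are made up for illustration:

<?php
// Keep the real value with no TTL, plus a short-lived "fresh" flag.
// The first request that sees the flag gone takes a lock and
// rebuilds; everyone else keeps serving the stale value meanwhile.
$data = apc_fetch('stats');

if (apc_fetch('stats_fresh') === false && apc_add('stats_lock', 1, 10)) {
    // Only one request can create 'stats_lock' per 10-second window.
    $data = rebuild_stats();          // the expensive query
    apc_store('stats', $data);        // no TTL, so reads never miss
    apc_store('stats_fresh', 1, 60);  // rebuild at most once a minute
    apc_delete('stats_lock');
}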
Or you could simply write a SQL query and execute it on every page and keep it simple. Why don't you want to do this? Have you profiled the code and determined that it's an actual issue? Worrying about this before it's a known problem is a waste of time.

How to live update browser game attributes like the 4 resources in Travian game?

I would like to make a web-based game which is Travian-like (or Ikariam-like). The game will be PHP & MySQL-based. I wonder how I can achieve the live updating of game attributes.
For the frontend, I can manage with AJAX calls (fetching the latest values from the database), or even fake updates of values (without communicating with the server).
For the backend, is this done by a PHP cron job (which runs every few seconds)? If so, can anyone provide me with some sample code?
By the way, I know it could be troublesome if I use IIS + FastCGI.
=== Version Information ===
PHP : 5.2.3
IIS : 6.0 with FastCGI
OS : Windows Server 2003 Standard R2
The correct answer depends on your exact needs.
Does everyone always get resources at the same rate? If so, a simple solution is to track how long the user has existed, calculate the amount of resources gained from that rate, and subtract the number of resources they've spent in total. That's going to be a problem if the rate can ever change, though, so if you use this solution, you're pretty much stuck with the rate you pick unless you rewrite the handling entirely (for example, to the solution below).
If how quickly people gain resources varies, you'll need to update the data periodically. A cron job/scheduled task would work well to make sure everyone is updated, but in some situations it might be better to simply record when you last updated each user's resources and then, on every page load they make while logged in, multiply the time they've been away by their gain rate. That way, you avoid updating until you actually need the new value.
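A sketch of that "update on read" idea, assuming each user row stores resources, rate_per_second, and last_update as a Unix timestamp (the column names are invented for illustration):

<?php
// Bring a user's resources up to date based on elapsed time.
function current_resources(array $user, $now)
{
    $elapsed = $now - $user['last_update'];   // seconds since last update
    $user['resources']  += $elapsed * $user['rate_per_second'];
    $user['last_update'] = $now;
    return $user;   // write this row back to the database afterwards
}

// Usage: refresh on page load and floor() the value for display.
$user = current_resources($user, time());
echo floor($user['resources']);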
For Travian-like resource management you need to keep track of when you last updated each user's resources. When you read the resource values (for a page refresh or similar), you add the amount gained since the 'last update time' (depending on the user's resource fields and bonuses) and send that value to the browser. You could also let a browser script calculate these amounts.
You might consider caching all resource amounts somehow, since these values are needed a lot; that would reduce the traffic to your database.
If a user finishes building a resource field, uses the market, builds a structure, etc., you need to update the stored amount of resources (and the 'last update time'), because you cannot easily account for these kinds of events otherwise.
By calculating the resources this way, the database load is reduced, since you do not need to write new values every time the user refreshes the page. It is also more accurate, since you have fewer rounding errors.
To keep the resources increasing between page refreshes you need a method like the one Frank Farmer described: embed the resource amount and the 'gain frequency' in some JavaScript and increase the amount by one every 'gain frequency'.
You can also calculate the resources each time the page or the JavaScript asks for them; you'd just need to store the last updated time.
It may be an old post, but it comes up right away in Google, so here's another option, which is how the game I've been developing does it.
I use client-side JavaScript with a Flash socket to get live updates from a dedicated game server running on the host.
I use the xmlsocket kit from http://devpro.it/xmlsocket/

Zend_Cache vs cronjob

I am working on my bachelor's project and I'm trying to resolve a simple dilemma.
It's a website for a football club. There is some information that will be fetched from the website of the national football association (basically the league table and match history). I'm trying to decide the best way to store this fetched data. I'm thinking about two possibilities:
1) I will set up a cron job that runs, let's say, every hour. It will call a script that fetches the league table and all the other data from the website and stores it in a flat file.
2) I will use a Zend_Cache object to do the same, except the data will be stored in cache files. The cache will be updated about every hour as well.
Which is the better approach?
I think the answer can be found in why you want to cache the file. Is it to place minimal load on the external server by only updating the cache every so often, or is it to keep pages loading fast because the file takes long to download or process?
If it's only to respect the other server, and fetching/processing the page adds no noticeable delay for the end user, I'd just implement Zend_Cache. It's simple: you don't have to worry about one script downloading the page and another script loading the downloaded data (plus the cron job).
If the cache is also needed because fetching/processing the page is significant, I'd still use Zend_Cache; however, I'd set the cache to expire every 2 hours and set up a cron job (or something similar) to manually refresh the cache every hour. Sure, this adds back the complexity of two scripts (or at least adding a request flag to manually refresh the cache), but should the cron job fail, you're still fine.
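For reference, a minimal Zend_Cache (Zend Framework 1) setup along those lines might look like this; the lifetime, paths, and fetch_league_table() are illustrative:

<?php
require_once 'Zend/Cache.php';

$cache = Zend_Cache::factory(
    'Core',                 // frontend
    'File',                 // backend
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cache/')
);

// load() returns false when the entry is missing or expired.
if (($table = $cache->load('league_table')) === false) {
    $table = fetch_league_table();   // your fetching/parsing code
    $cache->save($table, 'league_table');
}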
Well, if you choose option 1 it adds some complexity, because you have to use cron as well (not that cron is overly complex), and then you have to test that the data file is complete before using it, or deal with moving files from a temp location after they have been downloaded and parsed into the proper format.
If you use option 2, it eliminates much of that, except that on the request where the cache has expired you have to wait for the download/parse.
I would say 1 is the better option, but 2 is going to be easier to implement and less prone to error. That said, it's fairly trivial to add safeguards to the cron script to prevent the negatives I describe, so I would probably go with 1.

What's the ideal way to implement this PHP/XML function in a website?

I have this code written up to snatch some weather data and make it available on my website:
if ( ! $xml = simplexml_load_file('http://www.weather.gov/data/current_obs/KBED.xml') )
{
    echo 'unable to load XML file';
}
else
{
    $temp     = $xml->temp_f.' Degrees';
    $wind     = $xml->wind_mph;
    $wind_dir = $xml->wind_dir;
    $gust     = $xml->wind_gust_mph;
    $time     = $xml->observation_time;
    $pres     = $xml->pressure_in;
    $weath    = $xml->weather;
}
And then I just echo them out inside the tags I want them in. My site is low-traffic, but I'm wondering what the "best" way is to do something like this if traffic were to spike way up. Should I write the variables I want into a database every hour (when the XML is refreshed) with a cron job, to save hitting the weather server on every request, or would that be bad practice? I understand this is a bit subjective, but I have no one else to ask for "best ways". Thanks!
I would suggest the following:
When you first get the content of the XML, parse it and serialise it to a file, with a timestamp attached in some way (perhaps as part of the serialised data structure).
Every time the page loads, grab that serialised data and check the timestamp. If it's past a certain point, go and fetch the XML again and re-cache it, making sure to update the timestamp; if not, just use the cached data.
That should work. It means you only have to fetch the XML occasionally, and also, once the cache has expired, you don't waste effort fetching it regularly even though no one is visiting (since it is only refreshed on a request).
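A rough sketch of that idea, using the cache file's own modification time as the timestamp (the path and refresh interval are illustrative):

<?php
$cacheFile = 'weather.cache';
$maxAge    = 3600;   // the feed updates roughly hourly

if (is_file($cacheFile) && time() - filemtime($cacheFile) < $maxAge) {
    $weather = unserialize(file_get_contents($cacheFile));
} else {
    $xml = simplexml_load_file('http://www.weather.gov/data/current_obs/KBED.xml');
    if ($xml === false) {
        exit('unable to load XML file');
    }
    // SimpleXMLElement objects can't be serialized directly, so
    // copy the fields you need into a plain array first.
    $weather = array(
        'temp' => (string) $xml->temp_f,
        'wind' => (string) $xml->wind_mph,
        'time' => (string) $xml->observation_time,
    );
    file_put_contents($cacheFile, serialize($weather));
}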
Set up a cron job to periodically fetch the XML document, parse it and store the variables in a database.
When a page is requested, fetch the variables from the database and render your page.
It is a good idea to store the timestamp of the last update in the database as well, so that you can tell when the data is stale (because the weather website is down or so).
This setup looks very reasonable to me.
You could cache the output of the external site and let it renew itself, say, every 5-10 seconds. That would kill the impact of a lot of 'pings' from your site. It really depends on how important timing accuracy is to your customer/client.
In a high-traffic situation I would have a separate script that runs as a daemon or cron job, fetches the weather at a specified interval, and overwrites the public website page when done. That way you don't have to worry about caching, since it's done by a background task; your visitors are merely accessing a static page on the web server. That also avoids, or at least minimises, the need to bring a database into the equation, and it is fairly lightweight.
On the downside, it does create a second point of failure and could be pretty useless if the information needs to be accurate to the time of page access.
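Still, a minimal sketch of such a background script, assuming it is run from cron (the output path is an assumption and would need to match your web root):

<?php
// weather_cron.php - run from cron, e.g. hourly. Fetches the feed
// and writes a static snippet for the web server to serve as-is.
$xml = simplexml_load_file('http://www.weather.gov/data/current_obs/KBED.xml');
if ($xml !== false) {
    $html = sprintf(
        '<p>%s Degrees, wind %s mph (observed: %s)</p>',
        htmlspecialchars((string) $xml->temp_f),
        htmlspecialchars((string) $xml->wind_mph),
        htmlspecialchars((string) $xml->observation_time)
    );
    // Illustrative path: adjust to wherever your pages live.
    file_put_contents('/var/www/html/weather_snippet.html', $html);
}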
