How do I make a PHP script wait for user input

Here is my problem. For those of you who looked at the title and thought, "PHP does not wait on user input because it is a server-side language, and therefore your problem is a client-side one," just hear me out.
I'm making a game. It is a multiplayer game, therefore multiple users. At the start of every game round, the users involved in the game are prompted with a choice of things they want to do that round. Of course, the round isn't to commence until everyone has made a selection on what they want to do that round.
See, that is my problem. How do I make a script wait for ALL users to send a request (send input) before continuing execution? The server-side language is PHP. The answer can't lie with the client, as the client is only responsible for one user and wouldn't know what the other users are doing.
Thanks.

Fundamentally, you have two choices:
Choice one, you have each script check if it has all the data it needs, and then do all the work to calculate the next move (or whatever). That is actually much harder to get right than it sounds, because you run into problems of concurrency.
Basically, that approach leads to more than one "process" - page load - trying to do the same work on the same data, and that opens the door to races where you either don't do the work at all, or where you do it twice.
Choice two, which sounds harder, is where you write another PHP script that checks to see if it has all the moves, calculates the outcome, and updates the database (or whatever) in the background.
Then, run that off a cron job, or something like that, so you only have one instance running at a time. That makes life easier: your "is everything done" script is only running once, and so you don't have to worry about races - but there might be a lag between the last move being submitted and calculating the outcome.
That approach is actually easier in the long run, because while it involves more code and more moving parts, it actually avoids the hard problems (concurrency) in return for a few more easy problems (a bit more code, using cron).
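To make the shape of that concrete, here is a minimal sketch of such a background script; the schema (a rounds table with a player_count, a moves table) and the column names are assumptions for the example:

<?php
// resolve_rounds.php - run from cron, e.g. once a minute:
//   * * * * * php /path/to/resolve_rounds.php
// Hypothetical schema: rounds(id, player_count, resolved),
// moves(round_id, player_id, choice).
$db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$rounds = $db->query('SELECT id, player_count FROM rounds WHERE resolved = 0');

foreach ($rounds as $round) {
    $stmt = $db->prepare('SELECT COUNT(*) FROM moves WHERE round_id = ?');
    $stmt->execute(array($round['id']));

    // Skip rounds that are still waiting on somebody's move
    if ((int)$stmt->fetchColumn() < (int)$round['player_count']) {
        continue;
    }

    // ... calculate the outcome of the round here ...

    $db->prepare('UPDATE rounds SET resolved = 1 WHERE id = ?')
       ->execute(array($round['id']));
}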
You can improve on both of those, of course, but those are the fundamental models. Locking and other coordination techniques can make "calculate in the last page" work better, but they involve you addressing the races.
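For instance, with MySQL you can serialize the "calculate on the last page load" check behind a named lock, so only one request resolves the round; a minimal sketch (the lock name and the surrounding logic are made up for the example):

<?php
// Only one request at a time gets to ask "are all the moves in?"
$db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

// Wait up to 5 seconds for a server-wide named lock; in practice the
// round id would be part of the lock name.
$got = $db->query("SELECT GET_LOCK('resolve_round_42', 5)")->fetchColumn();

if ($got) {
    // Safe section: re-check whether all moves are in and, if so,
    // resolve the round exactly once.
    // ...
    $db->query("SELECT RELEASE_LOCK('resolve_round_42')");
}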
Using various "background job" tools can improve the latency of the second approach, by letting you trigger the check instantly rather than just on a timer. You still have some latency, but the user doesn't see as much of it.
Really, though, you get to pick one of those two strategies and go with it.
(Also, I strongly advise that if you can, grab a framework or something where someone else already solved these problems, then use that.)

Since you can't push data out, I'd tackle this as follows:
collect the submitted information on the server, e.g. in a DB
run a client-side script that periodically checks whether all required information is available
on the client, evaluate the response: if true, start the game - if not, "wait" = keep checking
So you need:
a script on the server that collects the info
a server script that checks whether all the info is available (sketched below)
a client script that checks periodically and evaluates the response
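A minimal sketch of that server-side check, which the client would poll every few seconds; the schema (players and moves tables) and the game_id parameter are assumptions for the example:

<?php
// ready_check.php - polled by the client; answers "has everyone moved yet?"
// Hypothetical schema: players(game_id, id), moves(game_id, player_id).
$db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$gameId = isset($_GET['game_id']) ? (int)$_GET['game_id'] : 0;

$players = $db->prepare('SELECT COUNT(*) FROM players WHERE game_id = ?');
$players->execute(array($gameId));

$moves = $db->prepare('SELECT COUNT(*) FROM moves WHERE game_id = ?');
$moves->execute(array($gameId));

// true once every player in this game has submitted a move
header('Content-Type: application/json');
echo json_encode(array('ready' => $moves->fetchColumn() >= $players->fetchColumn()));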

Related

PHP - How would you make buildings or such things finish etc?

So, as you know, in some browser games such as Travian or Tribalwars, you can construct a building; it takes X amount of time and then it finishes.
So I'm curious: how is that done?
Is it a cron job running every second, or what? And how do they handle troops? They can't have a cron job running every millisecond; that wouldn't be resource friendly, right?
I'm really curious about this, and since I have no idea, I can't really say I have tried. I have, however, searched around and never found anything helpful.
Thanks.
The easiest way to implement something like this is with simple timestamps. The front-end request generates a timestamp based on the constraints given by the details of the request (what type of building you are building, what level you are, whether you have bought the upgrade), and a timestamp for when completion occurs is inserted into the database. Then, if you want the browser to refresh when the job is up, you write a JS script that requests all timestamps in the queue and reloads when they come up.
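A minimal sketch of that timestamp pattern, assuming a buildings table with finishes_at and completed columns:

<?php
// start_build.php - schedule the completion when the build is requested.
$db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$playerId  = 1;           // hypothetical: would come from the session
$type      = 'barracks';  // hypothetical building type
$buildTime = 3600;        // seconds; depends on type, level, upgrades

$stmt = $db->prepare(
    'INSERT INTO buildings (player_id, type, finishes_at) VALUES (?, ?, ?)'
);
$stmt->execute(array($playerId, $type, time() + $buildTime));

// Later, any request can settle what has finished by now - no timer needed:
$done = $db->prepare(
    'UPDATE buildings SET completed = 1 WHERE finishes_at <= ? AND completed = 0'
);
$done->execute(array(time()));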
One way: you let the client handle the timer. The timer sits on the browser side, counting down in JavaScript. When the time is up, the client contacts the server to check that it's valid (never trust client-side code). The server looks up the building and sees whether it would have finished by then. The server side doesn't need to keep any timers; it just answers requests. Timers are UI-side.
Well, PHP is one of the worst things you could use to build a game... In games, everything is controlled by the main loop. So basically, in a game everything runs inside an infinite loop, albeit one that allows for user input without freezing the computer, obviously. That loop takes care of the timing as well, and the way timing is computed will depend on the language the game is developed in. For web-based games, Java, JavaScript and Flash are the usual choices.

How to process massive data-sets and provide a live user experience

I am a programmer at an internet marketing company that primarily makes tools. These tools have certain requirements:
They run in a browser and must work in all of them.
The user either uploads something (.csv) to process or they provide a URL and API calls are made to retrieve information about it.
They move around THOUSANDS of lines of data (think large databases). These tools literally run for hours, usually overnight.
The user must be able to watch live as their information is processed and presented to them.
Currently we are writing in PHP, MySQL and Ajax.
My question is: how do I process LARGE quantities of data and provide a live user experience while the tool is running? Currently I use a custom queue system that sends Ajax calls and inserts rows into tables or data into divs.
This method is a huge pain in the ass and can't possibly be the correct one. Should I be using a templating system, or is there a better way to refresh chunks of the page with A LOT of data? And I really do mean a lot of data, because we come close to maxing out PHP's memory limit, which is something we always have to watch for.
Also, I would love to make it so these tools could run on the server by themselves - I mean, upload a .csv, close the browser window, and then have an email sent to the user when the tool is done.
Does anyone have any methods (programming standards) for me that are better than using Ajax calls? Thank you.
I wanted to update with some notes in case anyone has the same question. I am looking into the following to see which is the best solution:
SlickGrid / DataTables
GearMan
Web Socket
Ratchet
Node.js
These are in no particular order and the one I choose will be based on what works for my issue and what can be used by the rest of my department. I will update when I pick the golden framework.
First of all, you cannot handle big data via Ajax alone. To let users watch the process live, you can use web sockets. As you are experienced in PHP, I can suggest Ratchet, which is quite new.
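As a rough illustration, here is a minimal push server following the hello-world structure from Ratchet's documentation; the ProgressPusher class name, the port, and the message format are assumptions for the example:

<?php
// push_server.php - run from the CLI; browsers connect to ws://host:8080
require 'vendor/autoload.php';

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

class ProgressPusher implements MessageComponentInterface {
    protected $clients;

    public function __construct() {
        $this->clients = new \SplObjectStorage;
    }

    public function onOpen(ConnectionInterface $conn) {
        $this->clients->attach($conn);   // remember each connected browser
    }

    public function onMessage(ConnectionInterface $from, $msg) {
        foreach ($this->clients as $client) {
            $client->send($msg);         // relay progress updates to everyone
        }
    }

    public function onClose(ConnectionInterface $conn) {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e) {
        $conn->close();
    }
}

$server = IoServer::factory(
    new HttpServer(new WsServer(new ProgressPusher())),
    8080
);
$server->run();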
On the other hand, for making the calculations and storing big data, I would use NoSQL instead of MySQL.
Since you're kind of pinched for time already, migrating to Node.js may not be feasible right now. It would, though, help with notifying users when their results are ready, as it can push browser notifications without polling. And as it uses JavaScript, you might find some of your client-side code is reusable.
I think you can run what you need in the background with some kind of queue manager. I use something similar with CakePHP, and it lets me run time-intensive processes in the background asynchronously, so the browser does not need to stay open.
Another plus is that this approach is scalable, as it's easy to increase the number of queue workers running.
Basically, with PHP you just need a cron job that runs every once in a while and starts a worker that checks a queue database for pending tasks. If none are found, the worker keeps running in a loop until one shows up.
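A minimal sketch of that worker; the jobs table, its columns, and the lock-file path are assumptions for the example:

<?php
// worker.php - started by cron; exits immediately if one is already running.
// Hypothetical schema: jobs(id, payload, status).
$lock = fopen('/tmp/tool_worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another worker instance already holds the lock
}

$db = new PDO('mysql:host=localhost;dbname=tools', 'user', 'pass');

while (true) {
    $stmt = $db->query(
        "SELECT id, payload FROM jobs WHERE status = 'pending' LIMIT 1"
    );
    $job = $stmt->fetch(PDO::FETCH_ASSOC);

    if (!$job) {
        sleep(5); // nothing pending; poll again shortly
        continue;
    }

    // ... process $job['payload'] (parse the CSV, make the API calls, etc.) ...

    $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
       ->execute(array($job['id']));
}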

turn based game score recording

I'm developing a very, very basic turn-based game using php and jquery and I'm looking at two different ways of keeping track of the current user's score:
1) global JavaScript variables - for example, var currentScore at the beginning of the JS. The game action and turns are all controlled via Ajax, so I don't have to worry about a page refresh losing the variable data.
2) mysql - create a row with currentScore, user, etc. and access/update it every turn.
I'm trying to balance a) load speed and b) making the score tamper-proof. I'm thinking local JavaScript would be faster with less load time, but MySQL records would be more tamper-proof. Does anyone have advice as to which is faster and which is more tamper-proof, or perhaps another way of accomplishing this that I didn't list above?
Run your game in PHP, not in JS.
What I mean to say is: instead of allowing the player's computer to control the action and send the results back to the server...
...that allows people to hijack the client and send messages to your PHP like auto-firing pistols...
...or headshot scripts... or speed-hacking...
...or, even worse, sending in messages like "I just scored 500 points on my turn" and having your PHP script go: "Okay!".
So instead, the core of the game engine should run in PHP, the client should just say: "My character wants to move X squares.", and then the server can say: "No, you're a cheating tool, you can only move 3 squares.", and then the client will have to adhere to those rules.
In this regard, PHP will be 100% in control of the score-keeping.
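A minimal sketch of that server-authoritative pattern; the move rule, the session field, and the scoring formula are assumptions for the example:

<?php
// move.php - the client only *requests* a move; PHP decides what is legal.
session_start();

$db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$playerId  = $_SESSION['player_id']; // identify the player server-side
$requested = isset($_POST['squares']) ? (int)$_POST['squares'] : 0;

$maxMove = 3; // the rule lives on the server, not in the client's JS

header('Content-Type: application/json');

if ($requested < 1 || $requested > $maxMove) {
    http_response_code(400);
    exit(json_encode(array('error' => "You can only move up to $maxMove squares.")));
}

// Apply the move and update the authoritative score in the DB
// (the scoring formula here is just a placeholder)
$db->prepare('UPDATE players SET position = position + ?, score = score + ? WHERE id = ?')
   ->execute(array($requested, $requested * 10, $playerId));

echo json_encode(array('ok' => true));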
Any data that is not stored on the server can be tampered with, and any data sent to the server can be doctored. Not only should the server store all of the game data, it should also validate all incoming data from the client - for instance: do the rules actually allow this player to use the move they are telling me they are using? Otherwise, it will be fairly easy to cheat. Then again, your project may not require that amount of scrutiny.
Both.
Never trust JavaScript in games; there will always be a clever player who will mess with it.
Use JavaScript for the GUI part and for controlling the game, but always check ALL results in PHP, especially player-specific values. Check that it's the right player!! Otherwise somebody will mess with your game.
Don't worry about speed; just script your game (with speed and data in mind, of course) and investigate when you hit performance problems. One issue is important from the beginning, though: think about your database tables and queries. They are more likely to become your performance bottleneck than bad PHP scripting.

System with two asynchronous processes

I'm planning to write a system which should accept input from users (from browser), make some calculations and show updated data to all users, currently visiting certain website.
Input can come once an hour, but it can also come 100 times each second. It is VERY important not to lose any of the user inputs, but to really register and process ALL of them.
So the idea was to create two programs. One will receive data (input) from the browser and store it somehow in a queue (maybe an array, to be really fast?). The second program should wait until there are new items in the queue (saving resources) and then become active and begin to process the queue items. Both programs should run asynchronously.
I know PHP, so I would write the first program in PHP. But I'm not sure about the second part... in particular, how to send an event from the first program to the second. I need some advice at this point. Threads are not possible with PHP? I need some ideas on how to create a system like I described.
I would use a comet server to communicate feedback to the website the input came from (this part is already tested).
As per the comments above, on the surface you appear to be describing a message queueing / processing system; however, looking at your question in more depth, this is probably not the case:
Both programs should run asynchronously.
Having a program which processes a request from a browser but does it asynchronously is an oxymoron. While you could handle the enqueueing of a message after dealing with the HTTP request, it's still a synchronous process.
It is VERY important not to loose any of user inputs
PHP is not a good language for writing control systems for nuclear reactors (nor, according to Microsoft, is Java). HTTP and TCP/IP are not ideal for real time systems either.
100 times each second
Sorry - I thought you meant there could be a lot of concurrent requests. This is not a huge amount.
You seem to be confusing the objective of using COMET / Ajax with asynchronous processing of the application. Even with very large amounts of data, it should be possible to handle the interaction using a single PHP script working synchronously.
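A minimal sketch of that single synchronous script, assuming an inputs table; because the insert is committed before the response is sent, no input is lost:

<?php
// receive.php - handles each browser request synchronously.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Store the input durably *before* acknowledging it
$stmt = $db->prepare('INSERT INTO inputs (payload, received_at) VALUES (?, NOW())');
$stmt->execute(array(file_get_contents('php://input')));

// ... run the calculation and push the updated data out via the comet server ...

echo json_encode(array('stored' => true));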

Offloading script function to post-response: methods and best-practices?

First,
the set up:
I have a script that executes several tasks after a user hits the "upload" button that sends the script the data it needs. Now, this part is currently mandatory; we don't have the option at this point to cut out the upload and draw from a live source.
This section is intentionally long-winded to make a point. Skip ahead if you hate that.
Right now the data is parsed from a really funky source using regex, then broken down into an array. The script then checks the DB for any data already in the uploaded data's date range. If the date ranges don't already exist in the DB, it inserts the data and reports success to the user (there are also some security checks, data-source validation, and basic upload validation)... If the data does exist, the script gets the data already in the DB, finds the differences between the two sets, deletes the old data that doesn't match, adds the new data, and then sends an email to each person affected by these changes (one email per person, with all relevant changes in said email, which is a whole other step). The email addresses are pulled via an LDAP search, as our DB has their work email but the LDAP has their personal email, which ensures they get the email before they come in the next day and get caught unaware. Finally, the data-uploader is told "Changes have been made, emails have been sent," which is really all they care about.
Now I may be adding a Google Calendar API that posts the data (when it's scheduling data) to the user's Google Calendar. I would do it via their work calendar, but I thought I'd get my toes wet with Google's API before dealing with setting up a WebDav system for Exchange.
</backstory>
Now!
The practical question
At this point, pre-Google integration, the script takes at most a second and a half to run. It's pretty impressive, at least I think so (the server, not my coding). But the Google bit, in tests, is SLOOOOW. We can probably fix that, but it raises the bigger question...
What is the best way to off-load some of the work after the user has gotten confirmation that the DB has been updated? This is the part he's most concerned with and the part most critical. Email notifications and Google Calendar updates are only there for the benefit of those affected by the upload, and if there is a problem with these notifications, he'll hear about it (and then I'll hear about it) regardless of the script telling him first.
So is there a way, for example, to run a cron job that's triggered by a script's last execution? Can PHP create cron jobs via exec()? Is there some normalized way of handling post-execution work that needs doing?
Any advice on this is really appreciated. I feel like the script's bloatedness reflects my stage of development and my need to finally learn how to do division of labor in web apps.
But I also worry that this just isn't done, as users need to know when all tasks are completed, etc. So this brings up:
The best-practices/more-subjective question
Basically, is the idea that progress bars, real-time offloading, and other ways of keeping the user tethered to the script are --when combined with optimization of the code, of course-- the better, more-preferred method than simply saying "We're done with your part; if you need us, we'll be notifying users," etc.?
Are there any BIG things to avoid (other than obviously not giving the user any feedback at all)?
Thanks for reading. The coding part is crucial, so don't feel obliged to cover the second part, but don't forget to cover the coding part!
A cron job is good for this. If all you want to do when a user uploads data is say "Hey user, thanks for the data!" then this will be fine.
If you prefer a more immediate approach, then you can use exec() to start a background process. In a Linux environment it would look something like this:
exec("php /path/to/your/worker/script.php >/dev/null &");
The & part says "run me in the backgound." The >/dev/null part redirects output to a black hole. As far as handling all errors and notifying appropriate parties--this is all down to the design of your worker script.
For a more flexible cross-platform approach, check out this PHP Manual post
There are a number of ways to go about this. You could exec(), as the above says, but you could potentially run into a DoS situation if there are too many submit clicks. The pcntl extension is arguably better at managing processes like this. Check out this post to see a discussion (there are 3 parts).
You could use JavaScript to send a second Ajax post that runs the appropriate worker script afterwards. By using ignore_user_abort() and sending a Content-Length header, the browser can disconnect early, but your Apache process will continue to run and process your data. The upside is no forkbomb potential; the downside is that it ties up more Apache processes.
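A minimal sketch of that early-disconnect trick; the response body and the work after it are placeholders (note that output compression, e.g. mod_deflate, can interfere with the Content-Length header):

<?php
// The browser gets its answer immediately; the heavy work continues after.
ignore_user_abort(true);  // keep running even if the client disconnects
set_time_limit(0);        // don't let the execution time limit kill the work

ob_start();
echo json_encode(array('status' => 'accepted')); // what the user sees
header('Connection: close');
header('Content-Length: ' . ob_get_length());    // lets the browser stop waiting
ob_end_flush();
flush();                  // push the response out now

// ... the long-running work (emails, API calls, etc.) happens here ...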
Yet another option is to use a cron job in the background that looks at a process-queue table for things to do 'later': you stick items into this table on the front end and remove them on the back end while processing (see Zend_Queue).
Yet another is to use a more distributed job framework like gearmand - which can process items on other machines.
It all depends on your overall capabilities and requirements.
