Using a PHP script, I need to manage data sent to the script as URL query variables.
The URL sent is something like: http://hawkserv.co.uk/heartbeat.php?port=25565&max=32&name=My%20Server&public=True&version=7&salt=wo6kVAHjxoJcInKx&players=&worlds=guest&motd=testtet&lvlcount=1&servversion=67.5.0.1&hash=randomhash&users=0
(clicking the link returns a formatted table of the results)
What is the best method of storing this information for it to be used in a formatted HTML page?
Multiple URLs will be sent to the script, with different values. The script needs to store each submission to be used later, and also "time out" entries that haven't been updated in a while.
Example scenario:
Three servers exist: Server 1, Server 2, and Server 3. Each of these servers sends the above URL every 45 seconds, with a few values changed per server. A formatted table can display the information when the page is requested, and on refresh the page picks up any new information the servers have sent.
Server 1 goes offline and doesn't send any more requests. The script accounts for this lack of requests and removes Server 1's information from the list, declaring it offline.
Although code would be greatly appreciated, I think I can work from just a description of the best way of doing it. Is it storing each URL as an array in a file and reading the file when needed, or is there some other way?
I would store the variables plus the time the request was received in a database. The database can be a SQLite one if you don't want to go through the hassle of setting up a full-blown system. The advantage of SQLite over dumping arrays to a file is that you can run flexible queries without coding up parsing routines and the like.
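For concreteness, here's a minimal sketch of that approach using PDO with the SQLite driver. The file name heartbeats.db, the column list, and the 90-second timeout are assumptions for illustration, not from the question:

<?php
$db = new PDO('sqlite:heartbeats.db');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE IF NOT EXISTS servers (
    name        TEXT PRIMARY KEY,
    port        INTEGER,
    max_players INTEGER,
    users       INTEGER,
    motd        TEXT,
    last_seen   INTEGER)");

// Store or refresh the server that just sent a heartbeat.
$stmt = $db->prepare("REPLACE INTO servers
    (name, port, max_players, users, motd, last_seen)
    VALUES (?, ?, ?, ?, ?, ?)");
$stmt->execute(array($_GET['name'], $_GET['port'], $_GET['max'],
                     $_GET['users'], $_GET['motd'], time()));

// When building the HTML table, list only servers seen within the last
// 90 seconds (two missed 45-second heartbeats); anything older is offline.
$stmt = $db->prepare("SELECT * FROM servers WHERE last_seen > ?");
$stmt->execute(array(time() - 90));
$servers = $stmt->fetchAll(PDO::FETCH_ASSOC);

The timeout query also handles the "Server 1 goes offline" case: stale rows simply stop matching, so the server drops out of the table without any cleanup job.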
Related
I have a question about PHP and how it works on the server side.
I'm thinking about creating an array of users in PHP and then accessing it. When a user accesses the website, my script will push the username onto the array. Can I access this array on another page without using a database to store the usernames?
Without a database, use sessions. Add session_start() on every page where you want to access your users array, and then $_SESSION["session_variable_name"] = $users_array to assign your array to a session variable. Then if you want to use it, just access it like a usual variable: echo $_SESSION["session_variable_name"][0].
But using a database would be much more appropriate. Session variables are accessible only within that session, so if a new user is added to the array, only the client who added it will see it. Each user will have his own session, where the user array may be completely different from what others see.
What I'd do: after a successful login to the system, assign the username to a session variable, and then if I want to perform a task for that user - say, get his email address from the database - I'd make an SQL query selecting the email from users where the username equals the username stored in the session.
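For illustration, a rough sketch of that flow; the users table, its columns, and the $pdo connection are assumptions, not from the answer:

<?php
// login.php - after the credentials check out:
session_start();
$_SESSION['username'] = $username;

// profile.php - any later request from the same visitor:
session_start();
$stmt = $pdo->prepare("SELECT email FROM users WHERE username = ?");
$stmt->execute(array($_SESSION['username']));
$email = $stmt->fetchColumn();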
No, this won't work as you described. You can either use a database (typical with PHP is MySQL, but there are many other options) or a file (JSON or any of a number of other formats). But the key is understanding why it won't work:
I think you are looking at PHP on the server as one system serving many users. However, from the perspective of PHP, each user is seeing a fresh and separate instance of the PHP system. Depending on how the server is actually configured, there may or may not be multiple PHP processes running at the operating-system level. But at the application level, each instance (run) of a PHP program is totally independent, with its own copy of all local (RAM) data storage - including arrays. That is NOT necessarily the case in all languages, but for practical purposes you can treat a PHP server process as if it were a separate server brought online to serve the one current user, then rebooted to serve the next user.

With that setup, it should be clear that the only way for the server process serving the 2nd user to see what the 1st user did is if the process serving the 1st user stored the information in non-volatile storage - e.g., MySQL or a file on disk. The advantages of a database over a simple file are that it allows fast and easy queries of the information and hides most of the little details of cooperation between the different server processes - e.g., if there are 10 users all accessing the server at basically the same time, the individual PHP server processes don't have to worry about who can write the file "now".
In addition to the problem of each server process seeing (or not seeing) the data collected from other users, even within a single user the data is not carried over. Each PHP page request, including regular POST or GET and AJAX calls, starts a fresh PHP instance (again, logically speaking - the server code may actually be resident and not reloaded, but it runs "fresh" each time), so with the exception of session data (as others have rightly suggested) no data is carried over between pages, even for an individual user. Session data is normally limited in size and is often just used as a reference into server-side data stored in files or a database.
First you need to prepare the user details array, as below:
$userDetailArr = array(
    'username' => 'foobar',
    'id'       => 1,
    'email'    => 'foobar@foo.com'
);
Now call session_start() and store the above array in the $_SESSION superglobal:
$_SESSION['userdetail'] = $userDetailArr;
You can then use $_SESSION['userdetail'] anywhere you want, but make sure to call session_start() at the top of the page before using any session variables.
A while back I wrote a rather long JavaScript procedure for organizing data we receive at work. The user simply pastes in the mess we get, and the script throws out all the worthless info and generates a nice cleaned-up data table.
I would like to add the ability to then transfer the processed information to the MySQL database. I'm growing a bit more comfortable using JavaScript, but I don't have close to the time or know-how to recreate the long processing procedure in PHP. How should I prep the data in JavaScript to most efficiently hand it off to the server and have PHP insert it into MySQL tables?
The less PHP on the server side, the better, although I doubt it would be safe to have a PHP page that blindly followed any instructions a referring page might send it.
At this point, the data my script presents in the browser looks a lot like MySQL records already.
ex.
(Wilson, Paul, 1000400, A399)
(Smalls, Kalah, 4993944, B11)
(Chase, Danny, 244422, B133)
(Larson, Jay, 3948489, J39)
...
Thanks!
If you can, get the data into a JSON array. Then on the PHP side use json_decode to pull the data in as an array, loop through it, and do your updates and inserts for your data in MySQL.
http://php.net/manual/en/book.json.php
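As a hedged sketch of what that could look like on the PHP side: the "rows" field name and the records table with its columns are made up for illustration. The browser would POST something like JSON.stringify(dataTable) in that field:

<?php
// Expects e.g. rows=[["Wilson","Paul","1000400","A399"], ...]
$rows = json_decode($_POST['rows'], true);
if (!is_array($rows)) {
    http_response_code(400);
    exit('Bad payload');
}

$pdo = new PDO('mysql:host=localhost;dbname=work', $dbUser, $dbPass);
$stmt = $pdo->prepare("INSERT INTO records (last_name, first_name, account, code)
                       VALUES (?, ?, ?, ?)");
foreach ($rows as $r) {
    $stmt->execute($r);   // bound values, never raw SQL from the client
}

The prepared statement also addresses the safety worry above: the page only ever runs its own fixed queries, with the JSON values bound as data, rather than following instructions from the referring page.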
Caveat: I know this has the potential to be a ridiculously stupid question, but I had the thought and want to know the answer.
Aim: run an interactive session between browser and server with a single request, without AJAX or WebSockets etc.
Scenario: a PHP file on the server receives data by the POST method from a user. The Content-Length in the header is 8 MB, so the server keeps the connection open until it receives the full 8 MB of data. But on the user side we deliver this data very, very slowly (simulating a terrible connection, for example), so the server receives the data a few bits at a time. [Can these be passed to the PHP file to process a few bits at a time, or does it only get passed once all the data is received?] The script then does whatever it wants with those bits and delivers them to the browser in an echo loop. At certain time intervals, the user injects new data into the 'stream', surrounded by a continuous stream of padding data.
Is any of that possible? Or even with CGI? I expect this isn't really possible, but what stops the process timing out if someone does have a terrible connection and the POST data is huge?
As far as I know you could do this, but the PHP file you are calling with the POST data will only be invoked by the webserver once it has received all the data. Otherwise, say you were sending an image with POST, and your PHP script moved that image from the temp-files directory to another directory before all the data had been received: you would have a corrupt image, nothing more, nothing less.
As long as the ini configuration has been altered correctly, I would think so. But it would be a great test to try out!
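The directives that matter here, as an illustrative php.ini snippet (the values are arbitrary examples, not recommendations):

; php.ini
post_max_size      = 16M   ; must exceed the 8 MB Content-Length
max_input_time     = 600   ; seconds PHP may spend receiving/parsing request data
max_execution_time = 300   ; seconds the script may run after input is parsed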
Normally I try to format my question as a basic question and then explain my situation, but the solution I'm looking for might be the wrong one altogether, so here's the problem:
I'm building a catalog application for an auction website that has the ability to save individual lots. So far this has worked great by simply creating a cookie with a comma-separated list of IDs for those lots, via something like this:
$_COOKIE["MyLots_$AuctionId"] = implode(",",$arrayOfIds);
The problem I'm now hitting is that when I go to print the lots, I'm using wkhtmltopdf through the command-line to request the url of the printout I want, like this:
exec("wkhtmltopdf '$urlofmylots' filename.pdf");
The problem is that I can't pass a cookie to this call, because Apache sees an internal request, not the request of the user. I tried putting it in the GET string, but once I exceed a pre-set limit on GET parameter size, that value disappears from the $_GET array on the target URL. I can't seem to find a way to send POST data between them. My next possible ideas are the following:
Maybe just pass the session ID in the URL, and see if there's a way that I can use PHP to dig through the cookies for that session and pull the right cookie, but that sounds like it'd be risky security-wise for a PHP server to allow (letting one session be aware of another). Example:
exec("wkhtmltopdf '$urlofmylots?sessionId=$sessionIdFromThisRequest' filename.pdf");
Possibly set a session variable and then pass that session ID, and see if I can use PHP to wade through that information instead (rather than using the cookie).
Would I be able to just create an array and somehow have that other script be aware of it, possibly by including it? That doesn't really solve the problem of wkhtmltopdf expecting a web-facing address as its first parameter.
(Not really an idea, but some reasoning.) In other instances of using this, I've just passed an ID to the script that generates the markup for wkhtmltopdf to parse, and the script uses that ID to get data from the database. I don't want to store this data in a file or the database simply to transfer it from the caller to the callee in this case. Cookies and sessions seem cleaner, since Apache/PHP handle memory allocation for these sessions.
The ultimate problem here is that I'm trying to get my second script (referenced here by $urlofmylots) to be aware of data available to the calling script but it's being executed as if it were an external web request, not two php scripts being called from the web root.
Can anyone offer some insight here?
You might consider rendering whatever the output of $urlofmylots?lots=$lots_to_print would be to a temporary file and running wkhtmltopdf against that file.
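A sketch of that idea; render_lots() is a hypothetical stand-in for whatever currently builds the printable markup at $urlofmylots:

<?php
// Build the markup in-process, where the session and cookie data are visible.
$html = render_lots($arrayOfIds);   // hypothetical helper

$tmp = tempnam(sys_get_temp_dir(), 'lots') . '.html';
file_put_contents($tmp, $html);
exec('wkhtmltopdf ' . escapeshellarg($tmp) . ' filename.pdf');
unlink($tmp);

Since the markup is generated inside the calling script, it can read $_COOKIE and $_SESSION directly, and wkhtmltopdf never needs to make a web request at all.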
OK, I didn't really know how to formulate this question, and especially not the title. But I'll give it a try and hope I'm being specific enough, while trying to keep it relevant to others.
If you want to run a PHP script in the background (via AJAX) every X seconds that returns data from a database, what is the best way to do this without using too much of the server's resources?
My solution looks like this:
A user visits a webpage; every X seconds that page runs some JavaScript. The JavaScript calls a PHP script/file that queries the database, retrieves the data, and returns it to the JavaScript. The JavaScript then prints the data to the page. My fear is that this way of solving it will put a lot of pressure on the server if there are a lot of (say, 10,000) simultaneous visitors on the page. Is there another way to do this?
That sounds like the best way, given the spec/requirement you set out.
Another way is to have an intermediary step. If you are going to have a huge amount of traffic (otherwise this introduces no benefit, and may to the contrary overcomplicate/slow the process), add another table that records the last time a dataset was pulled, plus a hard file (say, XML) which, if the 'last time' is deemed too long ago, is regenerated from a new query; this XML then feeds the result returned to the user. There's a sketch after the list below.
So:
1. JavaScript calls the PHP script (AJAX).
2. PHP checks the DB table that records the last time the data was fully output.
3. If too much time has passed, the 'main' query is rerun and the XML file is regenerated from the output; ELSE skip to 4.
4. Fetch the XML file and output it as appropriate for the returned AJAX.
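A hedged sketch of steps 2-4, simplified to use the cache file's own mtime instead of a separate DB table; latest.xml, the 60-second threshold, and build_xml_from_db() are assumptions:

<?php
$cache = 'latest.xml';
if (!file_exists($cache) || time() - filemtime($cache) > 60) {
    // Too old: rerun the 'main' query and regenerate the file (step 3).
    $xml = build_xml_from_db();   // hypothetical wrapper around the heavy query
    file_put_contents($cache, $xml, LOCK_EX);
}
// Step 4: every other request inside the window is just a cheap file read.
header('Content-Type: text/xml');
readfile($cache);

With 10,000 visitors polling, the database then sees at most one heavy query per minute instead of one query per visitor per interval.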
You can do it the other way around, contacting the client just when you need to and wasting fewer resources.
Comet is the way to go for this option:
Comet is a programming technique that enables web servers to send data to the client without having any need for the client to request it. This technique will produce more responsive applications than classic AJAX. In classic AJAX applications, the web browser (client) cannot be notified in real time that the server data model has changed. The user must create a request (for example by clicking on a link) or a periodic AJAX request must happen in order to get new data from the server.
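For a flavour of what Comet looks like in plain PHP, here's a minimal long-polling sketch; get_new_data() and the 30-second cap are assumptions for illustration:

<?php
set_time_limit(40);   // a little headroom past the 30-second loop below
$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;

for ($i = 0; $i < 30; $i++) {
    $data = get_new_data($since);   // hypothetical: returns null if nothing new
    if ($data !== null) {
        header('Content-Type: application/json');
        echo json_encode($data);
        exit;
    }
    sleep(1);   // hold the connection open and check again
}
echo json_encode(null);   // nothing new; the client simply reconnects

Note that each held-open connection still ties up a PHP worker, so for very large visitor counts a dedicated Comet/event server is usually preferred over plain PHP.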