MySQL, flat file, or PHP GET request for JSON - PHP

A question about best practices / scaling of a web application.
What would be the best way to access application settings that all users rely on and cannot modify? The data needs to be served as JSON; the application is written in PHP / JS.
I have three approaches in mind:
1. Serve the JSON from a PHP GET request that calls a final method:
final function get_requestJSON()
{
    return json_encode(array( .... ));
}
2. Serve it as a flat file with the JSON written in it and read it with file_get_contents().
3. Store it as text in a MySQL backend and access it directly through a model.
Which one of those would be considered fastest?
Any thoughts?

Don't worry about speed in this case (unless you know that the wrong solution is unacceptably slow) and instead think of your requirements. This is especially true because if you are issuing server calls from a client, there's going to be latency anyway that will likely dwarf the time for this operation.
Also, the scenarios are not mutually exclusive. You can store things in a database, and use JSON as your protocol for sending things back and forth (for example).
I recommend you instead think about all the things you can see doing with this information, both now and in the future, and let that guide you towards a decision.
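To illustrate the "not mutually exclusive" point, here is a minimal sketch of one possible combination: settings live in MySQL, a PHP endpoint caches them into a flat JSON file, and every request after the first is served straight from that file. The connection details, table name, and cache path are assumptions for the example, not anything from the question.
<?php
// settings.php - hypothetical endpoint: MySQL as the source of truth, flat file as a cache
$cacheFile = __DIR__ . '/cache/settings.json'; // assumed cache location

if (!is_file($cacheFile)) {
    // Rebuild the cache from the database (hypothetical "settings" table with name/value columns)
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $rows = $pdo->query('SELECT name, value FROM settings')->fetchAll(PDO::FETCH_KEY_PAIR);
    file_put_contents($cacheFile, json_encode($rows), LOCK_EX);
}

header('Content-Type: application/json');
echo file_get_contents($cacheFile);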

Taking it all into consideration, a PHP library with methods is the best solution for me. Thanks for the info, though.

Related

Best ways to display progress on the server with PHP

What are the best ways to display the progress of work being done on the server?
XHR processing request
iframe method (I don't prefer this method)
server-sent events
polling, with flash messages stored in a $_SESSION variable by the main PHP script
Which is the cleanest and most widely used way?
The use case is, for example, when we upload ZIP files and I want to track the progress of unpacking, copying files, etc.
Your question is a bit vague - there is no "best" way, each one has advantages and disadvantages. Try to explain what is important to you.
For example, the most clean, scalable and performant way would probably be to push an update on progress to the client using WebSockets. However, this may require some special implementation on the server and client side and may be a bit complex.
On the other hand, if you want simplicity and ease of implementation and are willing to sacrifice performance and scalability, going with a $_SESSION + Ajax polling solution could be better. However, you should be careful, as some session storage backends lock the session, which means you cannot have one PHP request updating it while another is attempting to read from it. You can instead use DB / memcache / file storage (depending on your scale needs and server setup) to store this information.
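For the polling variant, a minimal sketch might look like the following; the temp-file location, the job parameter, and the JSON progress format are all assumptions for illustration. The worker writes its progress to a file while the client polls a tiny endpoint:
<?php
// worker.php - the long-running task (e.g. unpacking an uploaded ZIP)
session_write_close(); // release the session lock so polling requests are not blocked

$jobId        = preg_replace('/\W/', '', isset($_GET['job']) ? $_GET['job'] : 'demo'); // assumed job id
$progressFile = sys_get_temp_dir() . "/progress_$jobId.json";
$files        = array('a.txt', 'b.txt', 'c.txt'); // placeholder work items

foreach ($files as $i => $file) {
    // ... unpack / copy $file here ...
    file_put_contents($progressFile, json_encode(array('done' => $i + 1, 'total' => count($files))), LOCK_EX);
}

// progress.php - polled by the client via Ajax
$jobId = preg_replace('/\W/', '', isset($_GET['job']) ? $_GET['job'] : 'demo');
$file  = sys_get_temp_dir() . "/progress_$jobId.json";
header('Content-Type: application/json');
echo is_file($file) ? file_get_contents($file) : json_encode(array('done' => 0, 'total' => 0));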

PHP: fastest way to save form data without any processing

I am accepting rather complex form data from clients (web and mobile) on a PHP server. When I receive the data, I don't need to process it in any way; I just need to store it somewhere and let the client know that the form was successfully submitted. I batch-process the data later.
What's the best way to quickly store the incoming form data?
Two functions seem relevant in this context:
serialize -- http://php.net/manual/en/function.serialize.php
var_export -- http://php.net/manual/en/function.var-export.php
Also, instead of sending this serialized data to the database, it would be a lot faster to append it to some text file, right?
Can someone give me more information about these things?
If you just need the raw text from the POST, you can read the plain POST body without any serialization:
echo file_get_contents('php://input');
Appending to a file is pretty fast. But remember that you will also need to manage this file.
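If you do go the append-to-a-file route, a minimal sketch of the idea with an exclusive lock is below; the target path is an assumption, and the concurrency caveats raised in the other answer still apply.
<?php
// collect.php - hypothetical collector: append each raw POST body as one line
$raw = file_get_contents('php://input');
file_put_contents('/var/data/forms.log', $raw . PHP_EOL, FILE_APPEND | LOCK_EX);
echo 'ok';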
P.S. It looks like premature optimization.
I would consider
base64_encode(serialize($_POST));
// assuming $array is the returned mySQL column (BLOB type)
unserialize(base64_decode($array));
Please note that base64_encode is optional and depends on the data stored, which may or may not need to be encoded.
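For completeness, here is a minimal sketch of the store/read-back round trip with PDO; the connection details and the form_submissions table with its payload column are assumptions for the example.
<?php
// store the serialized form data on submit, read it back during batch processing
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// store
$stmt = $pdo->prepare('INSERT INTO form_submissions (payload) VALUES (?)');
$stmt->execute(array(base64_encode(serialize($_POST))));

// read back later
$payload = $pdo->query('SELECT payload FROM form_submissions LIMIT 1')->fetchColumn();
$data    = unserialize(base64_decode($payload));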
The fastest way to save the data is not to use PHP, but to write a C extension for whatever web server you're using that simply saves the file. Fastest is not the best, though. Unless you're talking about trillions of requests per hour, you're better off doing at least some processing inline and storing the submissions in a relational DB or object DB. I would personally go with MySQL or Mongo since I have some experience with them, but many other datastores could be appropriate.
Trying to outsmart databases by "simply appending to a file" will likely get you into a lot of trouble. Databases already have a lot of support for concurrent operations, optimizing writes, caching, transactional support, etc... If you "append" to a file from multiple web requests concurrently, you'll end up with corrupt data.
I know it's a bit of a non-answer, but it's been my experience that the type of question you asked is misguided at best. If you truly, absolutely need to do something "the fastest way possible", you're likely not going to be asking about it on Stack Overflow, because you'll have gotten yourself into a cutting-edge development environment where you have a decade-plus of experience in creating such implementations.

are 'long-lived' php objects possible?

So, I know that the general idea in PHP is that the whole application is loaded and executed every time the page loads. But for optimization of a sizable object-oriented PHP app that needs to be portable… is it possible to load an object into memory that can be used for every request, not recreated for each one?
I've seen people using the $_SESSION variable for something like this, but that seems like it is a) ugly, b) will take up a lot of space on the server, and c) doesn't really do what I need it to as it's a session by session sort of thing.
Is there some sort of $_ALL_SESSIONS? ;)
(Or, approaching the question from a different angle, are purely static objects loaded into memory each time you load the page with a standard Apache mod_php install?)
You are more or less looking for an equivalent of ASP/IIS's Application object in PHP. AFAIK there isn't one.
There is EG(persistent_list), a list of "objects" that are not (necessarily) removed after a request is served. It's used by functions like mysql_pconnect(), pg_pconnect(), ... But it is not directly accessible by script code.
memcached has already been mentioned. Can you please elaborate on "purely static objects"?
Maybe you could serialize it and store it in memcache? I don't know if that would be any faster though.
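If you try that route, a minimal sketch with the Memcached extension might look like this; the key name, TTL, and the stand-in object are assumptions, and the extension must of course be installed:
<?php
// cache an expensive-to-build object across requests in memcached
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$config = $mc->get('app_config_object');
if ($config === false) {
    $config = new ArrayObject(array('theme' => 'dark', 'per_page' => 25)); // stand-in for the expensive build
    $mc->set('app_config_object', $config, 300); // Memcached serializes the object for you
}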
Not by default, no. You'll have to use some workaround, whether it be a 3rd-party tool (memcached, a DBMS, etc.) or a built-in mechanism (sessions, serializing to a file, etc.). Whether it's faster than recreating the object for each request is up to you.
You could also write a PHP plugin for this. :) Or maybe there already is one. A quick google search revealed nothing but I didn't try very hard.
If you do decide on writing one yourself know that it's not as straightforward as it sounds. For example, webservers such as Apache spawn several child processes for handling requests in parallel. You'll have to be tricky to get data across to them. Not to mention proper locking (and lock breaking if a request hangs), handling of webserver clusters, etc.
What you can do is use the CLI version of PHP to write a 'daemon' app which persists across requests and maintains state etc., and then have a regular web-based script which can communicate with it via sockets or some other mechanism (here's one example)
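A bare-bones sketch of that pattern follows; the port number and the trivial request/response "protocol" are arbitrary choices for illustration, not a production design:
<?php
// daemon.php - run once with the CLI (php daemon.php); keeps state in memory across requests
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
$state  = array('hits' => 0); // the long-lived "object"

while ($conn = stream_socket_accept($server, -1)) { // -1: wait indefinitely for the next connection
    $state['hits']++;
    fwrite($conn, json_encode($state));
    fclose($conn);
}

// web.php - a normal web request asking the daemon for its state
$conn = stream_socket_client('tcp://127.0.0.1:9000');
echo stream_get_contents($conn);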
If the server is your own machine, then it should be possible to run a background process that handles the "global thing". You could communicate with it using SOAP.
You would only need to create a SOAP object.
That's the only way I see to really create a long-lived object for PHP. Everything else is just serialization. There might be a technology outside PHP for that purpose, though.
Honestly, I don't think your object is big and complicated enough that creating and populating it would take longer than making a SOAP call. But if creating this object requires lots of DB connections, it's plausible that my idea could help...

Performance considerations of JSON vs. XML

I am using a webservice which provides a large result set either in XML or JSON format.
Which format will be faster or better (performance-wise)? Also, which language should I use to parse the XML/JSON? Should I use PHP or JavaScript?
"PHP or JavaScript" sounds like an odd choice to offer: PHP is usually used as a server-side language, whereas JavaScript is usually used as a client-side language.
What's your situation? What makes you suggest those two languages in particular? If you could give more information about what you're trying to do, that would help a lot. (We don't know whether you're developing a web app, a batch processing tool, a GUI application, etc.)
I suspect JSON will be a bit more compact than XML, although if they're compressing the data you may well find they end up taking the same bandwidth (as a lot of the "bloat" of XML is easily compressible).
As ever, the best way to find out is to test the specific web service with some realistic data. Generalities aren't a good basis for decision-making.
Both have their advantages:
JSON
easy to handle: $dataStructure = json_decode($serializedString); — done.
XML
partial data handling: if your result set is too big to be processed (parsed) at once, this may be the way to go (see the sketch after this answer). Note: SimpleXML is the easiest XML lib to work with, but it also parses the whole XML file at once, so in that case there's no benefit over JSON.
The question of which language to handle your result set with is a bit nonsensical. JavaScript is client-side*, PHP is server-side. So it depends on what you want to do with the result set.
You can pass the result directly on to the browser/JS without doing anything on the server side, and let the client do the filtering and rendering. This may make sense in certain situations, but normally it's not what you want.
My advice: if possible, use JSON.
* You can use JavaScript on the server side (Rhino, v8cgi, ...), but that's not what you have in mind.
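To make the "partial data handling" point concrete, here is a small sketch contrasting the two approaches; the file names and the <item> element are assumptions for the example:
<?php
// JSON: one call, entire result set decoded into memory at once
$data = json_decode(file_get_contents('result.json'), true);

// XML: stream element by element with XMLReader, never holding the whole document
$reader = new XMLReader();
$reader->open('result.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        $item = simplexml_load_string($reader->readOuterXml());
        // ... process one <item> at a time ...
    }
}
$reader->close();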
I would go for JSON; you're not paying the "angle bracket tax". The choice between PHP and JavaScript comes down to the amount of processing required on the data (I'm taking a leap here).
Lots of processing: use PHP, so it's done server-side. Little processing: use JavaScript for a more responsive page (load the data via Ajax).
Although performance aspects really vary a lot between language/tool combinations, in the end XML and JSON tend to have similar performance characteristics when using the best tools of the platform. That is, you won't find one twice as fast as the other or more; the theoretical limits are similar for textual formats. Both are "good enough" in this regard for almost any use case.
So I would focus more on tool support, for the task you have. Other than that, format choice is unlikely to be the most important aspect to consider.
And like Jon mentioned, comparison of PHP and Javascript really sounds odd... apples and oranges or so.
One thing that has perhaps been missed is that a JavaScript client does not have to parse JSON, so you will get a performance win there. The browser will parse the XML into a DOM and hand this to your callback, you then need to extract from that DOM the info you need. With JSON you get back an object and the need to extract from the DOM is gone.
I think you'll have to measure it yourself. The performance will depend on:
the size of the data
the complexity of the data
its format (JSON or XML)
So you can see there are a number of variables.
Does the web service that you're using take longer to assemble the data in one format vs. another?
There are a sizable number of options for parsing JSON and XML, so don't restrict yourself to PHP or Javascript (if you have a choice). And finally, if you're requesting from a webservice, you'll have the overhead of network transport costs, connection setup etc. So any time savings in parsing performance may be negligible. Your efforts may be better spent elsewhere.
I don't think I've answered your question, other than give you more things to think about!
You should use python, preferably :)
As for format, I guess JSON would be a bit faster, but it depends on the details, and this task would be network-bound anyway.
If you are building an Ajax application, then you should choose JavaScript to process the data on the client side, which reduces the work PHP does on the server and makes the server side more efficient. You can store your result set in a JSON file and then fetch it on the client side with JavaScript; this way you don't consume resources on the server, because the data is processed on the client. Here I would prefer JSON over XML, because XML takes more space to store due to its tag syntax, while JSON maps directly to a JavaScript object and is faster to work with. The same goes on the server side: you can easily convert a JSON string into a PHP array (just refer to the json_decode function in PHP).
Nowadays JSON is in fashion because it is easy to use and fast.
For better performance, reduce the data processing on the server side and use client-side resources instead; this approach will give your application speed and cost-effectiveness.

Adding some custom session variables to a JavaScript object

I currently have a custom session handler class which simply builds on PHP's session functionality (and ties in some MySQL tables).
I have a wide variety of session variables that best suit my application (kept primarily on the server side). I am also using jQuery to improve the usability of the front end, and I was wondering if feeding some of the session variables (some basics and some browse-preference IDs) into a JS object would be a bad way to go.
Currently, if I need to access any of this information at the front end, I make an Ajax request to a PHP page written specifically to provide the appropriate response, although I am unsure if this is best practice (actually I'm pretty sure this just creates an excess number of Ajax requests).
Has anyone got any comments on this? Would this be the best way to have this sort of information available to the client side?
I really guess it depends on many factors. I always have "premature optimization ..." in the back of my head.
In earlier years I rushed every little idea that came to my mind into the app. That often led to "I made it cool, but I didn't take the time to fully grasp the problem I'm trying to solve; was there a problem anyway?"
Nowadays I use the obvious approach (like yours), which is fast (without sacrificing performance completely on the first try), and then analyze whether I'm running into problems or not.
In other words:
1. How often do you need to access this information from different kinds of loaded pages (because if you load the information once without the user reloading, there's probably not much point in re-fetching it anyway), multiplied by the number of concurrent clients?
2. If you write the information into a client-side cookie for fast JS access, can harm be done to your application if it is abused (modified without the application's consent)? Replace "JS" and "cookie" with any kind of offline storage like the WHATWG proposes, if #1 applies.
The "fast" approach suits me, because often there's no big investment in prior-development research. If you've done that carefully... but then you would probably know the answer already ;)
3. You could always push the HTML to your client with the data you need already included in JS; maybe that can work in your case (a sketch follows below). It will be interesting to see what other suggestions come!
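A minimal sketch of that third option, rendering selected session values into the page as a JS object at response time; the session keys and the decision about what is safe to expose are assumptions for the example:
<?php
// page.php - embed selected session values into the rendered page as a JS object
session_start();
$clientSafe = array(
    'displayName' => isset($_SESSION['display_name']) ? $_SESSION['display_name'] : null,
    'prefsId'     => isset($_SESSION['prefs_id'])     ? $_SESSION['prefs_id']     : null,
);
?>
<script>
var sessionData = <?php echo json_encode($clientSafe, JSON_HEX_TAG); // escape < and > for inline embedding ?>;
</script>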
As a side note: I had PHP sessions stored in the DB too, until I moved them over to memcached (alert: it's a cache and not a persistent store, so it may not be a good idea for your case; I can live with it, I just make sure it's always running) and saw an average 20% drop in database queries and, through that, a 90% drop in write queries. And I wasn't even using any fancy Ajax yet; that was just from the number of concurrent users.
I would say that's definitely overkill for Ajax. Are these sessions private, or is it important not to show them to a visitor? Just to throw it out there: a cookie is the easiest when it comes to both. Having the data in a JavaScript object makes it just as easily readable by a visitor, and when it comes down to cookies being enabled or not, without cookies you wouldn't have sessions anyway.
http://www.quirksmode.org/js/cookies.html is a good source about cookie handling in JS and includes two functions for reading and writing cookies.
