PHP: fastest way to save form data without any processing - php

I am accepting rather complex form data from clients (web and mobile) to a PHP server. When I receive the data, I don't need to process it in any way; I just need to store it somewhere and let the client know that the form was successfully submitted. I batch process the data later.
What's the best way to quickly store the incoming form data?
Two functions seem relevant in this context:
serialize -- http://php.net/manual/en/function.serialize.php
var_export -- http://php.net/manual/en/function.var-export.php
Also, instead of sending this serialized data to a database, it would be a lot faster to append it to some text file, right?
Can someone give me more information about these things?

If you just need the text from the POST, you can read the plain POST body without any serialization:
echo file_get_contents('php://input');
Appending to a file is pretty fast. But remember that you will also need to manage this file.
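For illustration, a minimal sketch of that approach; the log filename is just an example, and FILE_APPEND with LOCK_EX is used so concurrent requests don't interleave their writes:
<?php
// Grab the raw request body exactly as the client sent it.
$raw = file_get_contents('php://input');

// Append one submission per line ('submissions.log' is a made-up path).
// LOCK_EX serializes concurrent appends from parallel requests; this
// assumes the body has no newlines, which holds for url-encoded forms.
file_put_contents(__DIR__ . '/submissions.log', $raw . PHP_EOL, FILE_APPEND | LOCK_EX);

echo 'OK';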
P.S. It looks like premature optimization.

I would consider
base64_encode(serialize($_POST));
// assuming $array is the returned MySQL column (BLOB type)
unserialize(base64_decode($array));
Please note that base64_encode is optional and depends on the data stored, which may or may not need to be encoded.
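As a sketch of how the round trip might look with PDO (the table and column names here are made up for illustration):
<?php
// Hypothetical table: CREATE TABLE submissions (id INT AUTO_INCREMENT PRIMARY KEY, payload BLOB)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Store the submitted form as an opaque blob.
$stmt = $pdo->prepare('INSERT INTO submissions (payload) VALUES (?)');
$stmt->execute(array(base64_encode(serialize($_POST))));

// Later, during batch processing, reverse the encoding.
$row  = $pdo->query('SELECT payload FROM submissions LIMIT 1')->fetch(PDO::FETCH_ASSOC);
$form = unserialize(base64_decode($row['payload']));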

The fastest way to save the data is not to use PHP, but to write a C extension for whatever web server you're using that simply saves the data. But fastest is not best. Unless you're talking about trillions of requests per hour, you're better off doing at least some processing inline and storing the submissions in a relational DB or object DB. I would personally go with MySQL or Mongo since I have some experience with them, but many other datastores could be appropriate.
Trying to outsmart databases by "simply appending to a file" will likely get you into a lot of trouble. Databases already have a lot of support for concurrent operations, optimizing writes, caching, transactional support, etc... If you "append" to a file from multiple web requests concurrently, you'll end up with corrupt data.
I know it's a bit of a non-answer, but it's been my experience that the type of question you asked is misguided at best. If you truly, absolutely need to do something "the fastest way possible", you're likely not going to need to ask about it on Stack Overflow, because you'd already be working in a cutting-edge environment with a decade-plus of experience creating such implementations.

Related

What is the best way to pass data from one cli app to another

For example, how do I pass data from one CLI application, say a Ruby script, to a PHP script? I mean raw data: text, binary, etc. Arguments fall out because it is raw data; using temp files also seems kind of lame. The last thing I know that would work is stdin/stdout. Maybe someone knows something better?
It depends on how those programs are executed (in relation to one another). If one stops and starts the other, stdin/stdout or even dumping to a file might do the trick (it may be considered lame, but it does the job!). If there is a lot of data, you might even consider creating a small table in a database like SQLite or something. It also depends on how volatile the data passed between the programs needs to be.
If there is a parent/child process relation, i.e. the parent starts a child process, you might consider a pipe/FIFO, shared memory, a message queue, or any other form of IPC.
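If you go the FIFO route, a minimal PHP reader might look like this (the pipe path is an assumption, and the POSIX extension must be available; the writer could be a Ruby script opening the same path):
<?php
$fifo = '/tmp/app.fifo'; // example path

// Create the named pipe once if it does not exist yet.
if (!file_exists($fifo)) {
    posix_mkfifo($fifo, 0600);
}

// Blocks until a writer opens the other end.
$fh = fopen($fifo, 'r');
while (($line = fgets($fh)) !== false) {
    echo 'received: ' . $line; // each line is one record from the producer
}
fclose($fh);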
What OS are you on? That determines what is available to you somewhat. You should use what is common for the platform. On *nix systems we use pipes (|), sockets and data files, depending on the application.
There's really no one way of doing it, it just depends on the type of application, and the type of data.
If you are moving columns of text or CSV data you could use pipes or text files.
If you are moving hashes and objects between Ruby apps, use JSON or Marshal the data and use a pipe, socket or a file.
If you are moving data between languages, use JSON, XML or YAML and one of those data paths.
We regularly use JSON for our inter-application data format, and write our code to allow it to emit JSON when we set a command-line flag. That allows us to easily wrap command-line apps with a little REST service and call them remotely, throwing their results around the web.
All that said, there isn't a hard and fast rule saying how anything is done, just be consistent with your code and follow the style of the apps you are working with.
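As an illustration of the stdin/stdout-plus-JSON route described above, a minimal PHP consumer might look like this (the script names in the comment are hypothetical):
<?php
// e.g. run as: ruby export.rb | php import.php
$raw  = stream_get_contents(STDIN);
$data = json_decode($raw, true);

if ($data === null && json_last_error() !== JSON_ERROR_NONE) {
    fwrite(STDERR, 'Invalid JSON: ' . json_last_error_msg() . PHP_EOL);
    exit(1);
}

// ... work with $data here ...
var_dump($data);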

MySQL or flat file or PHP GET request for JSON

A question about best practices / scaling of a web application.
What would be the best way to access application settings that all users rely on and cannot modify? The data needs to be served as JSON; the application is written in PHP / JS.
I have three ways in mind:
Serve the JSON from a PHP GET request which calls a final method:
final function get_requestJSON()
{
    return json_encode(array(.... ));
}
Serve it as a flat file with the JSON written in it and use file_get_contents
Store it in text format in a MySQL backend and access it directly with a model.
Which one of those would be considered fastest?
Any thoughts?
Don't worry about speed in this case (unless you know that the wrong solution is unacceptably slow) and instead think of your requirements. This is especially true because if you are issuing server calls through a client, there's going to be a latency anyway that will likely dwarf the time for this operation.
Also, the scenarios are not mutually exclusive. You can store things in a database, and use JSON as your protocol for sending things back and forth (for example).
I recommend you instead think about all the things you can see doing with this information, both now and in the future, and let that guide you towards a decision.
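For what it's worth, a sketch of combining the approaches: keep the settings in MySQL, but cache the encoded JSON in a flat file so most requests never touch the database (the file path, DSN and table are all assumptions):
<?php
function get_settings_json()
{
    $cache = __DIR__ . '/settings.json'; // flat-file cache, example path

    if (is_readable($cache)) {
        return file_get_contents($cache);
    }

    // Hypothetical key/value settings table.
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $rows = $pdo->query('SELECT name, value FROM settings')->fetchAll(PDO::FETCH_KEY_PAIR);

    $json = json_encode($rows);
    file_put_contents($cache, $json, LOCK_EX);

    return $json;
}

header('Content-Type: application/json');
echo get_settings_json();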
Taking it all into consideration, a PHP library with methods is the best solution for me. Thanks for the info, though.

Should I use Curl or XML Parsing

I am about to write a script that links multiple databases across servers in PHP, and I am trying to decide the best tack to take. I like my DOMDocuments, and I already have multiple XML feeds coming from the database that the others can hook into. On the other hand, the database is fed by a form, and this could easily send its data across to the other databases on submit using cURL or even JSONP.
My personal feeling is to parse the XML on a daily basis, but the other approach would make the DB linkup more instantaneous.
What do you guys think?
I think it's better to use something like JSON, which has less overhead; with XML you have a lot of overhead.
Or use a socket and send the data directly over the connection with your own short protocol. I think XML is OK, but for linking servers, JSON or a socket is really better.
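If you do go with pushing the data on submit, a minimal cURL sketch for POSTing the new record as JSON to another server could look like this ($record and the URL are placeholders):
<?php
// $record is whatever you just saved from the form submission.
$payload = json_encode($record);

$ch = curl_init('https://other-server.example/receive.php'); // placeholder URL
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
if ($response === false) {
    error_log('Sync failed: ' . curl_error($ch));
}
curl_close($ch);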
I think we would definitely need some more information here before we can give you solid advice.
But I definitely lean toward the form route, of the two options you've outlined, for sure.
On the one hand, you said the xml feeds already exist. If they contain information other than just the changeset, then that balloons the amount of data you have to process on the receiving end.
On the other hand, you've already written them, so there may be less work involved. You have to write the "receiving" code either way.

Is it better to generate html for an ajax function in the JS handler or in the PHP ajax function?

I'm designing some UIs for a product and we want to utilize jQuery and PHP. The content to generate is a list of checkboxes (10-100) that a user will be modifying (removing multiple at a time and changing the entries). I thought I'd try something new and ask StackOverflow (read: you) what is preferred: generate the HTML in the PHP call and return it, or return JSON data that jQuery can then use to generate the HTML checkboxes?
I appreciate the feedback! My preferred method so far is to let PHP generate the HTML, because it knows more about the data at the time of modification (it's interacting with the database and could build the HTML easily enough without having to pass back IDs, names, etc. in JSON).
Thanks!
[Edit] Bandwidth is not a constraint. This is an internal intranet application. The information needing to be printed to the user will not require DOM modification after the fact (outside of checkboxes, but that's built into the browser...). Some good points have been made about the amount of data that's being passed back, though:
passing back
    Label
vs.
    { "Label": "Unique_ID" }
is obviously a lot of redundancy.
There's really no right/wrong way to do this. Passing JSON back, and then using client-side processing to turn that into HTML, uses less bandwidth but more local processing power. Passing HTML back uses more bandwidth and less local processing (these are seriously minor points; only if you're talking about extremely popular or frequently changing sites might it even be relevant).
Return Flexibility - HTML
One of the benefits of passing HTML is that you can return anything: if the request causes an error, or could generate different types of data, you just return different HTML. If you're returning JSON, the parsing script has to deal with these alternate structures (i.e. error handling and/or multiple structure-parsing algorithms).
Local Processing - JSON
If you're localizing, sorting, or framing the data from the user's point of view, it may well be simpler to return JSON and then use client side scripts to interpret. For example when user=2, reporting "You" instead of "Mike" might be a nice personalization touch. You could do this server side, but now the script needs to take that into account, so the same query needs to return different data based on context (again not impossible). You can keep your server code more generic by using client side scripts to perform this.
Local Presenting - JSON
Perhaps a single command collects the data, but there are multiple parts of the page that should be updated with what's returned. With an HTML approach, you either need separate queries, or some sort of delimiter in your return (with escapes!) plus a local processing script to decide what goes where... with a JSON approach, the local processing script can update all the locations from the same single source as it's retrieved.
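For reference, the JSON route could be as small as this on the PHP side (the table and column names are assumptions); the client-side script then loops over the array and builds the checkbox inputs:
<?php
// Hypothetical table holding the checkbox entries.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT id, label FROM entries')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows); // e.g. [{"id":"1","label":"First"}, ...]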
You could approach the question both from the aspect of server burden and in terms of client performance.
If your server is having to dynamically generate the HTML output for every user, it will endure a somewhat higher burden than if you delegated the content-generation to client-side JavaScript. Clients have abundant computing power at their disposal, so feel free to have them collectively shoulder the burden rather than having your server do all the work (which could easily add up, depending on how busy your server is).
Likewise, generating the HTML markup on the server results in a significantly larger page download for the client. The markup for a hundred check-boxes could easily add kilobytes to the size of the page, while the data itself--which is all you would send using the JSON approach--is much smaller. Of course, larger page downloads mean longer download times for the client. We as web developers often forget that there are still quite a few people who have dial-up internet connections.
For these reasons, I would personally opt for sending the data via JSON and doing DOM-modification via JavaScript.
Cheers,
Josh
The answer is: it depends. If you are going to be doing DOM manipulation on the new data, then you pretty much have to append the elements using jQuery. If there is no such manipulation needed, then you can just print it out with php and add the blob.
I think that the latter is much easier and simpler, so if you don't need to do DOM manipulation on the elements, you can just add the html blob from php.

Should XML be used server-side, and JSON client-side?

As a personal project, I'm making an AJAX chatroom application using XML as server-side storage and JSON for client-side processing.
Here's how it works:
AJAX Request gets sent to PHP using GET (chat messages/logins/logouts)
PHP fetches/modifies the XML file on the server
PHP encodes the XML into JSON and sends back the JSON response (sketched just after this list)
Javascript handles JSON information (chat messages/logins/logouts)
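A minimal sketch of step 3, assuming the chat log is a simple <messages><message>...</message></messages> XML file (the file name and element names are assumptions):
<?php
// Load the stored XML and hand it to the client as JSON.
$xml = simplexml_load_file(__DIR__ . '/chatroom.xml'); // example path

$messages = array();
foreach ($xml->message as $msg) {
    $messages[] = array(
        'event' => (int) $msg['event'],   // hypothetical "event id" attribute
        'user'  => (string) $msg->user,
        'text'  => (string) $msg->text,
    );
}

header('Content-Type: application/json');
echo json_encode($messages);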
I want to eventually make this a larger-scale chatroom application. Therefore, I want to make sure it's fast and efficient.
Was this a bad design choice? In this case, is switching between XML and JSON ok, or is there a better way?
EDIT:
Two mechanisms prevent a big server load when fetching information from the server:
An "event id" is assigned to each message/login/logout, so all that's sent back to the user is the events his client hasn't yet processed.
When an XML file becomes too big, a new one is created.
As far as I am concerned, JSON is always a good choice for async data transfer, because it is not as bloated as XML. I'd choose the latter only if I want the data to be human readable, e.g. config files.
--- Edited:
And remember: serializing/deserializing XML is a performance issue and not particularly convenient for persisting web application data with high-frequency access, while, as mentioned, using XML for config files is IMO best practice.
Both XML and JSON are good for inter-applications communication. In Javascript, JSON is much easier than XML to deal with, so that's what I'd recommend.
As for storage... both are terrible as large datastores I'm afraid. MySQL would work better than editing a file, but it's still not an appropriate solution for a chat, especially if you're on a shared host. You may want to take a look at SQLite, perhaps creating one file per chat room.
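If you try the SQLite route, a minimal sketch with PDO might look like this (one database file per room; the schema and request parameters are assumptions):
<?php
// One SQLite file per chat room, e.g. rooms/lobby.sqlite (the directory must exist).
$db = new PDO('sqlite:' . __DIR__ . '/rooms/lobby.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS messages (
    id   INTEGER PRIMARY KEY AUTOINCREMENT,
    user TEXT,
    text TEXT,
    ts   INTEGER
)');

// Store a new chat message.
$stmt = $db->prepare('INSERT INTO messages (user, text, ts) VALUES (?, ?, ?)');
$stmt->execute(array($_GET['user'], $_GET['msg'], time()));

// Return everything newer than the last event id the client has seen.
$since = (int) $_GET['since'];
$rows  = $db->query('SELECT * FROM messages WHERE id > ' . $since)->fetchAll(PDO::FETCH_ASSOC);
echo json_encode($rows);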
