What is the best software design to use in this scenario? (PHP)

I need to generate HTML snippets using jQuery. The creation of those snippets depends on some data. The data is stored server-side, in session (where PHP is used).
At the moment I achieve this by:
- retrieving the data from the server via AJAX, as JSON
- building the snippets with specific JavaScript functions that read that data
The problem is that the data is getting more complex, so the serialization into JSON is getting harder, because I can't do it automatically: some of the information is sensitive, so I generate a "stripped" version to send to the client.
I know it is difficult to understand without any code to read, but I am hoping this is a common scenario and would be glad for any tip, suggestion or even design-pattern you can give me.
Should I store both a complete and a stripped version of the data on the server, and then use some library to automatically generate the JSON from the stripped version? But that also means I have to keep the two in sync.
Or maybe I could move the snippet-building logic server-side, avoiding sending the data altogether. But that means sending JavaScript code to the client (since I rely on jQuery). Maybe not a good idea.
Feel free to ask me for more details if this is not clear.
Thank you for any help

There are several JavaScript/jQuery templating solutions available. John Resig is working on one that's likely to become a popular jQuery add-on, if not part of the core distribution. Kyle Simpson is also working on one.
I googled for a reference to it, but really I'd suggest doing your own searching, because there's a lot of good information out there.
Edit: here's a pretty good link: http://www.west-wind.com/Weblog/posts/509108.aspx

You can use PHP's json_encode and json_decode functions to convert native PHP values to and from their JSON representation.
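For illustration, a minimal sketch of the whitelisting idea, under the assumption that the session holds a hypothetical $_SESSION['order'] array; the point is that only the fields that are safe to expose get copied into the structure passed to json_encode:

// Full data as stored in the session (hypothetical structure).
$full = $_SESSION['order'];

// Whitelist only the fields the client is allowed to see.
$stripped = array(
    'id'    => $full['id'],
    'title' => $full['title'],
    'items' => array_map(function ($item) {
        return array(
            'name'     => $item['name'],
            'quantity' => $item['quantity'],
            // price, supplier, etc. are deliberately left out
        );
    }, $full['items']),
);

header('Content-Type: application/json');
echo json_encode($stripped);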

XML as storage for API/ Store data from form in XML-file

I have researched APIs and I generally understand how they work and how to use them (make an HTTP request to the API, get the data and parse it, etc.). However, for my project I need to use data that I collected myself, so I can't just use another database, for example. I'm quite new to this, so I don't understand a few things. I'll try to explain my plan as clearly as possible. Please let me know if any additional explanation is required.
I have an HTML form which can be filled out and saved. This form is not supposed to be local, but rather live on a server.
I have read a lot about XML files and APIs, and I also saw many similar questions on here, but I'm not sure what applies in my case.
I want to store the information from the form in an XML file. Some people said this could be done with JavaScript, some said it would require a server-side script. What applies in this case? I would guess that I need a server-side script, but as I said, I'm kind of at a loss here.
I thought I could simply use JavaScript to capture whatever is entered into the form and use Python or PHP to create an XML file in which I store this information. This XML file would then be used by the API. This is where I have trouble understanding.
The edited form is supposed to be saved (on the server, I guess, so several people can access it), so you can go back to it later and edit it again. How exactly would I implement an API here? Can I just "make" my own XML file which the API uses as a database? Is there any better way to do this?
I know this probably seems like a stupid question, but I really want to understand this, so bear with me. I'm very overwhelmed by this task, so I appreciate any help.
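Not an answer, just a minimal sketch of the server-side part described above, assuming PHP, a hypothetical forms.xml file next to the script, and hypothetical field names; the browser submits the form (normally or via AJAX), and a script like this appends the entry:

// save_form.php - appends one submitted form to an XML file (hypothetical fields)
$file = __DIR__ . '/forms.xml';

$doc = new DOMDocument('1.0', 'UTF-8');
$doc->formatOutput = true;

if (file_exists($file)) {
    $doc->load($file);                      // reuse the existing file
    $root = $doc->documentElement;
} else {
    $root = $doc->createElement('entries'); // or start a new one
    $doc->appendChild($root);
}

$entry = $doc->createElement('entry');
$entry->appendChild($doc->createElement('name', htmlspecialchars($_POST['name'])));
$entry->appendChild($doc->createElement('comment', htmlspecialchars($_POST['comment'])));
$root->appendChild($entry);

$doc->save($file);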

How to send big data from Delphi to a remote PHP script and wait for an answer?

I am at risk of being heavily criticized for this question, because it might have been answered here already, but I do not know how to search for it properly. I tried to get an answer in forums, but people only bother to show up and say "I don't know".
I usually use TIdHTTP to call a remote PHP script and receive some data, but only when I need to communicate with a REST server, passing the data directly as a parameter.
Now I need to send a big JSON object (encoded, and much more than 255 bytes) and I do not know how to do it in Delphi. I know it should go through the POST method, but how do I send it from Delphi? And how do I receive it in PHP, via $_REQUEST[]?
I also need a way to do it and then wait for an answer from the server, in the form of another JSON object, encoded of course. It should be very simple. I have an online DB and I want to exchange some encrypted data.
Thank you for your help!
Delphi doesn't have a built-in library for that purpose, so you will have to use one of the REST client libraries that can be found on the web.
This one should suit your needs nicely; here is an easy example showing how to use it.
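For the PHP side, a minimal sketch assuming the Delphi client sends the JSON as the raw body of the POST request (if it were sent as a regular form field, it would show up in $_POST instead); the script reads php://input, decodes it, and answers with another JSON object:

// receive.php - reads a JSON request body and answers with JSON
$raw  = file_get_contents('php://input');   // the raw POST body, not limited to 255 bytes
$data = json_decode($raw, true);            // true => associative array

if ($data === null) {
    http_response_code(400);
    echo json_encode(array('error' => 'invalid JSON'));
    exit;
}

// ... work with $data here, e.g. query the database ...

header('Content-Type: application/json');
echo json_encode(array('status' => 'ok', 'received' => count($data)));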

MySQL or flat file or PHP GET request for JSON

A question about best practices / scaling of a web application.
What would be the best way to access application settings that all users rely on and cannot modify? The data needs to be served as JSON; the application is written in PHP/JS.
I have 3 ways in mind:
Serve the JSON from a PHP GET request which calls a final method:
final public function get_requestJSON()
{
    return json_encode(array( /* ... */ ));
}
Serve it as a flat file with the JSON written in it and use file_get_contents.
Store it in text format in a MySQL backend and access it directly with a model.
Which one of these would be considered fastest?
Any thoughts?
Don't worry about speed in this case (unless you know that the wrong solution is unacceptably slow) and instead think about your requirements. This is especially true because if you are issuing server calls from a client, there's going to be latency anyway that will likely dwarf the time for this operation.
Also, the scenarios are not mutually exclusive. You can store things in a database and use JSON as your protocol for sending things back and forth (for example).
I recommend you instead think about all the things you can see yourself doing with this information, both now and in the future, and let that guide you towards a decision.
Taking it all into consideration, a PHP library with methods is the best solution for me. Thanks for the info, though.
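For illustration, a minimal sketch that combines option 2 with a method, since the answer above notes the options are not mutually exclusive; it assumes a hypothetical settings.json file next to the script:

class Settings
{
    // Returns the shared, read-only settings as a JSON string.
    final public function get_requestJSON()
    {
        // The flat file keeps the settings out of the database and out of PHP code.
        $json = file_get_contents(__DIR__ . '/settings.json');
        return $json !== false ? $json : json_encode(array());
    }
}

header('Content-Type: application/json');
$settings = new Settings();
echo $settings->get_requestJSON();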

Using POST for all functions in REST protocol

Is it wrong to use POST for everything in REST? I know POST is for updating and creating, but what if I use it for SELECTs and DELETEs as well? What I basically need is a web service that can do CRUD operations against a database. I'm using PHP. Also, what if I use query strings to make the POST requests instead of using JSON or XML?
If you're using POST for all operations, you can't really call it REST anymore.
So it's not wrong, but it's not REST either :)
Yes, it's wrong. If you are using POST for anything other than updating, you aren't actually following the REST convention.
You can eat soup with a fork, but why would you? Anyway, you can use any method, just don't call it REST.
Is it wrong to use POST for everything in REST?
You can't use POST for everything in REST (it applies only to requests, not responses), so the question of right or wrong isn't really valid.
I know POST is for updating and creating but what if I use it for SELECTS and DELETS as well?
It normally just works.
What I basically need is a web service that can do CRUD operations to a database.
That's fine.
Im using PHP.
That's fine, too. PHP has support for the HEAD, GET and POST methods. For the PUT and DELETE methods you need to do a little bit more coding.
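For the PUT and DELETE part, a minimal sketch of what that extra coding typically looks like, assuming a form-encoded PUT body; PHP does not parse PUT bodies into a superglobal, so you read php://input yourself:

switch ($_SERVER['REQUEST_METHOD']) {
    case 'GET':
        // read: parameters arrive in $_GET
        break;
    case 'POST':
        // create/update: parameters arrive in $_POST
        break;
    case 'PUT':
        // PHP does not populate a superglobal for PUT bodies,
        // so read and parse the raw body yourself.
        parse_str(file_get_contents('php://input'), $put);
        break;
    case 'DELETE':
        // usually the resource id comes from the URL, e.g. $_GET['id']
        break;
    default:
        header('Allow: GET, POST, PUT, DELETE', true, 405);
        break;
}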
Also, what if I use query strings to do the POST request instead of using JSON or XML?
You would use query strings instead of JSON/XML request bodies. REST does not specify the payload format, so you can do whatever pleases you. Using query strings might be a good idea with PHP, as PHP supports them out of the box. But it depends on what you use as a handler and what you want to support.
As your questions are very broad, you might want to read a basic description of REST first so you can ask more specific questions: A Brief Introduction to REST.
It's wrong to use POST for calls that are meant to be repeatable.
POST calls are never cached, but your select calls should be cacheable (and repeatable), so you shouldn't be using POST for them.
There's no technical reason why you can't force everything through POST (particularly as DELETE isn't supported across all browsers), but it does mean that caching won't work for any of your calls, and also that your users may not be able to refresh a page without being asked to confirm resubmission if POST calls were made to create it.
The 4th article in this REST tutorial series says:
For creation, updating, and deleting data, use POST requests. (POST can also be used for read-only queries, as noted above, when complex parameters are required.)
Of course GET is usually used for read-only queries, so putting it all together: if complexity is the default assumption (a handy assumption when no threshold between complexity and simplicity is stated), then this author, who is enough of an authority to write a whole series on REST, is saying that it's okay to POST everything. Caches may make assumptions about responses to the various HTTP verbs that could be beneficial, but if your application does not benefit from caching (indeed, it might be harmed by it, so disable it or otherwise make sure your application does not suffer from it), then I don't see any problem at all.

Should I use cURL or XML parsing?

I am about to write a script in PHP that links multiple databases across servers, and I am trying to decide the best tack to take. I like my DOMDocuments, and I already have multiple XML feeds coming from the database that the others can hook into. On the other hand, the database is fed by a form, and on submit this could easily send its data across to the other databases using cURL or even JSONP.
My personal feeling is to parse the XML on a daily basis, but the other approach would make the DB link-up more immediate.
What do you guys think?
I think it's better to use something like JSON, which has less overhead. With XML you have a lot of overhead.
Or you could use a socket and send the data directly over a connection with your own short protocol. I think XML is OK, but for linking servers, JSON or a socket is really better.
I think we would definitely need some more information here before we can give you solid advice.
But of the two options you've outlined, I definitely lean toward the form route.
On the one hand, you said the XML feeds already exist. If they contain information other than just the changeset, then that balloons the amount of data you have to process on the receiving end.
On the other hand, you've already written them, so there may be less work involved. You have to write the "receiving" code either way.
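If the push-on-submit route wins, a minimal sketch of sending the form data as JSON with PHP's cURL extension, assuming a hypothetical receiving URL on the other server:

// Push the submitted form data to another server as JSON (hypothetical URL).
$payload = json_encode($_POST);

$ch = curl_init('https://other-server.example.com/receive.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // capture the response instead of printing it

$response = curl_exec($ch);
if ($response === false) {
    error_log('Push failed: ' . curl_error($ch));
}
curl_close($ch);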
