I was wondering what the best way to store JSON/XML responses would be. I'm currently building an application that relies heavily on the SoundCloud API to fetch songs, playlists, etc.
Here are some ideas I've come up with.
Storing the results in a relational database and then using PHP to convert them to classes for easy use throughout my application.
Doing the above, only this time using my framework's built-in ORM.
Using a document-oriented database (e.g. MongoDB, CouchDB, ...).
Storing the JSON responses in a cache. (using my framework's cache classes)
Would anyone care to shed some light on the advantages/disadvantages of these methods?
Which one do you prefer?
If you have a solid schema that you don't think will change, you might want to use a relational database. You will need to parse the JSON response into objects, and your framework can then persist them to the database.
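For the relational route, a minimal sketch of that parse-and-persist step might look like this (the Track class, table name, and credentials are all invented for illustration):

```php
<?php
// Hypothetical sketch: map a SoundCloud track response onto a PHP class,
// then persist it with PDO. Class, table, and fields are assumptions.
class Track
{
    public $id;
    public $title;
    public $duration;

    public static function fromJson($json)
    {
        $data = json_decode($json, true);
        $track = new self();
        $track->id       = $data['id'];
        $track->title    = $data['title'];
        $track->duration = $data['duration'];
        return $track;
    }
}

$track = Track::fromJson($response); // $response holds the raw API body

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO tracks (id, title, duration) VALUES (?, ?, ?)');
$stmt->execute(array($track->id, $track->title, $track->duration));
```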
If you think your schema will change, use NoSQL.
It also depends on what you will do with this data. Are you going to search nodes within the JSON?
You can also do object-to-Mongo mapping: either parse the JSON and store it as an object, or store the JSON as-is.
A nice thing about NoSQL document stores is that they support JSON well; they use BSON (Binary JSON) under the hood.
As for a cache: in my experience it should be used only for lookups; you can't really search a cache. It's just for getting objects faster than going to the database for them.
Take a look at this:
http://www.mongodb.org/display/DOCS/Inserting#Inserting-JSON
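For illustration, here is a rough sketch of the store-the-JSON-as-is approach using the legacy PHP Mongo extension (assumed here; database and collection names are made up):

```php
<?php
// Decode the API response to an array and insert it as a document as-is.
$mongo      = new Mongo();                 // defaults to localhost:27017
$collection = $mongo->soundcloud->tracks;  // db "soundcloud", collection "tracks" (invented)

$doc = json_decode($json, true);           // $json is the raw API response
$collection->insert($doc);                 // stored internally as BSON

// Later you can query on any field inside the document:
$track = $collection->findOne(array('title' => 'Some Song'));
```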
If you can tolerate hosting your music and playlist data on Google's AppEngine, Ubud-db might be something for you: https://bitbucket.org/f94os/ubud-db/wiki
Ubud-db is a document store on AppEngine with a REST-JSON API. Spring/Jackson maps the JSON to a Map, and Ubud's service then maps the Map to an Entity, which is persisted by the Datastore.
The REST-JSON API makes it easy to integrate with a website using AJAX to access and display dynamic data.
If I need to keep the data longer than a cache provider allows, I would store it in a database as-is and just json_decode it when I retrieve it from the DB. If it's just temporary storage, a cache is a great idea; still leave it encoded as JSON to reduce the size.
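A minimal sketch of that store-as-is approach, assuming a hypothetical api_responses table:

```php
<?php
// Store the raw JSON untouched; decode only on the way out.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Store the response body as-is:
$stmt = $pdo->prepare('INSERT INTO api_responses (endpoint, body) VALUES (?, ?)');
$stmt->execute(array('/tracks/123', $json));

// Decode only when you read it back:
$stmt = $pdo->prepare('SELECT body FROM api_responses WHERE endpoint = ?');
$stmt->execute(array('/tracks/123'));
$track = json_decode($stmt->fetchColumn());
```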
Related
I have a PHP project which provides an API endpoint exposing JSON data. Part of that data consists of serialized strings generated in PHP.
In a Node.js application I can unserialize that data using the php-unserialize package. However, I am a complete beginner at Angular and can't find a similar package for AngularJS.
I would appreciate some direction on how to unserialize such a string into JSON data in AngularJS,
and on where I can find AngularJS packages.
The AngularJS framework does not have a service that decodes arrays and objects serialized by PHP's serialize().
Instead, encode the data with PHP's json_encode().
I strongly recommend that PHP APIs avoid using serialize() to communicate data.
From Anonymous:
Please! Please! Please! DO NOT serialize data and place it into your database. serialize() can be used that way, but that misses the point of a relational database and the datatypes inherent in your database engine. Doing this makes the data in your database non-portable and difficult to read, and it can complicate queries. If you want your application to be portable to other languages (say you find you want to use Java for some portion of your app where Java makes sense), serialization will become a pain in the buttocks. You should always be able to query and modify data in the database without using a third-party intermediary tool to manipulate the data to be inserted.
I've encountered this too many times in my career. It makes for difficult-to-maintain code, code with portability issues, and data that is more difficult to migrate to other RDBMS systems, new schemas, etc. It also has the added disadvantage of making it messy to search your database based on one of the fields you've serialized.
That's not to say serialize() is useless. It's not... A good place to use it may be a cache file that contains the result of a data-intensive operation, for instance. There are tons of other uses... Just don't abuse serialize(), because the next person who comes along will have a maintenance or migration nightmare.
The JSON standard is well supported by most languages and widely used for data interchange on the Web. The AngularJS $http service handles it automatically by default.
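As a sketch of the recommendation above, a PHP endpoint that emits JSON instead of a serialized string might look like this (the payload is invented for illustration):

```php
<?php
// Emit JSON instead of serialize() output so any client,
// including AngularJS's $http, can decode it natively.
header('Content-Type: application/json');

$payload = array(
    'id'    => 42,
    'title' => 'Example track',
    'tags'  => array('electronic', 'remix'),
);

echo json_encode($payload);   // instead of: echo serialize($payload);
```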
I'm starting a new project that uses Couchbase (a NoSQL database that stores objects in JSON format) together with PHP.
The thing is, it would be really easy to work with both if I had something that maps JSON onto my own PHP classes (and vice versa).
Do you know any library for that?
One way to start is to look at (or use) the "Basement" library, available here:
https://github.com/Basement/Basement
This library uses json_decode/json_encode internally.
Hope that helps.
You may use our JSONmapper to map from JSON to your PHP classes.
It unfortunately does not support mapping back (yet).
Tug already mentioned Basement, which will provide that functionality of "models" in the near future like you know it from ORM systems.
Aside from that, mapping your plain old PHP objects to JSON is very easy, thanks to the nature of json_encode/decode. Since you can pass it an arbitrary object and it will store it as JSON, that's basically the only thing you need at hand. If you need more information about JSON and PHP, my blog post is a good start: http://nitschinger.at/Handling-JSON-like-a-boss-in-PHP
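To illustrate the round trip (the Playlist class here is just an example), json_encode/json_decode need very little glue; mapping back onto your class is the only part you write yourself:

```php
<?php
// json_encode() serializes an object's public properties directly.
class Playlist
{
    public $name;
    public $tracks;
}

$playlist = new Playlist();
$playlist->name   = 'Favorites';
$playlist->tracks = array(1, 2, 3);

$json = json_encode($playlist);   // {"name":"Favorites","tracks":[1,2,3]}

// json_decode() returns stdClass (or an array with true as second argument);
// mapping back onto Playlist is the part you write yourself:
$data = json_decode($json);
$restored = new Playlist();
$restored->name   = $data->name;
$restored->tracks = $data->tracks;
```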
If you use Basement, it makes it a little bit easier for you since it allows you to transform PHP types into JSON behind the scenes automatically (or write your own mapper if needed).
If you have a specific example that you want to build, let me know and I'd be happy to provide an example!
I write *SQL / PHP applications a lot, and I find myself rewriting JavaScript all the time to do the same stuff over and over. Usually, when the API I have to work with is very simple, it's not a big deal to write one-off AJAX methods to interact with PHP, which updates SQL tables via PDO.
But when it comes to big data objects sent from PHP to JavaScript that need to be parsed, edited, updated, sent back to PHP, and then updated by an application layer, I'm writing JavaScript all day long to handle these "big objects" and every. little. thing. that could be updated within them.
There has to be a better way. What is it?
What's the nature of the changes that cause you to rewrite large swaths of your frontend js code? Is it new data that you need to surface in the frontend, or is it changes to the structure of the data?
If it's new data and you want to stop updating your frontend code to deal with it, you would probably need to implement something that lets the JavaScript build out your frontend in a more dynamic way. I could see this working as a service that passes back a structured UI mapping in some data format your frontend JS could parse (you'd have to include all the data you need and probably some information about the format of that data, i.e. string, text, date, etc.).
That might work all right for some form data; that's basically how form objects work in MVC frameworks like CakePHP or even Drupal. In fact, if your goal is just to provide some user-facing content entry, you might be well off implementing this in one of those frameworks.
If the problem is that you're making changes to your structural data, but by and large you're surfacing the same data fields, you probably just need an abstraction of your front-end data models and your backend data models. You can come up with your javascript object definitions that define what the structured data you pass back should look like, define what your backend model looks like, and then define a mapping layer between the two. If the structure of the data changes in the backend, your contract between the javascript object definition and the mapping layer wouldn't need to change, and you could merely change the contract between the mapping layer and your backend data model layer.
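As a rough illustration of that mapping-layer idea in PHP (all class, field, and column names here are invented):

```php
<?php
// The frontend contract stays stable while backend fields can change.
class TrackMapper
{
    // backend column => frontend field
    private static $map = array(
        'track_title' => 'title',
        'length_ms'   => 'duration',
        'artist_name' => 'artist',
    );

    public static function toFrontend(array $row)
    {
        $out = array();
        foreach (self::$map as $backend => $frontend) {
            $out[$frontend] = isset($row[$backend]) ? $row[$backend] : null;
        }
        return $out;
    }
}

// If 'length_ms' is renamed in the schema, only $map changes;
// the JSON the frontend sees keeps the same shape.
echo json_encode(TrackMapper::toFrontend($dbRow));
```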
I have some data which I rarely have to update. Furthermore, I want that data to be very fast to access. What kind of solution do you recommend in Zend Framework? The options I've thought of are a MySQL database, some XML files, or writing the data directly into PHP arrays... Is there an ORM library I should use?
Since you're already using Zend Framework, why not use Zend_Config and store the data as INI/XML/JSON/YAML?
That's how Zend already stores your application settings. And if it's really not that much data, just store it in application.ini.
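A minimal sketch of that approach with ZF1's Zend_Config_Ini (the section and keys are examples):

```php
<?php
// Load a section of application.ini; APPLICATION_PATH is the usual ZF1 constant.
$config = new Zend_Config_Ini(
    APPLICATION_PATH . '/configs/application.ini',
    'production'
);

// Custom values live alongside your normal settings, e.g. in application.ini:
//   genres.rock = "Rock"
//   genres.jazz = "Jazz"
echo $config->genres->rock;   // "Rock"
```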
I'd say you can use whatever you want in your backend, but wrap it in Zend_Cache. That way you have some control over the refresh cycle while still getting convenient, fast access to the data.
Don't use an ORM if you're aiming at fast access. Use an ORM for easy development.
The fastest solution is probably storing the data in a plain PHP array, but it's not the best solution if you ask me.
What kind of data are we talking about? How often and when does it change?
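To make the Zend_Cache suggestion concrete, here is a rough sketch using the ZF1 API; the cache directory, key, and fetchFromBackend() helper are made up:

```php
<?php
// Wrap any backend behind a file-based Zend_Cache with a one-hour lifetime.
$cache = Zend_Cache::factory(
    'Core',
    'File',
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cache/')
);

if (($data = $cache->load('rarely_changing_data')) === false) {
    $data = fetchFromBackend();   // hypothetical expensive lookup
    $cache->save($data, 'rarely_changing_data');
}
```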
Store your data in a MySQL database but also index it using Zend_Search_Lucene.
Retrieving data from a Lucene index is pretty fast in my experience.
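A hedged sketch of indexing and querying with Zend_Search_Lucene (field names and paths are invented):

```php
<?php
// Build the index once from your MySQL rows:
$index = Zend_Search_Lucene::create('/data/track_index');

$doc = new Zend_Search_Lucene_Document();
$doc->addField(Zend_Search_Lucene_Field::Keyword('id', $row['id']));
$doc->addField(Zend_Search_Lucene_Field::Text('title', $row['title']));
$index->addDocument($doc);

// Fast lookups later:
$index = Zend_Search_Lucene::open('/data/track_index');
$hits  = $index->find('title:remix');
```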
My favorite option in these cases is to use Zend_Cache. If you want to optimize the response time even more, you can use the memcached library (http://memcached.org/), which can be used with Zend_Cache with little effort.
I am working on a project that uses PHP to create a "complex" object with lots of references to other objects linking back to their parent objects, etc.
The object structure is then serialized by the Zend AMF module and sent over to the flex application.
The problem is that serialization takes a lot of time (10+ seconds).
My question is thus: can anybody give me tips on how the serialization works and in what way I may be able to optimize the object structure in order to make serialization run faster?
Switching to JSON will help a great deal with this, as it allows easier caching.
APC will also help, just for the opcode-cache part, not for storing objects in memory.
How big is this object exactly? Could it be worth not sending the entire thing? If you're just dealing with recordsets, you might be able to fix it in the frontend by downloading only what the user can see, or will see in the near future.
The default serializer will iterate through every property, if a property is an object it will then iterate through each of those objects and their properties until it's done.
Since your object is complex, there's lots of crunching going on and many levels of objects that are being serialized.
As a point of optimization, you may wish to look into implementing the Serializable interface on your objects and serializing only the minimal amount of information that needs to be sent over the wire to your Flex app.
http://php.net/manual/en/class.serializable.php
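A small sketch of that idea; the class and its fields are hypothetical:

```php
<?php
// Trim the payload by controlling exactly what gets serialized.
class Track implements Serializable
{
    public $id;
    public $title;
    private $heavyCache;   // derived data we don't want on the wire

    public function serialize()
    {
        // Only ship what the Flex app actually needs.
        return serialize(array($this->id, $this->title));
    }

    public function unserialize($data)
    {
        list($this->id, $this->title) = unserialize($data);
    }
}
```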
When doing AMF serialization, or any serialization for that matter, it is usually better to work with smaller pieces of data if performance is a concern. That way you can work with individual objects as true ActionScript objects instead of just data placeholders. Smaller payloads are usually better for any type of RPC. You could use JSON instead, but then you'd lose the tight data binding you get from AMF. So try working with smaller packets of data across multiple HTTP requests.