php array vs database for static data - php

I have an associative array in PHP consisting of about 4k elements.
Product Id and Product Name
A sample row:
'434353', 'TeaCups'
So no big data. In fact the whole PHP array file is about 80 KB.
This is static data, so I won't be changing or deleting anything.
Considering the size of the array and the number of elements in it,
would it be better to access the data from the array, or should I
create a database instead?
The data might be read about 20k times a day.
PS: Each time the data is read, I will be fetching exactly one
element.

If this is static data, I recommend you store it as a JSON file that you can access from PHP with fopen().
However, if the data becomes much bigger, let's say 2 GB, or even just 200 MB, then unless you have a supercomputer you should use a database and query from there.
Note that databases are usually only worthwhile when you have a lot of information, or more information than you can comfortably process from a plain JSON file.
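For the scale in the question (about 4k rows, one lookup per read), a minimal sketch of the flat-file approach could look like this; the file name products.json and its layout are assumptions:

<?php
// products.json is assumed to look like: {"434353": "TeaCups", "434354": "Saucers", ...}
function getProductName(string $productId): ?string
{
    // Decoding the whole ~80 KB file per request is cheap; an APCu cache or a
    // plain PHP array include would make repeated lookups cheaper still.
    $products = json_decode(file_get_contents(__DIR__ . '/products.json'), true);

    return $products[$productId] ?? null;
}

echo getProductName('434353'); // "TeaCups"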

Related

Inserting a JSON object into a blank database

I have close to 120 JSON fields in objects.
A whole bunch of various versions of this sample JSON object
I really don't have time to map every field, nor do I care if the field title is called Array 0 or some other default.
I just want the data stuffed into a database, maintaining its structure in an indexable format.
JSON is a perfectly valid data structure, why not use it?
I did find this -> https://github.com/adamwulf/json-to-mysql and hypothetically it would do the job.
But the software is broken (no error, but nothing ever gets populated).
Other alternatives such as https://doctrine-couchdb.readthedocs.org/en/latest/reference/introduction.html involve mapping. I don't have time to map 120 fields, and nothing should require mapping; just use the existing structure.
What's an easy way to stuff large JSON objects into MySQL, auto-building the structure from an existing format?
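One way to do this, sketched here only as an assumption (it presumes MySQL 5.7+ with the native JSON column type; the table name docs and the sku generated column are hypothetical), is to store each document whole and add generated-column indexes only for the fields you actually query:

<?php
// Hypothetical schema:
// CREATE TABLE docs (
//     id  INT AUTO_INCREMENT PRIMARY KEY,
//     doc JSON NOT NULL,
//     sku VARCHAR(64) AS (doc->>'$.sku') STORED,  -- index only the fields you query
//     KEY idx_sku (sku)
// );
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');

$json = file_get_contents('object.json');            // the raw 120-field object
json_decode($json, true, 512, JSON_THROW_ON_ERROR);  // validate before inserting (PHP 7.3+)

$stmt = $pdo->prepare('INSERT INTO docs (doc) VALUES (:doc)');
$stmt->execute([':doc' => $json]);

No per-field mapping is needed; MySQL keeps the JSON structure intact, and you can still query into it with JSON_EXTRACT / ->> when required.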

Storing an array in a MySQL table

I have a 5 level multidimensional array. The number of keys in the array fluctuates but I need to store it in a database so I can access it with PHP later on. Are there any easy ways to do this?
My idea was to convert the array into a single string using several different delimiters like #* and %* and then using a series of explode() to convert the data back into an array when I need it.
I haven't written any code at this point because I'm hoping there will be a better way to do this. But I do have a potential solution which I tried to outline below:
here's an overview of my array:
n=button number
i=item number
btn[n][0] = button name
btn[n][1] = button desc
btn[n][2] = success or not (Y or N)
btn[n][3] = array containing item info
btn[n][3][i][0] = item input type (Default/Preset/UserTxt/UserDD)
btn[n][3][i][1] = array containing item value - if more than one index then display as drop down
Here's a run-down of the delimiters I was going to use:
#*Button Title //button title
&*val1=*usr1234 //items and values
&*val2=*FROM_USER(_TEXT_$*name:) //if an item's value contains "FROM_USER" then extract the data between the parentheses
&*val3=*FROM_USER(_TEXT_$*Time:) //if the datatype contains _TEXT_ then explode AGAIN by $* and just display a textfield with the title
&*val4=*FROM_USER($*name1#*value1$*name2#*value2) //else explode AGAIN by $* for a list of name value pairs which represent a drop box - name2#*value2
//sample string - a single button
#*Button Title%*val1=*usr1234&*val2=*FROM_USER(_TEXT_$*name:)&*val3=*FROM_USER(_TEXT_$*date:)&*val4=*FROM_USER($*name1#*value1$*name2#*value2)
In summary, I am seeking some ideas of how to store a multidimensional array in a single database table.
What you want is a data serialization method. Don't invent your own; there are plenty already out there. The most obvious candidates are JSON (json_encode) or the PHP-specific serialize. XML is also an option, especially if your database supports it natively to some degree.
Have a look at serialize or json_encode
The best choice for you is json_encode.
It has some advantages over serialize for storing data in a DB:
the output is smaller, and
if you must modify the data manually in the DB, serialize causes problems, because the format stores the length of each serialized value, so after changing a value you have to recount and update those length parameters as well.
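A minimal sketch of that round trip for a structure like the one in the question (the sample values are assumptions, and the storage column is taken to be a plain TEXT/JSON field):

<?php
// Mirror of btn[n][0..3] from the question.
$btn = [
    0 => 'Button Title',               // button name
    1 => 'Does something useful',      // button desc
    2 => 'Y',                          // success or not
    3 => [                             // item info
        ['UserTxt', ['name:']],
        ['UserDD',  ['value1', 'value2']],
    ],
];

// Store: one JSON string instead of hand-rolled #*/%*/&* delimiters.
$encoded = json_encode($btn);
// e.g. INSERT INTO buttons (btn_data) VALUES (:data)  with  :data => $encoded

// Load: decode back into the same nested array.
$decoded = json_decode($encoded, true);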
SQL (whether mySQL or any other variant) does not support array data types.
The way you are supposed to deal with this kind of data in SQL is to store it across multiple tables.
So in this example, you'd have one table that contains buttonID, buttonName, buttonSuccess, etc fields, and another table that contains buttonInputType and buttonInputValue fields, as well as buttonID to link back to the parent table.
That would be the recommended "relational" way of doing things. The point of doing it this way is that it makes it easier to query the data back out of the DB when the time comes.
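A rough sketch of that two-table layout, using the field names from the answer; the exact types and the item table name are assumptions:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');

// Parent table: one row per button.
$pdo->exec('CREATE TABLE buttons (
    buttonID      INT AUTO_INCREMENT PRIMARY KEY,
    buttonName    VARCHAR(255) NOT NULL,
    buttonDesc    TEXT,
    buttonSuccess CHAR(1) NOT NULL
)');

// Child table: one row per item value, linked back to its button.
$pdo->exec('CREATE TABLE button_items (
    itemID           INT AUTO_INCREMENT PRIMARY KEY,
    buttonID         INT NOT NULL,
    buttonInputType  VARCHAR(16) NOT NULL,
    buttonInputValue TEXT,
    FOREIGN KEY (buttonID) REFERENCES buttons(buttonID)
)');

Items whose value is itself a list (the drop-down case) simply get one row per value here.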
There are other options though.
One option would be to use mySQL's enum feature. Since you've got a fixed set of values available for the input type, you could use an enum field for it, which could save you from needing to have an extra table for that.
Another option, of course, is what everyone else has suggested, and simply serialise the data using json_encode() or similar, and store it all in a big text field.
If the data is going to be used as a simple block of data, without any need to ever run a query to examine parts of it, then this can sometimes be the simplest solution. It's not something a database expert would want to see, but from a pragmatic angle, if it does the job then feel free to use it.
However, it's important to be aware of the limitations. By using a serialised solution, you're basically saying "this data doesn't need to be managed in any way at all, so I can't be bothered to do proper database design for it." And that's fine, as long as you don't need to manage it or search for values within it. If you do, you need to think harder about your DB design, and be wary of taking the 'easy' option.

Return object instead of json_encode

So, I found that declaring my WebMethod As Object and returning a Dictionary, rather than declaring it As String and returning JavaScriptSerializer.Serialize() output, reduces the size of the JSON by ~20%.
Yeah, I know this isn't a big deal for traditional consumer-facing webapps, where a few KB would be a lot, but it's HUGE for B2B, where you're trying to serve your customers AJAX'd jQuery pages with far less data transmission and greater speed, transmitting dynamic tables that could potentially be 100 MB before dynamicization and id lists of about 1-2 MB. But I digress.
It looks like json_encode does the same thing, adding more than necessary to the JSON, from what I've read in other posts. Is there a way to simply output the array as an object, or build an object from multiple arrays and export that?
1) Is print (and its family) the only way to output?
2) Is json_encode (and its family) necessary? After all, I don't have to decode if I output properly at the jQuery level.
I'm a big fan of speed & efficiency. As AJAX/jsLibs take over, and data becomes bigger while these hunkering server-side scripts go by the wayside, it looks like the next logical objective (aside from a standardized push to client) is to keep the size of the JSON as small as possible.
How can I keep the garbage with AJAX/PHP down to a minimum? How can I export arrays as objects directly?
Thanks for bearing with me. I'm terrible with vocabulary. I hope what I want to do is relatively clear enough.
As always, thanks in advance, and thank you stack for being my brain!
The problem I was having with the .NET WebMethod was that I was running my output through JavaScriptSerializer myself before sending it, while <ScriptMethod(ResponseFormat:=ResponseFormat.Json)> was also included. That's what added the extra data to my JSON output.
Arrays can be exported as is with json_encode, and that's all that's needed.
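On the PHP side, a minimal sketch of sending an array straight to the client as JSON (the sample data is assumed):

<?php
$rows = [
    ['id' => 1, 'name' => 'Widget A'],
    ['id' => 2, 'name' => 'Widget B'],
];

// No extra wrapping or manual string building: encode the array and send it.
header('Content-Type: application/json');
echo json_encode($rows);
// jQuery ($.getJSON, or $.ajax with dataType "json") parses this response directly.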

PHP Search through CSV File

I have a CSV file which is more than 5 MB. As an example, it looks like this:
id,song,album,track
200,"Best of 10","Best of 10","No where"
230,"Best of 10","Best of 10","Love"
4340,"Best of 10","Best of 10","Al Road"
I have plenty of records. I need to get one record at a time. Assume I need to get the details for id 8999. My main question is: is there any method to get an exact record from the CSV, just like querying a MySQL table?
Other than that, I have the following solutions in mind to perform the task. Which would be the best way?
Read the CSV and load it into an array, then search through it and get the data. (I have to read all records; the order is different, which is why I face this issue. Otherwise I could go record by record.)
Export the CSV to a MySQL database and then query from that table.
Read the CSV each time without loading it into an array.
If I can search the CSV file in a quick way, that would be great. Appreciate your suggestions.
Thank you.
I'm not aware if there are any libraries that allow you to directly query a CSV file. If there are, that would be your best bet. If not, here's another solution.
If you want to get details of id 8999, it would be extremely memory inefficient to load in the entire file.
You could do something like this
Read in a line using fgets, explode on comma and check 0th element. If it is not the ID you want, discard and repeat.
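A minimal sketch of that line-by-line scan; fgetcsv is used rather than a raw fgets + explode so the quoted fields in the sample parse correctly (the file name songs.csv is an assumption):

<?php
function findRecordById(string $file, string $id): ?array
{
    $handle = fopen($file, 'r');
    if ($handle === false) {
        return null;
    }

    $header = fgetcsv($handle); // the "id,song,album,track" header row
    while (($row = fgetcsv($handle)) !== false) {
        if ($row[0] === $id) {              // 0th element is the id column
            fclose($handle);
            return array_combine($header, $row);
        }
    }

    fclose($handle);
    return null; // id not found
}

print_r(findRecordById('songs.csv', '8999'));

Only one line is held in memory at a time, so the 5 MB file is never fully loaded.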
Not familiar with PHP, but if this query is frequent, consider keeping the data in memory and indexing it by id using a mapping data structure (in PHP that would be an associative array).
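A sketch of that in-memory index, reusing the hypothetical songs.csv from above; the file is parsed once and every later lookup is a plain array access:

<?php
// Build the index once (e.g. at startup, or cache the result with APCu).
$byId = [];
$handle = fopen('songs.csv', 'r');
$header = fgetcsv($handle);
while (($row = fgetcsv($handle)) !== false) {
    $byId[$row[0]] = array_combine($header, $row);
}
fclose($handle);

// O(1) lookups afterwards.
$record = $byId['8999'] ?? null;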

PHP Serialized database entries messing me all up

O.K. so I'm pretty clever - I've made a library for keeping a bunch of WP themes that are all set up to my needs so when I put out a new site I can just create the new blog in a few minutes.
As a placeholder for the new domain, I replace everything in the SQL file that contains the old domain with [token].
Everything was working fine right up to the point where I made one with a child theme that apparently serialized its data before entering it into the database. I end up with stuff like this:
Theme','a:8:{s:12:\"header_image\";s:92:\"http://[token]wp-content/uploads/2011/05/494-Caring-for-fruit-trees-PLR.jpg\";s:16:\"background_image
So I dug into serialization, and it turns out that the s:92 bit, for example, is the number of characters in that value. Since I'm replacing [token], that length changes and the data breaks.
Up till now I made all my changes in the SQL file and didn't edit the database other than to populate it from that file, but I'm at a loss on how to deal with the data in the serialized array.
Any ideas?
Easiest way would be to grab that data and use the unserialize() function, like
$arr = unserialize($data);
Then edit the data that way. When you're done, re-serialize it with serialize(), and store it back. You may have to do a print_r() on the unserialized data to see how it's stored and what you need to edit.
If you do the changes directly from the serialized data, you'll have to get the length of the current substring, make the change, then get the new length and splice that back into the serialized data, which is way more complicated than it needs to be.
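A minimal sketch of that round trip; the example value and the replacement domain are assumptions, and in practice $data would come from the theme's row in the database:

<?php
$newDomain = 'newsite.example/';   // hypothetical replacement for the [token] placeholder

// Example serialized value like the one in the question (built here so the sketch runs).
$data = serialize(['header_image' => 'http://[token]wp-content/uploads/2011/05/pic.jpg']);

$arr = unserialize($data);

// Swap the placeholder wherever it appears in the nested string values.
array_walk_recursive($arr, function (&$value) use ($newDomain) {
    if (is_string($value)) {
        $value = str_replace('[token]', $newDomain, $value);
    }
});

// serialize() recomputes every s:NN length, so nothing breaks.
$data = serialize($arr);
// ...then write $data back to the row (or into the regenerated SQL file).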
