I've got a JSON string stored in a MySQL DB, and now I'm trying to find a way to check whether a specific key contains a value at all or not.
I've been googling around, but most solutions show how to find a specific value, whereas I want to check whether there's any value there at all.
This is to be implemented in some sort of PHP check function, and if there's a way to get all results from MySQL in a single query instead of doing multiple queries, that'd be great.
Example:
row 1 {"Name":"Jane","Group":"","customernumber":"12345"}
row 2 {"Name":"Mike","Group":"Sales","customernumber":"23456"}
row 3 {"Name":"Steve","Group":"","customernumber":"34567"}
The resulting array would contain Mike with his details, plus any other rows whose Group key is non-empty.
A little help, please?
EDIT
I didn't choose to store the data like this; it's the CMS I'm working with that stores custom form data this way.
I've got about 400 DB entries, and I thought of letting MySQL do the processing, since I don't know whether storing that many results in a PHP array would be bad for performance. A couple of users are going to view pages that use these results, causing quite frequent requests.
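A minimal sketch of the kind of check I have in mind, assuming a hypothetical table customers with the JSON stored in a column named data (both names made up; on MySQL 5.7+ the filtering could instead happen server-side with JSON_UNQUOTE(JSON_EXTRACT(data, '$.Group')) <> ''):

    <?php
    // Hypothetical connection details, table and column names.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    // Fetch every row once, then filter in PHP.
    $result = $db->query('SELECT data FROM customers');

    $withGroup = [];
    while ($row = $result->fetch_assoc()) {
        $decoded = json_decode($row['data'], true);
        // Keep only rows where the "Group" key holds a non-empty value.
        if (!empty($decoded['Group'])) {
            $withGroup[] = $decoded;
        }
    }

    print_r($withGroup); // e.g. Mike's record from the example above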
Related
I have a script that receives json data from various sources and processes it.
I have a list of known good sources, both in a database and as a text file. The list has thousands of records.
Before processing, I want to compare the source value from the JSON with the source values in the list. Data is received every 10 seconds. The list does not change often.
At the moment I can make this work either by querying the database for the sources list or by reading the list from a text file; however, it seems redundant to do this every 10 seconds upon receiving JSON, since the list is going to be the same 99% of the time.
The question is - what is the good way to do this?
Assuming this DB is something you have more than read access to: you mentioned the database records do not change often, so you could add a trigger on the DB for any changes. Have the trigger set a single row in a new table called "listUpdated" to true.
Load the list into an array in your PHP and check your incoming data against that. Every 10 seconds you can just check whether the "listUpdated" flag has been set to true. If it has, reload your array and set the value back to false.
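A rough sketch of what that could look like; the table, column, and connection details are all made up:

    <?php
    // One-time setup (SQL, names are hypothetical):
    //   CREATE TABLE listUpdated (updated TINYINT NOT NULL DEFAULT 0);
    //   INSERT INTO listUpdated VALUES (0);
    //   CREATE TRIGGER sources_changed AFTER INSERT ON sources
    //       FOR EACH ROW UPDATE listUpdated SET updated = 1;
    //   -- similar triggers would be needed for UPDATE and DELETE

    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    // Load the list once into memory, keyed for fast lookup.
    function loadSources($db) {
        $known = [];
        foreach ($db->query('SELECT source FROM sources') as $row) {
            $known[$row['source']] = true;
        }
        return $known;
    }
    $known = loadSources($db);

    // Every 10 seconds, before processing the incoming JSON:
    $flag = $db->query('SELECT updated FROM listUpdated')->fetch_assoc();
    if ($flag['updated']) {
        $known = loadSources($db);                        // list changed: reload
        $db->query('UPDATE listUpdated SET updated = 0'); // reset the flag
    }

    // Fast in-memory check against the incoming payload.
    $json      = '{"source":"feed-a","payload":{}}';      // example payload
    $incoming  = json_decode($json, true);
    $isTrusted = isset($known[$incoming['source']]);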
This is more about checking my logic, to see whether my idea of a migration matches what more experienced folk would do.
I have 2 databases, one with my current website stuff and one with my new development stuff. I now want to load all of the current website's data into my new development database.
These databases have different schemas: different column names, and a bit more decoupling of the data within the new development database's tables.
To do this I am using PHP and SQL, whereby I'm pulling specific tables via SQL into multidimensional arrays to get all the relevant data needed for my new development database tables, checking for repeat data, and ordering it.
So now I have a multidimensional array that is full of the data needed for my new database table, extracted from the old tables. I have renamed all the keys in the multidimensional array to match the column names of the new database. So technically I have a multidimensional array that is a copy of what I want to insert into my database.
Then I insert that multidimensional array into the new database and, Bob's your uncle, a database migration?
Does this sound right, and is there some suggested reading that you guys and girls might point me to?
Regards Mike
EDIT
By using multidimensional arrays to collect all the data that I want to put into my new database, won't I then be double-handling the data and therefore using a lot more resources in my migration script?
I have never tried this before, but I am pretty certain you can access 2 databases at the same time. That being said, you can extract from DB1, do your checks, changes, etc., then just insert into the new DB.
Here is a Stack Overflow question that covers connecting to 2 DBs.
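For illustration, a minimal sketch of holding both connections at once (all credentials, table and column names are placeholders):

    <?php
    // Two independent connections: one to the old DB, one to the new one.
    $old = new mysqli('localhost', 'user', 'pass', 'old_site');
    $new = new mysqli('localhost', 'user', 'pass', 'new_site');

    // Read from the old database...
    $rows = $old->query('SELECT id, name FROM old_customers');

    // ...check/transform each row, then write it into the new one.
    $stmt = $new->prepare('INSERT INTO customers (legacy_id, name) VALUES (?, ?)');
    while ($row = $rows->fetch_assoc()) {
        $stmt->bind_param('is', $row['id'], $row['name']);
        $stmt->execute();
    }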
So I've researched creating a migration script, and I thought I'd explain a general outline of how to implement it for anyone else who has to do this. Bear in mind I'm only really using basic PHP, no classes or functions, all procedural. I'm going to focus on one particular table, and from that you can extrapolate to the whole database.
1) Create a PHP file specifically for collating the data of this one table (e.g. table1.php)
2) Create all the SQL statements you'll need to extract all relevant information for that particular table
3) With each SQL statement, create a loop and put all the data fetched from the SQL statement into an array
4) Then create a loop and an SQL statement for inserting the data from the arrays you just populated into the new database. If you want to check for repeat data, just implement that check within this loop and SQL statement (see the sketch after this list).
5) Note: you can add a timer and a counter for checking how long it took, the number of rows transferred, and/or the number of duplicates.
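As a rough procedural sketch of steps 2-4 (all connection details, table and column names are invented for illustration):

    <?php
    // table1.php - collate and migrate the data for one table.
    $old = new mysqli('localhost', 'user', 'pass', 'old_db');
    $new = new mysqli('localhost', 'user', 'pass', 'new_db');

    $start = microtime(true);
    $inserted = 0;
    $duplicates = 0;

    // Steps 2/3: extract the relevant old data into an array,
    // already keyed by the new table's column names.
    $rows = [];
    foreach ($old->query('SELECT cust_name, cust_email FROM old_table') as $r) {
        $rows[] = ['name' => $r['cust_name'], 'email' => $r['cust_email']];
    }

    // Step 4: insert into the new table, skipping repeat data.
    $check  = $new->prepare('SELECT COUNT(*) FROM table1 WHERE email = ?');
    $insert = $new->prepare('INSERT INTO table1 (name, email) VALUES (?, ?)');
    foreach ($rows as $row) {
        $check->bind_param('s', $row['email']);
        $check->execute();
        $check->bind_result($count);
        $check->fetch();
        $check->free_result();
        if ($count > 0) {
            $duplicates++;
            continue;
        }
        $insert->bind_param('ss', $row['name'], $row['email']);
        $insert->execute();
        $inserted++;
    }

    // Step 5: the timer and counters.
    printf("Inserted %d rows (%d duplicates) in %.2fs\n",
           $inserted, $duplicates, microtime(true) - $start);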
This may be obvious to most people, and might be considered wrong by others, but my original plan of collating the data in a "table equivalent multidimensional array" and then inserting that array into the table meant I was double-handling the data (I think). So I assumed it would be more efficient doing it this way, and a lot simpler.
I hope this basic outline will help anyone considering doing the same thing for the first time. If someone has thoughts on how to make this operation more effective, please feel free to rip this explanation apart; this is only what I've implemented myself through trial and error, as I have no real experience in this, it's just what I've concocted myself.
Regards Mike
I am using PHP and MySQL.
First, on form submission, I run a query on MySQL, get the row IDs, and store them in a session array.
To display the results, I have modified a PHP array pagination script to paginate the session array with first, prev, next, last, and jump-to-page-number functionality. For that pagination I took my reference from:
http://lotsofcode.com/php/php-array-pagination.htm
Script is working fine.
But I have two questions.
Question 1: Is it fine to store a big result set in a session array? If not, what would be a good alternative? (I am wondering whether to keep the first 500 results in the session array and, if there are more than 500, create an XML file.)
Question 2: Is it possible to use AJAX for pagination working with the session array, with first, prev, next, last, and jump-to-page-number functionality?
If there is a solution, please let me know.
Thanks
Ravindra.
I wouldn't use sessions to store large amounts of data, and I wouldn't use sessions to store data returned from a table (MySQL or otherwise).
Session data is stored on the server and held in memory; what's going to happen when you get multiple users using the same table?
The database (MySQL or otherwise) should be sufficient, with the correct indexes in place, to handle queries for displaying data.
I have a MySQL table with 120 million records and can extract data quickly using date ranges; no speed issues whatsoever.
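As a sketch of paging straight from the database with LIMIT/OFFSET (the results table and its columns are assumptions, and get_result() needs the mysqlnd driver):

    <?php
    // $_GET['page'] drives first/prev/next/last and jump-to-page links.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    $perPage = 20;
    $total   = (int) $db->query('SELECT COUNT(*) FROM results')->fetch_row()[0];
    $pages   = max(1, (int) ceil($total / $perPage));
    $page    = isset($_GET['page']) ? (int) $_GET['page'] : 1;
    $page    = min($pages, max(1, $page)); // clamp to a valid page
    $offset  = ($page - 1) * $perPage;

    $stmt = $db->prepare('SELECT id, title FROM results ORDER BY id LIMIT ? OFFSET ?');
    $stmt->bind_param('ii', $perPage, $offset);
    $stmt->execute();
    $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

    // For the AJAX variant (Question 2), the same script can simply return
    // json_encode(['page' => $page, 'pages' => $pages, 'rows' => $rows])
    // and the pagination links can fetch that instead of reloading the page.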
Let's say I wanted to build an app that will pull URL links from a database and show 50 on a page. I am just using links as an example.
What if I had to store multiple values, i.e. an array, in one MySQL field for each link/record posted?
If there are 5 items for every link, I could have an array of 5 items on the page, a list of 5 items separated by commas, or I could use JSON encode/decode. What would be best for performance when saving a link to the DB and showing it on the page: something like implode/explode with a list of items, json encode/decode, or serialize/unserialize on the array?
Serializing is probably the best option if you don't want to use multiple tables, the reason being that it deals with special characters. If you use a comma-separated list, you'll need to worry about values that already have commas in them; serialize/unserialize deals with this for you. The catch is that serializing is not terribly fast, although your arrays sound quite simple.
The best approach is still to have multiple tables, as it allows you to search and/or manipulate the data much more easily at a later date. It also isn't hard in PHP to create a loop that generates the SQL to add multiple records to the second table (relating them back to the parent in the main table).
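A quick sketch of the single-column round trip, with the comma problem shown (the links table is hypothetical):

    <?php
    $items = ['foo', 'bar, baz', 'qux']; // note the embedded comma

    // Storing: serialize copes with commas and other special characters.
    $stored = serialize($items);
    // ...bind $stored as a string in e.g.
    // INSERT INTO links (url, items) VALUES (?, ?)

    // Reading back:
    $items = unserialize($stored);

    // json_encode/json_decode round-trips the same way and is also
    // readable from other languages:
    $stored = json_encode($items);
    $items  = json_decode($stored, true);

    // implode/explode breaks on 'bar, baz':
    $broken = explode(',', implode(',', $items)); // 4 elements, not 3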
BTW, there are many other questions similar to this on SO: Optimal Way to Store/Retrieve Array in Table
The best way, since you ask for it, is to create a table that describes the data you're going to save, and then insert a row for each element. In your example, that would mean you need two tables: pages and links, where a foreign key in links.page_id refers to pages.id.
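For example (the column names beyond page_id and id are made up):

    <?php
    // Schema (run once):
    //   CREATE TABLE pages (
    //       id INT AUTO_INCREMENT PRIMARY KEY,
    //       title VARCHAR(255) NOT NULL
    //   );
    //   CREATE TABLE links (
    //       id INT AUTO_INCREMENT PRIMARY KEY,
    //       page_id INT NOT NULL,
    //       url VARCHAR(255) NOT NULL,
    //       FOREIGN KEY (page_id) REFERENCES pages(id)
    //   );

    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    // Insert the parent row, then one child row per array element.
    $db->query("INSERT INTO pages (title) VALUES ('Example page')");
    $pageId = $db->insert_id;

    $stmt = $db->prepare('INSERT INTO links (page_id, url) VALUES (?, ?)');
    foreach (['http://a.example', 'http://b.example'] as $url) {
        $stmt->bind_param('is', $pageId, $url);
        $stmt->execute();
    }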
Is anyone aware of a script/class (preferably in PHP) that would parse a given MySQL table's structure and then fill it with x number of rows of random test data based on the field types?
I have never seen or heard of something like this and thought I would check before writing one myself.
What you are after would be a data generator.
There is one available here, which I had bookmarked but haven't got around to trying yet.
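If you end up writing one yourself after all, a bare-bones version could read the column types from information_schema and fill in random values per type; a sketch (only a few types handled, connection details and table name are placeholders, and fetch_all needs the mysqlnd driver):

    <?php
    $db    = new mysqli('localhost', 'user', 'pass', 'mydb');
    $table = 'customers'; // placeholder
    $x     = 10;          // number of rows to generate

    // Inspect the table's structure.
    $cols = $db->query(
        "SELECT COLUMN_NAME, DATA_TYPE, EXTRA
           FROM information_schema.COLUMNS
          WHERE TABLE_SCHEMA = 'mydb' AND TABLE_NAME = '$table'"
    )->fetch_all(MYSQLI_ASSOC);

    for ($i = 0; $i < $x; $i++) {
        $names = $values = [];
        foreach ($cols as $col) {
            if ($col['EXTRA'] === 'auto_increment') {
                continue; // let MySQL assign the id
            }
            $names[] = $col['COLUMN_NAME'];
            switch ($col['DATA_TYPE']) {
                case 'int':
                    $values[] = (string) rand(0, 99999);
                    break;
                case 'date':
                    $values[] = "'" . date('Y-m-d', rand(0, time())) . "'";
                    break;
                default: // varchar, text, ...
                    $values[] = "'" . substr(md5(rand()), 0, 10) . "'";
            }
        }
        $db->query(sprintf('INSERT INTO %s (%s) VALUES (%s)',
                   $table, implode(', ', $names), implode(', ', $values)));
    }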