Is there an easy (or easier) way to split up my JSON? I'm not so good with JSON, but here is my situation.
I have a table of payments and a table of locations. There are 10 payments at location A and 5 payments at location B.
My payments table looks like this:
CREATE TABLE `payments` (
`id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`amount` decimal(15,2) NOT NULL,
`location_id` int(7) NOT NULL DEFAULT '1',
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=63678 DEFAULT CHARSET=latin1;
My locations table looks like this:
CREATE TABLE `locations` (
`id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`name` varchar(100) NOT NULL DEFAULT '',
`auth_key` varchar(32) NOT NULL DEFAULT '',
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=latin1;
So I'd like my JSON to be something like this:
"location": [
    {"A": {
        "auth_key": "--justanexample--",
        "payments": [
            {"id": "1", "amount": "15.15"},
            {"id": "3", "amount": "11.85"},
            {"id": "4", "amount": "18.95"}
        ]
    }},
    {"B": {
        "auth_key": "--justanotherexample--",
        "payments": [
            {"id": "2", "amount": "15.00"},
            {"id": "5", "amount": "12.77"},
            {"id": "6", "amount": "13.45"}
        ]
    }}
]
At the moment I am using Eloquent with two nested loops: the first gets all the locations, and the second gets the payments for each location:
$locations = $app->location->get();
foreach ($locations as $location) {
    $payments = $app->payment->where('location_id', $location->id)->get();
    foreach ($payments as $payment) {
        echo json_encode($payment);
    }
}
Or should I use an array function here instead and then encode that?
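For the array approach, I imagine something like this (a rough sketch, assuming the location model has a payments() hasMany relation; that relation name is just my guess):
// eager-load the payments for every location, build one array, encode once
$locations = $app->location->with('payments')->get();

$out = array();
foreach ($locations as $location) {
    $payments = array();
    foreach ($location->payments as $payment) {
        $payments[] = array('id' => $payment->id, 'amount' => $payment->amount);
    }
    $out[] = array(
        $location->name => array(
            'auth_key' => $location->auth_key,
            'payments' => $payments,
        ),
    );
}

echo json_encode(array('location' => $out));
That would build the whole structure first and only call json_encode once at the end.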
I found this useful Internationalization code:
http://pastebin.com/SyKmPYTX
Everything works well, except that I am unable to connect it to the database.
What I need:
1) I need to check the last URL segment.
2) This is my DB:
DROP TABLE IF EXISTS `translate`;
DROP TABLE IF EXISTS `trans_data`;
DROP TABLE IF EXISTS `languages`;
/*Table structure for table `languages` */
CREATE TABLE `languages` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(50) NOT NULL,
`key` varchar(5) NOT NULL,
`default` tinyint(1) NOT NULL DEFAULT '0',
`display` tinyint(1) NOT NULL DEFAULT '0',
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;
/*Data for the table `languages` */
insert into `languages`(`name`,`key`,`default`,`display`) values
('GEO','ka',1,1),
('ENG','en',0,1),
('RUS','ru',0,1);
/*Table structure for table `trans_data` */
CREATE TABLE `trans_data` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(250) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;
/*Table structure for table `translate` */
CREATE TABLE `translate` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`lang` int(11) NOT NULL,
`data` int(11) NOT NULL,
`trans` text NOT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `lang_2` (`lang`,`data`),
KEY `lang` (`lang`),
KEY `data` (`data`),
CONSTRAINT `translate_ibfk_1` FOREIGN KEY (`lang`) REFERENCES `languages` (`id`),
CONSTRAINT `translate_ibfk_2` FOREIGN KEY (`data`) REFERENCES `trans_data` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=340 DEFAULT CHARSET=utf8;
First, I need to do something like this:
$this->db->where('key', end($this->segments));
$query = $this->db->get('languages');
$lang = $query->row();

if (empty($lang)) {
    $this->db->where('default', 1);
    $query2 = $this->db->get('languages');
    $lang = $query2->row();
}

$lang_array = [
    "lang_key" => $lang->key,
    "lang_id"  => $lang->id
];
$this->session->set_userdata($lang_array);

if (end($this->uri->segment_array()) == $lang->key) {
    unset($this->uri->segments[intval($this->uri->total_segments() - 1)]);
}
After that, I need to load the translations from the database and store them in an object, using this query:
SELECT TD.`name` , T.`trans`
FROM `translate` AS T
INNER JOIN `languages` AS L ON (T.`lang` = L.`id`)
INNER JOIN `trans_data` AS TD ON (T.`data` = TD.`id`)
WHERE (L.`key` = :lang_key);
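I imagine loading them into an array roughly like this (a sketch using CodeIgniter's ? query bindings instead of the named placeholder; the variable and session key names are just placeholders):
$sql = "SELECT TD.`name`, T.`trans`
        FROM `translate` AS T
        INNER JOIN `languages` AS L ON (T.`lang` = L.`id`)
        INNER JOIN `trans_data` AS TD ON (T.`data` = TD.`id`)
        WHERE (L.`key` = ?)";

$query = $this->db->query($sql, array($this->session->userdata('lang_key')));

$translations = array();
foreach ($query->result() as $row) {
    // e.g. $translations['trans_key'] = 'translated text'
    $translations[$row->name] = $row->trans;
}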
Then, when I call this from a view:
echo $this->lang->line('trans_key');
I need to get the translated value.
But I do not understand how to edit this class to do this job...
Can you help me edit this class?
Thank you in advance.
Okay, so let's say I have an API which returns an object of data in JSON. The object looks like the following:
{
    "id": 1,
    "name": "SynAck",
    "sex": "Male",
    "age": 34,
    "roles": [
        {
            "id": 1,
            "name": "user",
            "assigned": "2013-06-10"
        },
        {
            "id": 1,
            "name": "admin",
            "assigned": "2014-01-09"
        }
    ],
    "created": "2014-06-10"
}
As you can see, there are two objects returned here: the first one, let's call it "user", and another object, "roles", which is an array inside the first. There are two roles assigned to the user.
So let's say I want to load this information into my DB verbatim. I could create three tables, 'users', 'roles' and 'user_roles', with foreign keys linking users -> user_roles -> roles. One user has many roles.
CREATE TABLE IF NOT EXISTS `users` (
`id` INT(12) UNSIGNED NOT NULL AUTO_INCREMENT,
`name` VARCHAR(50) NOT NULL,
`sex` VARCHAR(10) NOT NULL,
`age` SMALLINT(2) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;
CREATE TABLE IF NOT EXISTS `roles` (
`id` INT(12) UNSIGNED NOT NULL AUTO_INCREMENT,
`name` VARCHAR(50) NOT NULL,
`assigned` DATETIME ON UPDATE CURRENT_TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;
CREATE TABLE IF NOT EXISTS `user_roles` (
`user_id` INT(12) UNSIGNED NOT NULL ,
`role_id` INT(12) UNSIGNED NOT NULL ,
CONSTRAINT `fk_ur_user_id` FOREIGN KEY(`user_id`) REFERENCES users(`id`) ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT `fk_ur_role_id` FOREIGN KEY(`role_id`) REFERENCES roles(`id`) ON DELETE CASCADE ON UPDATE CASCADE,
UNIQUE `idx_user_id_role_id`(`user_id`, `role_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 ;
I would then use artisan to create the Eloquent models, and in the user model I would have a hasManyThrough('roles', 'user_roles');
What would be the quickest and cleanest way of loading the data from the json into the models/tables?
Would I just have to assign the values one by one, or is there a way to map the attributes from the json objects to the eloquent model? If this is possible, how does it handle the relations?
I think it is possible to create the models for each table, use the function json_decode to convert the JSON to an array, and then use Eloquent methods to save them. Rough example:
$data = json_decode($json, true); // true => associative array, which create() expects
User::create($data);
Still, you need to extract the relationship data first. Rough example:
$data = json_decode($json, true);
$roles = $data['roles'];
unset($data['roles']);

$user = User::create($data);
$user->roles()->createMany($roles);
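Note that for $user->roles()->createMany() to work, the roles() relation has to be defined on the User model. With the user_roles pivot table above it would be a belongsToMany, roughly like this (class names are assumed):
class User extends Eloquent {

    public function roles()
    {
        // many-to-many through the user_roles pivot table
        return $this->belongsToMany('Role', 'user_roles', 'user_id', 'role_id');
    }

}
createMany() inserts the role rows and attaches them to the user through the pivot table.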
I would like to implement a follow/favorite system. I can think of two ways of implementing the database/table structure but am unsure which one to implement. Which of these would be considered best practice and, most importantly, why?
Option 1: I put all of the follower IDs in a single string. Putting all followers in a single string reduces the number of rows.
Ex.
id (1) || user_id (1) || follower_ids (2, 3, 45)
'CREATE TABLE `users` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`username` varchar(20) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
'CREATE TABLE `follow` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`user_id` int(10) unsigned NOT NULL,
`follower_ids` text NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
OR
Option 2: I store each follower_id individually, which adds redundancy by having three rows for the same user_id.
Ex.
id (1) || user_id (1) || follower_id (2)
id (2) || user_id (1) || follower_id (3)
id (3) || user_id (1) || follower_id (45)
'CREATE TABLE `users` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`username` varchar(20) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
'CREATE TABLE `follow` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`user_id` int(10) unsigned NOT NULL,
`follower_id` int(10) unsigned NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
Your second option, with a slight modification to the field names, since the follower and the followed are both user IDs. As mentioned by John, add foreign keys to both of the *_user_id fields in the follow table.
Additionally, never use plural table names; 'user' and 'follow' are sufficient. I personally prefer tables like 'follow' to have a prefix like 'xref_' so that I know it is simply a cross-reference table allowing a many-to-many relationship (a user can follow many users, and a user may have many followers).
'CREATE TABLE `user` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`username` varchar(20) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
'CREATE TABLE `follow` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`followed_user_id` int(10) unsigned NOT NULL,
`follower_user_id` int(10) unsigned NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
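With those column names, listing everyone a given user follows is a simple join, for example:
SELECT u.username
FROM `follow` AS f
INNER JOIN `user` AS u ON u.id = f.followed_user_id
WHERE f.follower_user_id = 1;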
The best way is neither. You should have a middle table between the follow table and the followers (users) table. The middle table has only two columns: user_id and follower_id.
With this kind of approach you avoid the disadvantages of the two solutions you mentioned: you do not have to parse a string, you do not have duplicate entries, and with the right indexes the performance is pretty fast.
'CREATE TABLE `follow` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
'CREATE TABLE `user` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`username` varchar(255) NOT NULL,
....
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1';
'CREATE TABLE `follow_user` (
`user_id` int(10) unsigned NOT NULL,
`follower_id` int(10) unsigned NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci';
Since you changed your post a lot: I think your second approach is better, if the followers are the same as the users. You only store the IDs of users and their followers, and a good SELECT query to see what a single user follows is far better than parsing and searching a string. The repeated user_id values are not a problem, because they are only indexed integers.
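For example, checking whether user 45 follows user 1 with the follow_user table above is a simple lookup (an index on (user_id, follower_id) would keep it fast):
SELECT 1
FROM `follow_user`
WHERE `user_id` = 1 AND `follower_id` = 45
LIMIT 1;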
I have the following mysql tables:
CREATE TABLE `video` (
`video_id` int(11) unsigned NOT NULL auto_increment,
`title` varchar(255) NOT NULL default '',
`description` text NOT NULL,
PRIMARY KEY (`video_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
CREATE TABLE `video_categories` (
`cat_id` int(11) unsigned NOT NULL auto_increment,
`name` varchar(255) NOT NULL default '',
PRIMARY KEY (`cat_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
CREATE TABLE `video_category` (
`video_id` int(11) unsigned NOT NULL default '0',
`cat_id` int(11) unsigned NOT NULL default '0',
KEY `video_id` (`video_id`),
KEY `cat_id` (`cat_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
CREATE TABLE `video_tags` (
`tag_id` int(11) unsigned NOT NULL auto_increment,
`video_id` int(11) unsigned NOT NULL default '0',
`name` varchar(255) NOT NULL default '',
KEY `video_id` (`video_id`),
KEY `name` (`name`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
I created a Sphinx configuration file and I can search from PHP. The problem is that when I search for related videos, a related video must be in the same category as the video I'm searching for. I can do this with an MVA and SetFilter('categories', array(3)), for example; however, the total number of matched results is the global one (I need the total to display pagination via AJAX), not the count within the category.
Any ideas how I can search through videos (documents in Sphinx) that are only in a specified category?
Thanks,
Adrian.
You can define an integer attribute in Sphinx for the category ID:
sql_attr_uint = cat_id
And then just add #cat_id=12345 to your query.
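Filtering on that attribute from the PHP API would look roughly like this (a sketch using the classic SphinxClient API; the host, port and index name are placeholders):
$cl = new SphinxClient();
$cl->SetServer('localhost', 9312);
$cl->SetFilter('cat_id', array(3)); // only match videos in category 3

$result = $cl->Query('some keywords', 'videos');

// filters are applied at search time, so total_found only counts
// the matches in that category, which is what pagination needs
$totalInCategory = $result['total_found'];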
I am building a real estate application that will store properties and make them searchable. Each property will have a category (residential, commercial, industrial, or agricultural). Based on the category, I want to serialize each property listing: for example, a property with id 1 that belongs to residential will have the serial code rs_SOMERANDOMUNIQUENUMBER, for commercial it could be cm_SOMERANDOMUNIQUENUMBER, and so on. For this, my database table looks like this:
CREATE TABLE IF NOT EXISTS `propSerials` (
`id` bigint(20) NOT NULL auto_increment,
`serial` varchar(50) NOT NULL,
`property_id` int(10) UNIQUE NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
What would be the best possible format to store the serial with the prefix according to category?
Thank you.
Why don't you add another column that holds a category_id, and in the category table add a column with the prefix for that category?
CREATE TABLE IF NOT EXISTS `propSerials` (
`id` bigint(20) NOT NULL auto_increment,
`serial` varchar(50) NOT NULL,
`property_id` int(10) UNIQUE NOT NULL,
`category_id` int(10) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
CREATE TABLE IF NOT EXISTS `propCategories` (
`id` bigint(20) NOT NULL auto_increment,
`category` varchar(50) NOT NULL,
`property_prefix` char(3) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
In a query you can then concatenate the category prefix column and the serial:
SELECT CONCAT(`property_prefix`, '_', `serial`);
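Joined across the two tables, that would look like this (table and column names as in the schemas above):
SELECT CONCAT(c.`property_prefix`, '_', s.`serial`) AS full_serial
FROM `propSerials` AS s
INNER JOIN `propCategories` AS c ON c.id = s.category_id
WHERE s.property_id = 1;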