How to use MySQL and MongoDB together - PHP

I am planning to use MongoDB for my comments feature, but all the user data is in MySQL. That is, I want to store the comments in MongoDB and the other, fixed information in MySQL. When I started to think about retrieving comments from MongoDB, I ran into the question of how to relate the MongoDB data to the MySQL data.
For example,
the user name and profile_url are stored in MySQL
comments are stored in MongoDB with user_id
So how can I retrieve data like this:
| name | profile_url | comments |
|-------|--------------|-----------------------|
| xyz | image.jpg | That was nice comment |
| abc | image.jpg | I agree |
Is it possible to do so? Or is there any other way?
I am using Laravel 5 with the jenssegers/laravel-mongodb package.

MongoDB and MySQL are completely separate applications. They have no way to communicate with each other except through your application. That means if a request needs data from both sources, it needs to query both separately.
But what you could do is keep redundant data in both databases. When you store a comment in MongoDB, also put the relevant user information into the comment document. Such duplication of data is a deadly sin in relational databases but is common practice in MongoDB. Until recently (version 3.2) MongoDB had no support for JOINs whatsoever, and even now the support ($lookup) is still quite rudimentary. That means you should usually avoid spreading the data you need to fulfill a request across more than one collection, even if that means you have redundancies.
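For example, with the official mongodb/mongodb PHP library, storing a comment together with a copy of the user fields could look like this (a minimal sketch; the connection string, database, collection, and field names are illustrative):

```php
<?php
require 'vendor/autoload.php';

$mongo    = new MongoDB\Client('mongodb://127.0.0.1');
$comments = $mongo->selectDatabase('app')->selectCollection('comments');

// Duplicate the user fields needed to render a comment into the document
// itself, so a single MongoDB query can fill the whole comment list.
$comments->insertOne([
    'user_id'     => 42,            // still kept, for lookups back to MySQL
    'user_name'   => 'xyz',         // denormalized copy from MySQL
    'profile_url' => 'image.jpg',   // denormalized copy from MySQL
    'body'        => 'That was nice comment',
    'created_at'  => new MongoDB\BSON\UTCDateTime(),
]);
```

The trade-off is that when a user changes their name or picture, you have to update the copies stored in their comments (or accept that old comments show the old data).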

You can't retrieve the user and comments data at the same time. You have to:
1. get the user ID
2. query MongoDB using the user ID as a parameter and get the comments you need

If you need aggregate data for more users, you could consider batching the queries (see the sketch below):
1. get all the users' data from MySQL
2. query MongoDB for all the comments of the specified users
3. merge the users' data with the comments data in PHP
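A minimal sketch of that batched flow, assuming a PDO connection to MySQL and the mongodb/mongodb library (credentials, table/collection names, and fields are illustrative):

```php
<?php
require 'vendor/autoload.php';

$pdo   = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'secret');
$mongo = new MongoDB\Client('mongodb://127.0.0.1');

// 1. Get all the users' data from MySQL, keyed by id.
$users = $pdo->query('SELECT id, name, profile_url FROM users')
             ->fetchAll(PDO::FETCH_UNIQUE | PDO::FETCH_ASSOC);

// 2. Ask MongoDB for all the comments of those users in one query.
$cursor = $mongo->app->comments->find([
    'user_id' => ['$in' => array_keys($users)],
]);

// 3. Merge the two result sets in PHP.
$rows = [];
foreach ($cursor as $comment) {
    $user   = $users[$comment['user_id']];
    $rows[] = [
        'name'        => $user['name'],
        'profile_url' => $user['profile_url'],
        'comments'    => $comment['body'],
    ];
}
```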
Finally: are you sure storing the comments in MongoDB is the right solution in your case? Are you planning to have such a huge amount of comments that they need to be stored in an external DB?
If you choose this route, you can consider storing the user data in MongoDB too. But before doing that, plan carefully whether it's the right choice for you (i.e. consider the queries you'll need to run, and check whether the data, stored this way, would fit those queries).

Related

Storing values in two MySQL tables

I have a scenario in which I am not sure about what to do.
I have a website where a user can update their status. I am allowing the use of hash tags so a possible user post might look like:
Went for a great hike today!! #hiking
Now, I intend to store the post in a table appropriately named "POSTS" which is structured like this:
post_id | user_id | text | date
Now, when a user submits the form that holds the post text, I run a script that extracts all of the hashtag terms the user used and stores them in an array.
Then I can loop through that array and insert the tags into the aptly named "TAGS" table. The structure of this table is:
tag_id | post_id | user_id | tag
The only problem with this is that I do not know the post_id of the post until after I insert the data into the "POSTS" table (post_id is the primary key and is auto increment).
Now, I was thinking I could just SELECT the last row of data from the "POSTS" table for that user (after I insert the post), and then in turn use the returned post_id for my query that inserts the tag data into the "TAGS" table. This doesn't seem like the best way, though. My question is:
Is this the best solution or is there a better way to go about this scenario?
I am brand new to Stack Overflow, so please don't downvote me. Comment and tell me what I am doing wrong and I will learn and ask better questions.
Thanks
You can get the last inserted ID very simply: use mysql_insert_id() if you don't use PDO, or lastInsertId() if you do.
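With PDO, for example (a minimal sketch; the table and column names come from the question, and $userId and $text are assumed inputs):

```php
<?php
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'secret');

// Insert the post, then read back its auto-increment id.
$stmt = $pdo->prepare('INSERT INTO POSTS (user_id, text, date) VALUES (?, ?, NOW())');
$stmt->execute([$userId, $text]);
$postId = $pdo->lastInsertId(); // the id generated on THIS connection
```

Both mysql_insert_id() and lastInsertId() are per-connection, so concurrent inserts from other users cannot give you the wrong id.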
Have a new column in both tables - unique_id - which holds a string you generate in code before querying the database. That way you have an id to tie posts and tags together before submission. I use this method all the time for similar applications.
Only issue is uniqueness, but there a variety of ways to generate unique ids (I normally use a mixture of timestamps and hashing).
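A couple of hedged ways to generate such an id in PHP (the unique_id column is the one suggested above; the exact recipes are just examples):

```php
<?php
// 16 random bytes -> 32 hex characters; collisions are negligible in practice.
$uniqueId = bin2hex(random_bytes(16));

// Or, closer to the "timestamps and hashing" mix described above:
$uniqueId = hash('sha256', microtime(true) . bin2hex(random_bytes(8)));
```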
This sort of depends on which version of mysql you're using and how you want to organize your code.
Option 1. Do exactly what you've said. PHP would contain the code to manage the database and how data is stored in it. The only drawback I see in what you've outlined is that if something goes wrong while handling the hashtags, you could end up with a post that was inserted into the database but whose tags were not. For certain applications (like a bank account) this is not acceptable, and this is what database transactions are for.
Option 2. Another way to handle this would be to write a MySQL stored procedure that does both the insert and the handling of the hashtags. The stored procedure could also wrap the whole thing in a transaction so that your database stays consistent. Note that this requires a version of MySQL that supports stored procedures. The downside is that you would have to write it in MySQL, which is a different language from PHP.
Both MySQL and PHP can handle this application logic/datastore logic; it is a matter of how you want to organize the code. I would prefer keeping the layers distinct. Even if you do this in PHP, at least have a separate class that deals with the database and nothing else. When your code gets bigger, having a separate class, module, or namespace that manages this kind of code makes it much easier to change and to test.
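A sketch of Option 1 wrapped in a transaction, using PDO (table names from the question; $userId, $postText, and $tags are assumed inputs, and the tables need a transactional engine such as InnoDB):

```php
<?php
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // Insert the post first so its auto-increment id is available.
    $stmt = $pdo->prepare('INSERT INTO POSTS (user_id, text, date) VALUES (?, ?, NOW())');
    $stmt->execute([$userId, $postText]);
    $postId = $pdo->lastInsertId();

    // Insert every extracted hashtag, tied to the new post.
    $tagStmt = $pdo->prepare('INSERT INTO TAGS (post_id, user_id, tag) VALUES (?, ?, ?)');
    foreach ($tags as $tag) {
        $tagStmt->execute([$postId, $userId, $tag]);
    }

    $pdo->commit();   // either the post and all its tags are stored...
} catch (Exception $e) {
    $pdo->rollBack(); // ...or none of them are
    throw $e;
}
```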

CodeIgniter 2.1 - multi-language insert

I need to insert data into the DB in two languages, and I am having a bit of a dilemma (the data needs to exist in both languages). Is it better to make the user insert the data in both languages at once, or to let the user insert it in one language first and in the second one later? If the latter is better, what is the most efficient way to do this? And how can I list all articles that do not yet exist in both languages?
DB structure for the articles:
Common table for all article (same data):
**article -> id_article | image | date_created | category_id | subcategory_id**
Table where data is different:
**article_info -> article_id | name | text | lang_id**
If the data must exist in both languages - i.e., the application assumes that if an item exists in one language, then it must exist in the other - then you should design your application so that the user must add them both at once.
When you perform the database writes, you should also be using transactions. This will ensure that either all of your writes succeed, or none of them do. It prevents the database from being left in an indeterminate state with a record for one language but not the other.
Have a look at this CodeIgniter manual page on transactions to get an idea on how they work.
You can also use the insert_batch method in the database class to insert both records at once. I don't know how it behaves with all database drivers, but the mysqli driver generates a single query for insert_batch, so the entire insert will succeed or fail as a unit, similar to what happens with transactions. That said, I would still wrap the call to insert_batch in a transaction block, just to be a bit paranoid and future-proof.
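A minimal CodeIgniter 2.x sketch combining both suggestions (table and field names come from the question; $image, $category_id, and the per-language variables are assumed inputs):

```php
<?php
// Inside a model or controller; $this->db is CodeIgniter's database class.
$this->db->trans_start();

// One row in the common table...
$this->db->insert('article', array(
    'image'          => $image,
    'date_created'   => date('Y-m-d H:i:s'),
    'category_id'    => $category_id,
    'subcategory_id' => $subcategory_id,
));
$article_id = $this->db->insert_id();

// ...and both language rows at once via insert_batch.
$this->db->insert_batch('article_info', array(
    array('article_id' => $article_id, 'name' => $name_en, 'text' => $text_en, 'lang_id' => 1),
    array('article_id' => $article_id, 'name' => $name_fr, 'text' => $text_fr, 'lang_id' => 2),
));

$this->db->trans_complete(); // rolls everything back if any query failed
```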

Can I use MySQL temporary tables to store search results?

I have a search page written in PHP, and it needs to search in the MySQL database, and the result need to be sortable. This search page will be accessed by many users (>1000 at any time).
However, it is not feasible to sort the search results in MySQL, as it would be very slow.
I'm thinking of storing each search result into a temporary table (not MySQL temporary table), and the table name is stored inside another table for reference like this:
| id | table_name | timeout |
|----|------------|---------|
| 1  | result_1   | 10000   |
| 2  | result_2   | 10000   |
Then I can use the temporary tables to sort any search results whenever needed without the need to reconstruct (with some modification) the query.
Each table will be dropped, according to the specified timeout.
Assuming I cannot modify the structure of the existing tables that are used in the query, would this be a good solution, or are there better ways? Please advise.
Thanks
There's no need to go to the trouble of storing the results in a persistent database when you just want to cache search results in memory. Do you need indexed access to relational data? If the answer is no, don't store it in a MySQL database.
I know that phpBB (an open-source web forum which supports MySQL backends) uses a key-value store to back its search results. If the forum is configured to give you a link to a specific results page (with the search id hash in the URL's query string), that link will be valid for a while but eventually be flushed out of the cache, just like you want. It may be overkill to implement a full database abstraction layer if you're set on MySQL, though. Anyway:
http://wiki.phpbb.com/Cache
You should just use memcached or something similar to store the result data; you can then easily retrieve the data and sort it in PHP. There are also some PHP-specific cache frameworks that minimize the cost of loading and offloading data from the interpreter:
https://en.wikipedia.org/wiki/List_of_PHP_accelerators
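For example, with the PHP Memcached extension (a rough sketch; the key scheme, the 10-minute TTL, and the buildSearchResults() helper are all assumptions):

```php
<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$cacheKey = 'search:' . md5($searchQuery);

// Reuse cached results when present; otherwise run the query and cache it.
$results = $mc->get($cacheKey);
if ($results === false) {
    $results = buildSearchResults($pdo, $searchQuery); // hypothetical helper that queries MySQL
    $mc->set($cacheKey, $results, 600);                // expire after 10 minutes
}

// Sort in PHP instead of re-running the MySQL query.
usort($results, function ($a, $b) {
    return strcmp($a['title'], $b['title']); // assumed 'title' field
});
```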

Is it a bad idea to connect two databases like this? How can it be done?

I am currently using Simple Machines Forum on my website, and users both register AND login using their forum account on my website. The smf_members table contains fields such as:
id_member | member_name | date_registered
What I am trying to do now is extend this to add more custom fields and connectivity on my site. I want to use the id_member field for many things:
For example, I want a user (an entry in this smf_members table) to be able to join a team.
I am going to create a table Teams with the following fields:
ID | Name | Description
and a table TeamMembership with the following fields:
UserID | TeamID | Role
As you can see, this table will link a member and a team, while leaving data specific to a user or team in their respective tables ONLY (successfully preventing redundant data). This sounds good, right?
Well, I don't want those two new tables in the same database as the SMF stuff, because it may get messy. Is that the easiest solution, though? Do you think the easiest solution is to just create my new tables within the same database, with the prefix cst for Custom? I have no idea how to link two databases, so if that is too complicated maybe I should just go with the cst solution.
I've edited this post and have an additional question.
Thank you for the answers. I have an additional question. Let's say that I wanted to add new variables for the members but, again, wanted to avoid adding new fields to the SMF forum member table. What is the easiest way to go about this? For instance, I want to create a table called UsersExtended with fields such as:
-ID (this is NOT an auto-increment field, but the value of id_member from the SMF members table)
-Country
Is it easy to create a profile page with this structure and display any relevant data I want from the two tables, in a way linking the two so that they act as one big table?
They totally belong in the same database. Use one db and join the tables in your queries.
The only time you want to build a separate database for your application (other than for the application itself) is if you intend to create an API which will serve up that data to other projects in an unbiased manner. That is very common with Solr, with which you can churn out blazing-fast APIs whose data generally doesn't belong mixed in with your current MySQL tables.
There is absolutely no harm in having the custom tables in the same database. You cannot define relations between two different databases in a relational database system.
If you're using MySQL, have a look at this question's accepted answer. Remember that you have to prefix your table names with the database name when querying across databases.
You may try this:
SELECT *
FROM database1.table
JOIN database2.table ON database1.table.id = database2.table.id
You can simply create the two tables within the same database:
**smf_members**
id_member | member_name | date_registered
**TeamMembership**
Team_ID | Name | Description | Role | id_member
If a user wants a team membership, you can then easily manipulate their records using join queries, as in the sketch below.
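For example (a sketch with PDO; $pdo and $memberId are assumed, and the column names follow the structures above):

```php
<?php
// List every team membership for one member, joining the forum table
// with the custom table - both live in the same database.
$sql = 'SELECT m.member_name, t.Name AS team_name, t.Role
        FROM smf_members AS m
        JOIN TeamMembership AS t ON t.id_member = m.id_member
        WHERE m.id_member = ?';

$stmt = $pdo->prepare($sql);
$stmt->execute([$memberId]);
$teams = $stmt->fetchAll(PDO::FETCH_ASSOC);
```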
Don't know Simple Machines Forum - but I do know that most applications expect to be the only show in town.
I would create a new database, perhaps called "teams", with your own tables, and use NickCL's way of doing cross-database joins to join between the two applications.
You don't mention the specific database engine you're using (I assume it's MySQL), but logically, you can think of databases as namespaces - a way of keeping stuff that belongs together in the same logical place.
Ideally, "people"/"authentication" etc. should be in a separate database from the forums stuff - but as you're working with off-the-shelf code, you don't have that luxury.
In my experience, it's better not to mess with the databases of off-the-shelf software - when you want to upgrade, you have no idea what will happen to your own tables, regardless of their names.

MySQL database structure - data only relevant per user, thus user tables?

I would like to build an online logbook for truck drivers. The goal is that after a truck driver logs in, he/she immediately sees a snapshot of his/her driving total this year/month/day, together with some other totals also per year/month/day. So the information stored in the database is only relevant per user (truck driver). I personally don't require any statistical data out of the database as a whole (only per user).
Let's assume 10,000 users.
My question relates to the design of the mySQL database.
Since the information stored is only relevant per user and not in mass, does it make sense to store the data in one table per user - leading to up to 10,000 tables? Would that result in the most efficient/fastest database? Or should I dump all rows into one big 'Log' table and have it relate to another table 'Users', even if analysis will only be done per user?
Here's some of the information that needs to be stored per user (ends up to about 30 columns):
Date - Truck make/model - Truck ID - Route # - From - To - Total time - Stops - Gas consumption - Night time - Crew (2nd driver) - ......
Simplified example here
**User**
user_id | first_name | last_name

**Truck**
truck_id | truck_make | truck_model

**Route**
route_id | user_id | truck_id | route_from | route_to | gas_consumption
Without any more details to go on, this is how I'd roll it.
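As MySQL DDL, the Route table from that sketch might look like this (a sketch; the column types and foreign keys are guesses, and $pdo is an assumed PDO connection):

```php
<?php
// Sketch of the Route table above; one big shared table with an index on
// user_id keeps per-driver queries fast even with many rows.
$pdo->exec('
    CREATE TABLE Route (
        route_id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        user_id         INT UNSIGNED NOT NULL,
        truck_id        INT UNSIGNED NOT NULL,
        route_from      VARCHAR(100) NOT NULL,
        route_to        VARCHAR(100) NOT NULL,
        gas_consumption DECIMAL(8,2) NOT NULL,
        INDEX (user_id),
        FOREIGN KEY (user_id)  REFERENCES User (user_id),
        FOREIGN KEY (truck_id) REFERENCES Truck (truck_id)
    ) ENGINE=InnoDB
');
```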
I would suggest going for separate tables, but maybe in your case one large table is a good plan.
Assuming you write efficient MySQL to access the data you shouldn't have a problem with a large dataset such as the one you have described.
I'd take a look at MySQL / Rails Performance: One table, many rows vs. many tables, less rows? for more information on why going down the separate-tables route may be a good idea. Also, Which is more efficient: Multiple MySQL tables or one large table? contains some useful information on the subject.
You're describing a multi-tenant database. SO has a tag for that; I added it for you.
MSDN has a decent article giving you an overview of the issues involved in multi-tenant databases. The structures range from shared nothing to shared everything. Note carefully that in a "shared everything" structure, it's fairly easy for the database owner (probably you) to write a query that breaks user isolation and exposes one user's data to other users.
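To make the isolation point concrete: in a "shared everything" design, every query must carry the tenant filter. A sketch with PDO ($pdo and $userId are assumed; the Route table is the one from the answer above):

```php
<?php
// The WHERE user_id = ? clause is what keeps one driver's data away from
// the others; forget it once and you have broken tenant isolation.
$stmt = $pdo->prepare(
    'SELECT SUM(gas_consumption) AS total_gas
     FROM Route
     WHERE user_id = ?'
);
$stmt->execute([$userId]);
$totalGas = $stmt->fetchColumn();
```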
