How to sync MySQL database with solr automatically - php

I have a database that updates very frequently, and I would like Solr to synchronise with it automatically.
I have 20+ tables, but I want search to work on only 2 of them, and only on some specific fields.
I've put some data into Solr manually and queried it with the GET API:
http://127.0.0.1:8983/solr/gettingstarted/select?indent=on&q=*:*&wt=json
and it works perfectly fine.
Can we also do something like this: when user 'X' searches for something, check another table and return a post only if it was made by 'X' or one of his friends, and otherwise not? In other words, can we query with conditions as well?
Please provide a link, resource, or any related reference or solution.
TIA

There is no automagic syncing in Solr. You'll either have to send the updates to Solr yourself (through the regular POST interface), or trigger a deltaimport through the DataImportHandler if that's what you're using.
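As a rough illustration of the first option, here is a minimal PHP sketch that pushes a changed row to Solr's JSON update endpoint; the core name gettingstarted is taken from the question, and the field names are assumptions:
<?php
// Minimal sketch: send one changed MySQL row to Solr as a JSON document.
// Call this from the code path that writes to MySQL, or from a small cron job.
$doc = [
    'id'        => 123,                         // primary key of the row
    'title'     => 'Post title pulled from MySQL',
    'posted_by' => 'alice',
];

$ch = curl_init('http://127.0.0.1:8983/solr/gettingstarted/update?commit=true');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => json_encode([$doc]),  // /update accepts a JSON array of docs
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
curl_close($ch);
In production you would batch documents and commit less often, but the shape of the request stays the same.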
For your second case the answer depends, but the most straightforward way is to include a list of valid usernames or user_ids in your query and filter against that list (e.g. fq=posted_by:(foo OR bar OR baz OR ...)). This is limited by the maximum number of boolean clauses configured in your solrconfig.
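A rough sketch of that filter query from PHP, assuming a posted_by field in the index and that the friend list comes from your own MySQL tables:
<?php
// Minimal sketch: restrict results to posts made by user X and their friends.
$friends = ['x_user', 'alice', 'bob'];          // user X plus their friends, loaded from MySQL
$fq      = 'posted_by:(' . implode(' OR ', $friends) . ')';

$url = 'http://127.0.0.1:8983/solr/gettingstarted/select'
     . '?q=' . urlencode('some search terms')
     . '&fq=' . urlencode($fq)
     . '&wt=json';

$results = json_decode(file_get_contents($url), true);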

Related

PHP / MySQL - Compare tables from 2 different databases

I've got 2 frameworks (Laravel - web, CodeIgniter - API) and 2 different databases. I've built a function (on the API side) which detects changes in one database (in 2 tables) and applies the changes to the other database.
Note: there is no way to run both the web and the API on the same database - that's why I'm doing this.
Anyway, it's important that every little change is recognized. If it's a new record or a deleted record, it's simple and no problem at all. But if a record exists in both databases, I need to compare the values to detect changes, and this part becomes challenging.
I know how to do this in the slowest and heaviest way (fetch each record and compare).
My question is: how would you suggest making this work in a smart and fast way?
Thanks a lot.
As long as the mysql user has select rights on both databases, you can qualify the database in the query like so:
SELECT * FROM `db1`.`table1`;
SELECT * FROM `db2`.`table1`;
It doesn't matter which database was selected when you connected from PHP; the correct database will be used in the query.
The backticks are optional when the database/table name is only alphanumeric and not an SQL keyword.
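If both databases live on the same MySQL server, the qualified names also let you find differing rows in a single query. A rough sketch, assuming an id primary key and a couple of comparable columns (the column names are made up):
<?php
// Minimal sketch: list ids of rows whose values differ between the two databases.
$pdo = new PDO('mysql:host=localhost', 'user', 'pass');

$sql = "SELECT a.id
        FROM `db1`.`table1` AS a
        JOIN `db2`.`table1` AS b ON b.id = a.id
        WHERE a.name <> b.name OR a.price <> b.price";

$changedIds = $pdo->query($sql)->fetchAll(PDO::FETCH_COLUMN);
// New or deleted rows can be found the same way with a LEFT JOIN ... WHERE b.id IS NULL.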
Depending on the response time of the 'slave' database, there are two options which don't increase the overhead too much:
If you can combine both databases within the same database by prefixing one or both of the tables, you can use FOREIGN KEYS to let the database do the tough work for you.
Use the TIMESTAMP-field which you can set to update itself by the DB whenever the row gets updated.
Option 1 would be my best guess, but that might mean a physical change to the running system, and if FOREIGN KEYS are new for you, you might wanna test since they can be a real PITA (IMHO).
Option 2 is easier to implement (a sketch follows below), but you still have to detect deleted rows manually.
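A rough sketch of option 2, assuming you are free to add a column to the table (names are placeholders):
<?php
// Minimal sketch: an auto-updating TIMESTAMP column plus a delta query per sync run.
$pdo = new PDO('mysql:host=localhost;dbname=db1', 'user', 'pass');

// One-time schema change: MySQL keeps `updated_at` current on every UPDATE.
$pdo->exec("ALTER TABLE table1
            ADD COLUMN updated_at TIMESTAMP NOT NULL
            DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP");

// On each sync run, fetch only the rows changed since the previous run.
$stmt = $pdo->prepare("SELECT * FROM table1 WHERE updated_at > :last_sync");
$stmt->execute([':last_sync' => '2017-01-01 00:00:00']); // persist the real value between runs
$changedRows = $stmt->fetchAll(PDO::FETCH_ASSOC);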

PHP + MySQL cascading searches from linked query results

Apologies in advance for the wall of text; I'm not sure if this is possible, but I thought I'd ask. I've looked online and can't quite find what I want. I have been learning a lot of PHP and MySQL and am at the stage where I am starting to write my own database-driven websites. A freely available database I have been practicing with is the eve_sdd_crucible_11 database, available from the game website. I have been using it because it's huge and requires a lot of different skills to get the most out of it.
I would like to build a simple database for exploration. It queries the main database for information, creates a new table based on the search results, and also allows the user to add their own comments on what they have found. I have the various queries ready to go, but because I don't want a massive user interface, I want to keep the user side as clean as possible. The app needs to query and display results from the 'mapregions', 'mapconstellations', 'mapsolarsystems' and 'mapdenormalize' tables and insert this information into a new database with the retrieved info plus a comment box for each entity.
Now that the preamble is done, this is what I am looking to do:
Query 'mapregions' (region name returns region ID to be used in next query) and display results (linked)->
click on linked result, query 'mapconstellations' (constellation name returns constellation ID to be used in next query) and display results (linked)->
click on linked result, query 'mapsolarsystems' (solar system name returns solar system ID to be used in next query) and display results (linked) ->
click on a linked result, query 'mapdenormalize' and display all entities in that system -> insert the content into the new database along with a comment box per listed entity.
Like I said earlier in the post, I have the queries set up ready to go and I have the beginnings of the PHP for the page, but I am stuck on how to link the displayed results to the next query in the chain. All results have to display the name of the entity, and it's the entity's corresponding ID number that is used to execute the next query in the chain.
Not sure if I've explained it particularly well, but it's the best I can do at this time of night... Any help or pointers would be vastly appreciated, as it's starting to do my head in ;)
You need to look into joins - from reading through that text, it seems like you're missing a basic understanding of how to join tables. Your 4th paragraph, to me, sounds like it should be a single query.
Creating a new database and/or table per search shows that you might still be missing some of the fundamentals - as that approach would never scale and would be a nightmare to manage.
Start reading up on mysql joins: Mysql Joins and go from there, looking at other examples of how joins work and real world examples - that will hugely affect how you continue building this.
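As an illustration, once the user has clicked through to a solar system, a single JOIN can return the region, constellation and every entity in that system in one go. A rough sketch, assuming the usual EVE SDE column names (regionID, constellationID, solarSystemID, itemName - treat these as assumptions):
<?php
// Minimal sketch: one query instead of a chain of per-table lookups.
$pdo = new PDO('mysql:host=localhost;dbname=eve_sdd_crucible_11', 'user', 'pass');

$sql = "SELECT r.regionName, c.constellationName, s.solarSystemName,
               d.itemID, d.itemName
        FROM mapsolarsystems   AS s
        JOIN mapconstellations AS c ON c.constellationID = s.constellationID
        JOIN mapregions        AS r ON r.regionID        = s.regionID
        JOIN mapdenormalize    AS d ON d.solarSystemID   = s.solarSystemID
        WHERE s.solarSystemID = :id";

$stmt = $pdo->prepare($sql);
$stmt->execute([':id' => (int)($_GET['systemID'] ?? 0)]); // id passed along in the result link
$entities = $stmt->fetchAll(PDO::FETCH_ASSOC);
Linking the results is then just a matter of printing each name as a link (for example href="?systemID=...") whose ID feeds the next query in the chain.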

insert update only modified fields over 2 servers mysql

I have 2 SQL servers in 2 different locations.
One is a web server and the other a CRM system.
People register and make updates on the web; when they make changes, I need to insert or update those changes on my CRM server.
I have a view on the web server which I can select from, but I need to do an insert-on-duplicate-update that only touches the fields that changed, and then show in a description which fields were updated.
I have no clue how to start.
You cannot determine which fields differ after you have already changed them.
You can however select and store the contents prior to the update and then compare it with the new contents.
The question then becomes: do you need the differences per column?
If yes: pre-select and compute the difference yourself (in the application).
If no: use the method described by #Ogelami (and accept his answer :)
On a side note: the pre-select approach won't work as well once you start using several MySQL servers, since you might run into issues with drifting data (i.e. one server being behind on inserted data). When that happens, the method gets a bit more complex.
Perhaps something like this?
INSERT INTO `table` (id, field) VALUES (1, 'value') ON DUPLICATE KEY UPDATE field = VALUES(field)
and you might want to look into this to see if there are Affected rows.
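A rough PHP sketch of that, assuming PDO and made-up table/column names. With ON DUPLICATE KEY UPDATE, MySQL reports 1 affected row for a fresh insert, 2 for an update that changed something, and 0 when the incoming values were identical, which is how you can tell whether anything actually changed:
<?php
// Minimal sketch: upsert a row and inspect the affected-rows count.
$pdo = new PDO('mysql:host=localhost;dbname=crm', 'user', 'pass');

$stmt = $pdo->prepare(
    "INSERT INTO customers (id, email, phone)
     VALUES (:id, :email, :phone)
     ON DUPLICATE KEY UPDATE email = VALUES(email), phone = VALUES(phone)");
$stmt->execute([':id' => 1, ':email' => 'new@example.com', ':phone' => '555-0100']);

switch ($stmt->rowCount()) {
    case 1:  $status = 'inserted';  break;
    case 2:  $status = 'updated';   break;
    default: $status = 'unchanged'; break;
}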

Display latest search results from MySQL with PHP

Google unfortunately didn't seem to have the answers I wanted. I currently own a small search engine website for specific content using PHP GET.
I want to add a latest searches page, meaning to have each search recorded, saved, and then displayed on another page, with the "most searched" at the top, or even the "latest search" at the top.
In short: Store my latest searches in a MySQL database (or anything that'll work), and display them on a page afterwards.
I'm guessing this would best be accomplished with MySQL, and then I'd like to output it in to PHP.
Any help is greatly appreciated.
Recent searches could be abused easily. All someone has to do is go onto your site and search for "your site sucks" or worse, and they've essentially defaced your site. I'd think twice about adding that feature.
In terms of building the most popular searches and scaling it nicely I'd recommend:
Log queries somewhere. Could be a MySQL db table but a logfile would be more sensible as it's a log.
Run a script/job periodically to extract/group data from the log
Have that periodic script/job populate some table with the most popular searches (a sketch follows below)
I like this approach because:
A backend script does all of the hard work - there's no GROUP BY, etc made by user requests
You can introduce filtering or any other logic into the backend script and it doesn't affect user requests
You don't ever need to put big volumes of data into the database
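A rough sketch of the periodic job, assuming a plain-text log with one query per line and a popular_searches table (both names are assumptions):
<?php
// Minimal sketch: aggregate the query log and refresh the table the website reads.
$pdo = new PDO('mysql:host=localhost;dbname=search', 'user', 'pass');

// Count how often each query appears in the log.
$counts = [];
$lines  = file('/var/log/search_queries.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $query) {
    $query = strtolower(trim($query));
    $counts[$query] = ($counts[$query] ?? 0) + 1;
}
arsort($counts);

// Replace the contents with the current top 20.
$pdo->beginTransaction();
$pdo->exec("DELETE FROM popular_searches");
$insert = $pdo->prepare("INSERT INTO popular_searches (query, hits) VALUES (:q, :h)");
foreach (array_slice($counts, 0, 20, true) as $query => $hits) {
    $insert->execute([':q' => $query, ':h' => $hits]);
}
$pdo->commit();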
Create a database, then create a table (for example recent_searches) with fields such as query (the query that was searched) and timestamp (a Unix timestamp of when the query was made). Then your MySQL query will be something like:
SELECT * FROM `recent_searches` ORDER BY `timestamp` DESC LIMIT 0, 5
This should return the 5 most recent searches, with the most recent one appearing first.
Create a table (named something like latest_searches) with the fields query, searched_count, and results_count.
Then, after each search (if results_count > 0), check whether that search query already exists in the table and either update the existing row or insert a new one.
And on some page you can just use data from this table.
It's pretty simple.
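A rough sketch of that insert-or-update step, assuming a UNIQUE index on the query column:
<?php
// Minimal sketch: bump the counter if the query was seen before, insert it otherwise.
$pdo = new PDO('mysql:host=localhost;dbname=search', 'user', 'pass');

$query        = trim($_GET['q'] ?? '');   // what the user searched for
$resultsCount = 42;                       // however many results your search returned

if ($resultsCount > 0 && $query !== '') {
    $stmt = $pdo->prepare(
        "INSERT INTO latest_searches (query, searched_count, results_count)
         VALUES (:q, 1, :rc)
         ON DUPLICATE KEY UPDATE
             searched_count = searched_count + 1,
             results_count  = VALUES(results_count)");
    $stmt->execute([':q' => $query, ':rc' => $resultsCount]);
}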
Ok, your question is not entirely clear, but I'm guessing you mean that you want to READ the latest results first.
To achieve this, follow these steps:
When storing the results, use an extra field to hold a DATETIME. So your insert query will look like this (note the backticks - Table and When are reserved words in MySQL):
INSERT INTO `Table` (SearchItem, `When`) VALUES ('$strSearchItem', NOW())
When retrieving, make sure you include an order by like this:
SELECT * FROM `Table` ORDER BY `When` DESC
I hope this is what you meant to do :)
You simply store the link and the name of the link/search in MySQL, and add a timestamp to record when somebody searched for it. Then you pull the rows out of the DB ordered by the timestamp and display them on the website with PHP.
Create a table with three columns: search, link, timestamp.
Then write a PHP script that inserts rows when needed (i.e. when the user actually searches).
Your main page, where you want the searches displayed, simply reads the data back out and puts each entry into a link container $nameOfWebsite.
It's probably best to use a for/while loop to do step 3.
You could additionally add something like a counter to track which searches are the most popular; this would be another field in MySQL that you keep updating (increasing it by one, but limited to one increment per IP).

How to store search criteria or search results?

I have a PHP classifieds website (mostly) and I am currently using MySQL as the database.
Later on I will use Solr, or maybe Sphinx, as the "search engine".
I want to make it possible for users to view "results" of searches they have made before, but I don't know where to start...
How is this done?
Currently I have a form which is filled in and, when submitted, the PHP just checks against a MySQL table to see if there are any matches.
Should I store the search criteria and run a new search every time a user clicks one of their previous searches, or should I store the results? I would prefer to run a new search, because new items may have been inserted since the last search!
If you need more input, just let me know and I will update this Q.
Thanks
Well... if you're basically talking about "saved searches", I'm doing something similar at the moment: I just have a separate table where....
saved_search_id (primary) | user_id (foreign) | search_name | criteria1 | criteria2 | criteria3 ... etc
So basically I can now display to the user a list of the saved searches they've created, and the table stores the criteria that were part of each search. I can then use those saved criteria to re-run a saved search at any time.
Does that help?
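A rough sketch of that layout with PDO; the criteria columns (category, min_price, max_price) and the ads table are assumptions standing in for whatever your classifieds schema uses:
<?php
// Minimal sketch: save the criteria, then re-run them against the live data later.
$pdo = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');

// Save a search for the current user.
$pdo->prepare("INSERT INTO saved_searches (user_id, search_name, category, min_price, max_price)
               VALUES (:uid, :name, :cat, :min, :max)")
    ->execute([':uid' => 7, ':name' => 'Cheap bikes', ':cat' => 'bikes', ':min' => 0, ':max' => 200]);

// Later: the user clicks the saved search, so we re-run it and new items show up too.
$saved = $pdo->query("SELECT * FROM saved_searches WHERE saved_search_id = 1")
             ->fetch(PDO::FETCH_ASSOC);

$stmt = $pdo->prepare("SELECT * FROM ads
                       WHERE category = :cat AND price BETWEEN :min AND :max");
$stmt->execute([
    ':cat' => $saved['category'],
    ':min' => $saved['min_price'],
    ':max' => $saved['max_price'],
]);
$results = $stmt->fetchAll(PDO::FETCH_ASSOC);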
Use query-string parameters ($_GET) for the search form. Then the user can bookmark the search. If you want, you could create a bookmarking feature in your application, but there really is no need.
If you are concerned about performance, make sure that your database's cache settings are tuned correctly and that you don't write too often to the table. MySQL will then do a good job of caching.
You already have said it: if users should see the new results of old queries, you'll have to store the search parameters somehow and re-do the search when a users requests it.
Store the search criteria. This is quite obvious: if you stored the results, users would get stale data as soon as the underlying data changes. Also consider the space the stored results would take up after a while :)
I would also store the actual search criteria rather than the generated query. If you change the database schema, the stored criteria will still work once you update the query-generation engine, whereas you would most likely forget to update every stored query.
