So I am starting to build a new app that will be software as a service (SaaS). I have been looking at MongoDB, but after reading some posts I get the impression that it's not stable: that you have to run the repair command a lot and it's easy to lose data.
So with the release of 2.0, are these issues still around?
To note: the app is not a forum, but it will use the database for similar sorts of things, like users, posts, and other kinds of information. Is NoSQL the right type of DB for me, or should I just go with MySQL? The app is also coded in PHP.
I think you should write a simple "hello world" application (a blog with user registration) with Mongo.
It's very different from a relational DB, so you will see the difference for yourself.
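For instance, a minimal sketch of what the data layer of such a blog might look like, using the original PHP Mongo driver that was current around the 2.0 era (the database, collection, and field names are just placeholders):

    <?php
    // A minimal sketch of a blog data layer with the original PHP Mongo driver.
    // Database/collection/field names ('blog', 'users', 'posts') are placeholders.
    $m  = new Mongo('mongodb://localhost:27017');
    $db = $m->blog;

    // Register a user - no schema to define up front.
    $db->users->insert(array(
        'username' => 'alice',
        'email'    => 'alice@example.com',
        'joined'   => new MongoDate(),
    ));

    // Create a post; comments are embedded in the post document
    // instead of living in a separate joined table.
    $db->posts->insert(array(
        'author'   => 'alice',
        'title'    => 'Hello world',
        'body'     => 'My first post.',
        'comments' => array(
            array('author' => 'bob', 'text' => 'Nice post!'),
        ),
    ));

    // Fetch a post together with its comments in a single query - no JOIN.
    $post = $db->posts->findOne(array('author' => 'alice'));
    echo $post['title'], ': ', count($post['comments']), " comment(s)\n";

The embedded-comments document is exactly the kind of thing that feels alien coming from MySQL, which is why a toy app like this is such a good test.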
I'm having a hard time implementing a search feature for a web-based system I'm working on. I first used MySQL LIKE with %wildcards%, but it doesn't find what I want to display. Then I came upon full-text index search, which searches very well but has an issue displaying joined multiple tables with foreign keys, which I don't know a workaround for. Then I came across MySQL with Sphinx.
May I ask for advice on the best way/technologies to implement a search feature over complex database tables?
Check out the Apache Solr search server: Apache Solr official website. This technology will solve all of your search-related problems.
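As a rough illustration, a query against Solr's HTTP/JSON API from PHP might look like the sketch below (assuming a Solr core named 'posts' on a default localhost:8983 install):

    <?php
    // A rough sketch of querying Solr's HTTP/JSON API from PHP.
    // Assumes a Solr core named 'posts' on a default localhost:8983 install.
    $url = 'http://localhost:8983/solr/posts/select?' . http_build_query(array(
        'q'    => 'database search',  // the search terms
        'rows' => 10,                 // maximum results to return
        'wt'   => 'json',             // response format
    ));

    $response = json_decode(file_get_contents($url), true);

    echo 'Found ', $response['response']['numFound'], " documents\n";
    foreach ($response['response']['docs'] as $doc) {
        echo $doc['id'], "\n";
    }

Solr takes care of tokenizing, ranking, and denormalizing the data you feed it at index time, which is what makes the cross-table search problem go away.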
I guess the general answer here is that you want a 'search index' - an index specifically for running searches: a repository that has all the required data to answer queries.
An RDBMS (like MySQL) is very good at normalizing data, setting it up in a compact and easy-to-update format (i.e. minimising duplication) - that's great for storage. But queries suffer, as they have to do much more work to 'join' all the required data back together.
...but for searching, a denormalized structure may be best (bigger, but simpler, and therefore quicker to 'search').
There are many ways of doing that:
1. A materialized view, as noted in your other thread (php mysql full text search multiple table joined by id) - this keeps it all in MySQL.
2. Using an external application. There are many examples: Lucene (variants include Solr and ElasticSearch), SphinxSearch, and many more. These generally work in a similar way, setting up a dedicated copy of the data to make queries easier.
3. Using an external provider. There are many 'search as a service' systems (basically wrappers around the software mentioned in the previous point).
4. Building your own! It's possible to build a system yourself using just normal MySQL tables; an implementation of an inverted index will probably be the easiest (see the sketch after this list).
Which you use is down to personal preference (e.g. an external app is more work to set up, but is more powerful overall).
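For option 4, a minimal sketch of an inverted index built on plain MySQL tables (table and column names here are purely illustrative):

    <?php
    // A minimal sketch of an inverted index on plain MySQL tables, via PDO.
    // Table and column names are purely illustrative.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // One table of unique words, one table linking words to documents.
    $pdo->exec('CREATE TABLE IF NOT EXISTS words (
                    word_id INT AUTO_INCREMENT PRIMARY KEY,
                    word    VARCHAR(64) NOT NULL UNIQUE)');
    $pdo->exec('CREATE TABLE IF NOT EXISTS word_docs (
                    word_id INT NOT NULL,
                    doc_id  INT NOT NULL,
                    PRIMARY KEY (word_id, doc_id))');

    // Indexing: split a document into words, record each word -> doc link.
    function indexDocument(PDO $pdo, $docId, $text) {
        $addWord = $pdo->prepare('INSERT IGNORE INTO words (word) VALUES (?)');
        $addLink = $pdo->prepare('INSERT IGNORE INTO word_docs (word_id, doc_id)
                                  SELECT word_id, ? FROM words WHERE word = ?');
        foreach (preg_split('/\W+/', strtolower($text), -1, PREG_SPLIT_NO_EMPTY) as $w) {
            $addWord->execute(array($w));
            $addLink->execute(array($docId, $w));
        }
    }

    // Searching: documents matching the most query words rank first.
    function search(PDO $pdo, $query) {
        $words = preg_split('/\W+/', strtolower($query), -1, PREG_SPLIT_NO_EMPTY);
        $in    = implode(',', array_fill(0, count($words), '?'));
        $stmt  = $pdo->prepare("SELECT wd.doc_id, COUNT(*) AS hits
                                FROM word_docs wd
                                JOIN words w ON w.word_id = wd.word_id
                                WHERE w.word IN ($in)
                                GROUP BY wd.doc_id
                                ORDER BY hits DESC");
        $stmt->execute($words);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }

    indexDocument($pdo, 1, 'How to search complex database tables');
    print_r(search($pdo, 'database search'));

A real version would also want stemming, stop words, and incremental re-indexing - which is exactly the work the external tools above already do for you.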
I'm starting to build a system for working with native languages, tags, and similar data in the Yii framework.
I have already chosen MongoDB for storing my data, as I think it fits nicely and will give better performance at lower cost (the database will hold huge amounts of data).
My question regards user authentication, payments, etc. These are sensitive bits of information, and areas where I think the data is relational.
So:
1. Would you use two different DB systems? Do I really need them, or am I overcomplicating this?
2. If you recommend the two-DB approach, how would I achieve that in Yii?
Thanks for your time!
PS: I do not intend this question to become another endless discussion between the relational and non-relational folks. Having said that, I think my data fits Mongo, but if you have something to say about that, go ahead ;)
You might be interested in this presentation on OpenSky's infrastructure, where MongoDB is used alongside MySQL. Mongo was utilized mainly for CMS-type data where a flexible schema was useful, and they relied upon MySQL for transactions (e.g. customer orders, payments). If you end up using the Doctrine library, you'll find that the ORM (for SQL databases) and MongoDB ODM share a similar API, which should make the experimentation process easier.
I wouldn't shy away from using MongoDB to store user data, though, as that's often a record that can benefit from embedded document storage (e.g. storing multiple billing/shipping addresses within a single user document). If anything, Mongo should be flexible enough to enable you to develop your application without worrying about schema changes due to evolving product requirements. As those requirements become more clear, you'll be able to make a decision based on the app's performance needs and types of database queries you end up needing.
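To give a feel for that similarity, here's a rough sketch (assuming a mapped User class and already-configured Doctrine managers; none of that setup is shown):

    <?php
    // A rough sketch of how alike Doctrine's ORM and MongoDB ODM are in use.
    // Assumes a mapped User class, a configured ORM EntityManager ($em),
    // and a configured ODM DocumentManager ($dm).

    $user = new User();
    $user->setUsername('alice');

    // Relational (MySQL) via the ORM:
    $em->persist($user);
    $em->flush();

    // MongoDB via the ODM - the same persist/flush API:
    $dm->persist($user);
    $dm->flush();

    // Querying looks the same too:
    $fromSql   = $em->getRepository('User')->findOneBy(array('username' => 'alice'));
    $fromMongo = $dm->getRepository('User')->findOneBy(array('username' => 'alice'));

Because the two APIs mirror each other, moving a model from one store to the other mostly comes down to changing its mapping metadata.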
There is no harm in using multiple databases (if you really need them); many big websites use multiple databases. So go ahead and start your project.
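To make the second part of the question concrete: in Yii 1.x, the two stores can simply live side by side as application components. Below is one possible wiring; MongoConnection is a hypothetical helper component (extensions such as YiiMongoDbSuite offer ready-made equivalents), and all names and credentials are placeholders.

    <?php
    // protected/config/main.php - a sketch of MySQL and MongoDB side by side.
    return array(
        // ...
        'components' => array(
            // Relational DB for the sensitive, transactional bits
            // (authentication, payments) - used by ActiveRecord as usual.
            'db' => array(
                'class'            => 'CDbConnection',
                'connectionString' => 'mysql:host=localhost;dbname=app',
                'username'         => 'app',
                'password'         => 'secret',
            ),
            // Mongo for the bulky, flexible-schema data.
            'mongo' => array(
                'class' => 'application.components.MongoConnection',
            ),
        ),
    );

    <?php
    // protected/components/MongoConnection.php - hypothetical thin wrapper
    // around the PHP Mongo driver, exposed as Yii::app()->mongo->db.
    class MongoConnection extends CApplicationComponent
    {
        public $server = 'mongodb://localhost:27017';
        public $dbName = 'app';
        private $_db;

        public function getDb()
        {
            if ($this->_db === null) {
                $mongo     = new Mongo($this->server);
                $this->_db = $mongo->selectDB($this->dbName);
            }
            return $this->_db;
        }
    }

Your ActiveRecord models keep using the 'db' component untouched, while Mongo-backed data is reached through Yii::app()->mongo->db->collectionName.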
I am using a commercial PHP web application that stores information in a MySQL database, and I find myself needing to create some custom reports on that information, ideally presented via the web with the ability to export the reports to PDF or some other external format.
I could just slap together some PHP to query the DB and then show the results of SQL queries against it, but I was hoping there might be a more intelligent framework I could employ to generate these reports faster and more easily, now and in the future. CodeIgniter looks like it may be a good starting point, but I'm not in love with it. What do people use when they need to work with an existing SQL DB but don't want to roll it all from scratch?
edit - I know PHP/Python/Ruby well enough to get by, but I'm rusty, so starting from scratch would make the process longer than it probably needs to be. I'm looking to leverage quality frameworks, if they exist, to get the best results in the long run.
I would recommend Django. It has a management command, inspectdb, that can automatically generate models from an existing database (run as python manage.py inspectdb). You could leverage that to get going quickly and start using Django's powerful ORM to build your reports.
I'm trying to convert the site www.mircscripts.org to Drupal, and the more I research migration, the more confused I get.
There are various modules that supposedly help with the migration process, but they are nothing but confusing, and they all either (a) lack documentation (listen up, Migrate module!) or (b) only support Drupal 6.
With the Migrate module, you seem to have to dive into PHP code and create 'mappings' from your old table data to Drupal. First off, I would ideally like a GUI (Table Wizard only supports Drupal 6, it seems, and the superseded Data module is Drupal 6 only too). I also want to import data into Drupal rather than use 'mappings': I want to be able to disable the Migrate module once all the data has been 'converted' to be Drupal-node compatible.
If you take a look at the site above, you can pretty much see the scale of the data (forum, comments, etc.) and get an idea of how the database tables look - just your usual stuff (users, comments) plus more custom things like 'files', which stores all the different scripts uploaded by users.
Any suggestions on how I would go about converting the site?
Cheers
Gary
edit: I forgot to mention the site is almost entirely custom made. The code for it looks like my nan on her way to bingo - a complete mess. There is an interesting bit of code at http://drupal.org/node/261066 if you scroll down, although I don't feel like calling node_save() 60,000 times, once for every record in just one table. It sounds evil.
The Migrate module comes with an example migration module and some documentation to facilitate content migrations. Dozens of sites have used it for their migrations. It is not point-and-click GUI work, but it is very flexible, and migrations can be tested repeatedly.
I've written my own modules to do custom migrations before this existed, and it's not as bad as it sounds. You just need to know how the schemas relate and map them out on paper, then pseudocode and test.
For www.dogfish.com, I migrated some 12,000 nodes. For a band site I am trying to relaunch, I have migrated 75,000+ nodes. In both cases I used a DB connection and cron to fetch the next X results. I could also have used the Batch API, but that seemed slower.
The Migrate module is the way to go.
Take a look at Feeds. It's a GUI importer that allows you to map fields in a file to objects such as nodes.
http://drupal.org/project/feeds
There's nothing evil about calling node_save() over all the rows in your table. That's essentially what any import module does anyway. I worked on a huge Drupal 5 project where we imported millions of nodes that way. It works fine.
Some people have recommended saving to the database directly. Never do that. It's just not the right way to create nodes. Use the Drupal API!
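For what it's worth, a stripped-down sketch of that kind of node_save() import loop in Drupal 7 (the legacy connection details and table/column names are made up for illustration):

    <?php
    // A stripped-down sketch of a batched node_save() import for Drupal 7.
    // The legacy credentials and table/column names are made up.
    $legacy = new PDO('mysql:host=localhost;dbname=legacy', 'user', 'pass');

    // Fetch rows in batches so each cron run (or Batch API pass) stays small;
    // real code would track the last imported ID between runs.
    $rows = $legacy->query('SELECT id, title, body FROM files LIMIT 500')
                   ->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        $node = new stdClass();
        $node->type     = 'script';      // a content type you have defined
        $node->language = LANGUAGE_NONE;
        node_object_prepare($node);      // fill in sensible node defaults

        $node->title = $row['title'];
        $node->body[LANGUAGE_NONE][0]['value'] = $row['body'];

        node_save($node);                // create the node through the API
    }

Run from a cron hook or a drush script, 60,000 records is a perfectly manageable import this way.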
I have spent years on Drupal, and I did not see any full-fledged module for migration. It's better to start coding; it will save time, and that's the smart idea :)
I'm creating a new web application (Rails 3 beta), pieces of which will access data from a legacy MySQL database that a current PHP application is using.
I do not wish to modify the legacy DB schema; I just want to be able to read from and write to it, while the Rails application also has its own database, using ActiveRecord for the newer stuff. I'm using MySQL for the Rails app, so I have the adapter installed.
What is the best way to do this? For example, I want contacts to come from the old database. Should I create a contacts controller and manually write SQL to get the variables for the views?
Or should I create a Contact model and define attributes that match the fields in the database? Would I then be able to use something like Contact.mail_address to have it run "SELECT mailaddr FROM contacts WHERE id = Contact.id"?
Sorry, I've never done much in Rails outside of the standard stuff that is well documented, so I'm not sure what the best approach would be. Ideally, I want the contacts presented to my Rails application as natively as possible, so that I can expose them RESTfully for API access.
Any suggestions and code examples would be much appreciated.
This really depends on how esoteric your legacy DB is, and it affects the solution considerably. If your legacy DB is quite close to Rails conventions, then using a model with a few customizations will probably prove to be the best approach. However, I've heard of people who wrote a script that constantly re-imported data from the legacy DB into a new one - the whole structure of the old DB was so wrong that this approach was worth it.