As the title of the question suggests, my question is simple: which one is better in terms of performance, knowing that I'm on Linux shared hosting (SiteGround)? I'm capable of coding both, and I actually coded one that updates the DB, but from reading around, some people suggested inserting rather than updating. Any feedback is much appreciated.
thank you.
Use a database! Since you will have multiple people accessing your site, writing to one file will either mean blocking or having the count overwritten.
By using a database and inserting, you don't have to wait for other clients and you safely allow concurrent access. You just get the count with SELECT COUNT(*) FROM countTbl.
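For illustration, a minimal sketch of that insert-then-count approach, assuming a PDO connection in $pdo and a countTbl table (the column names here are placeholders):

    <?php
    // Record one row per hit, then count rows to get the total.
    // Assumed schema:
    //   CREATE TABLE countTbl (id INT AUTO_INCREMENT PRIMARY KEY,
    //                          hit_at DATETIME NOT NULL);
    $pdo->prepare('INSERT INTO countTbl (hit_at) VALUES (NOW())')->execute();

    $count = (int) $pdo->query('SELECT COUNT(*) FROM countTbl')->fetchColumn();
    echo "Visits so far: $count";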
What are you storing in the database? If it's just that one number (the page counter), I would not use a database, but if you are storing data for each visitor, a database is the way to go.
I am currently developing a PHP website, and since the website will be used by many people, I just want to know if there will be a problem if there are multiple database accesses at the same time from those different users, and if so, how to go about it. Thanks in advance.
SIMPLE ANSWER: As long as your code is well designed, No.
Elaborating: in a MySQL server, databases are made to work very efficiently and to handle a large set of tasks. These tasks include the constant querying of tables inside separate databases, with statements that SELECT data, UPDATE data, INSERT rows, DELETE rows, etc.
There are some corner cases that can happen however. Imagine if two people are registering on your website for the first time, and both of them want to register the username Awesomesauce. Programmers often code algorithms that first check if the current username exists, and if it doesn't, INSERT a new row in the users table with the new username and all the other relevant info (password, address, etc). If both users were to click the Register button at the same time, and if your code was badly designed, what could happen is two rows could be created with the same username, in which case you would have a problem.
Luckily, MySQL has features to prevent such corner cases. A UNIQUE INDEX could be implemented on the username column, forcing the database to reject one of the two users who tried to register the name at the exact same time.
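As a rough sketch of what that could look like (the table and column names are assumptions, and $pdo is a PDO connection with exceptions enabled):

    <?php
    // One-time schema change: make the username column unique.
    //   ALTER TABLE users ADD UNIQUE (username);
    try {
        $stmt = $pdo->prepare('INSERT INTO users (username, password_hash) VALUES (?, ?)');
        $stmt->execute([$username, $passwordHash]);
    } catch (PDOException $e) {
        if ($e->getCode() === '23000') {   // SQLSTATE 23000: integrity constraint violation
            echo 'That username was just taken, please choose another.';
        } else {
            throw $e;
        }
    }

Whichever of the two concurrent inserts arrives second simply fails with a duplicate-key error instead of creating a second Awesomesauce.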
All in all, if your code is well designed, you shouldn't have a problem.
It all depends on how much traffic, how large your site's database is and a host of other factors.
But for starters, I'd say there's really nothing to worry about.
I think you should go with MySQL since you are just starting out with PHP, but you can use pretty much anything with PHP's PDO: http://php.net/manual/en/book.pdo.php. There is a lot of online support for MySQL with PHP, so I would start there.
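A minimal PDO connection sketch; the host, database name, and credentials are placeholders:

    <?php
    $pdo = new PDO(
        'mysql:host=localhost;dbname=mydb;charset=utf8mb4',  // placeholder DSN
        'db_user',                                           // placeholder user
        'db_password',                                       // placeholder password
        [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
    );

    $stmt = $pdo->prepare('SELECT id, username FROM users WHERE id = ?');
    $stmt->execute([1]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);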
I would suggest making multiple tables in the same DB rather than multiple DBs, though there won't be any problem even if there are multiple DB accesses at the same time.
Refer to the following link to see how it's done:
How do you connect to multiple MySQL databases on a single webpage?
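One common way this is done is simply opening one connection per database; a rough sketch, with DSNs, credentials, and table names all being placeholders:

    <?php
    // Two independent PDO connections on the same page.
    $db1 = new PDO('mysql:host=localhost;dbname=app_main', 'user', 'pass');
    $db2 = new PDO('mysql:host=localhost;dbname=app_logs', 'user', 'pass');

    $users = $db1->query('SELECT COUNT(*) FROM users')->fetchColumn();
    $hits  = $db2->query('SELECT COUNT(*) FROM hits')->fetchColumn();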
While your question is way too broad, if you want horizontal scaling (adding more servers) look at a PHP/NoSQL solution. Otherwise, something like PHP/MySQL will be fine.
A bit of reading for you here: Difference between scaling horizontally and vertically for databases
I am creating a web-based app for Android and I have come to the point of the account system. Previously I stored all data for a person inside a text file, located at users/<name>.txt. Now that I'm thinking about doing it in a database (like you probably should), wouldn't that take longer to load, since it has to look for the row where the name is equal to the input?
So, my question is: is it faster to read data from a text file, which is easy to open because it knows its location, or would it be faster to get the information from a database, although it would have to first scan line by line until it reaches the one with the correct name?
I don't care about safety; I know the first option is not safe at all. It doesn't really matter in this case.
Thanks,
Merijn
In any question about performance, the first answer is usually: Try it out and see.
In your case, you are reading a file line-by-line to find a particular name. If you have only a few names, then the file is probably faster. With more lines, you could be reading for a while.
A database can optimize this using an index. Do note that the index will not have much effect until you have a fair amount of data (tens of thousands of bytes). The reason is that the database reads the records in units called data pages. So, it doesn't read one record at a time, it reads a page's worth of records. If you have hundreds of thousands of names, a database will be faster.
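For example, the lookup described above would typically be backed by an index on the name column (the table and column names here are assumptions):

    <?php
    // One-time: CREATE INDEX idx_users_name ON users (name);
    // With the index in place, this lookup reads only a few data pages
    // instead of scanning the whole table.
    $stmt = $pdo->prepare('SELECT * FROM users WHERE name = ?');
    $stmt->execute([$name]);
    $user = $stmt->fetch(PDO::FETCH_ASSOC);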
Perhaps the main performance advantage of a database is that after the first time you read the data, it will reside in the page cache. Subsequent access will use the cache and just read it from memory -- automatically, I might add, with no effort on your part.
The real advantage of a database is that it then gives you the flexibility to easily add more data, to log interactions, and to store other types of data that might be relevant to your application. On the narrow question of just searching for a particular name, if you have at most a few dozen, the file is probably fast enough. The database is more useful for a large volume of data and because it gives you additional capabilities.
A bit of googling came up with this question: https://dba.stackexchange.com/questions/23124/whats-better-faster-mysql-or-filesystem
I think the answer suits this one as well.
The file system is useful if you are looking for a particular file, as operating systems maintain a sort of index. However, the contents of a txt file won't be indexed, which is one of the main advantages of a database. Another is understanding the relational model, so that data doesn't need to be repeated over and over. Another is understanding types. If you have a txt file, you'll need to parse numbers, dates, etc.
So - the file system might work for you in some cases, but certainly not all.
That's where database indexes come in.
You may wish to take a look at How does database indexing work? :)
It is quite a simple solution: use a database.
Not because it's faster or slower, but because it has mechanisms to prevent data loss or corruption.
A failed write to the text file can happen, and you will lose a user's profile info.
With a database engine, it's much more difficult to lose data like that.
EDIT:
Also, a big question: is this about the server side or the app side?
Because, for the app side, realistically you won't have more than 100 users per smartphone... More likely you will have 1-5 users who share the phone and thus need their own profiles, and for the majority, you will have a single user.
I need to create a visitor counter for my websites and I'm wondering if it is better to store and read the information from a txt file located somewhere in my host or directly from the database.
Using a database would mean that a DB entry will be created for every single visitor that will access the site and honestly I don't think that would be OK.
File counter - when you just need a count.
DB counter - when you need visit tracking, dependencies, analysis, aggregation.
Reading a file is really fast when the file is small. Still, there may be a race condition when the site is heavily loaded, and it is hard to work with linked data if needed. For those needs there is a great solution: database management systems.
A database (with good design) lets you avoid race conditions. It's also a better solution for large amounts of linked data structures. It's better when you need to log visits, referrers, etc...
DB suggestions: you might store the counter in one row of a global_settings table and update it on each page visit, or you might get it by registering each visit in a visit table (with additional data, like IP, DateTime, UserID, etc...) and using SELECT COUNT(*) FROM visit;.
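A rough sketch of both suggestions, assuming a PDO connection in $pdo (the table and column names are placeholders):

    <?php
    // 1) Single-row counter kept in a global_settings table:
    $pdo->exec('UPDATE global_settings SET visit_count = visit_count + 1');

    // 2) One row per visit, counted on demand:
    $stmt = $pdo->prepare('INSERT INTO visit (ip, visited_at) VALUES (?, NOW())');
    $stmt->execute([$_SERVER['REMOTE_ADDR']]);
    $total = (int) $pdo->query('SELECT COUNT(*) FROM visit')->fetchColumn();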
There is another related topic here.
Loading anything from text files is pretty bad practice. Using a database is the better solution. Databases are meant to store large amounts of data, so it is perfectly acceptable.
I'm working with a programmer who doesn't want me touching his database...
I would like to work with a database instead of hard-coding my content, but I don't want the site's performance to suffer.
Is it bad practice to query two different databases on one page?
If it's not a problem, is there a limit to how many databases you can query per page?
PS: the site is PHP/MySQL.
me touching his database
That's probably because there is a layered architecture in place and you're not supposed to be talking to the database directly.
Otherwise, if you've already arrived at the division - "my" database, "his" markup - it's a recipe for disaster.
Is it bad practice to query two different databases on one page?
No, if there is a real need to do it. Yes, if only because somebody declared the database their property and you've got to have your own.
No, it's not a problem; in some scenarios it's even a pretty good approach.
It depends on whether the databases are holding related data. If they are related, it makes sense to keep them in one database. The programmer could then give you a user account with limited access so that you can't corrupt other things.
There is some cost to making a new connection, but it will likely be negligible if you are doing a number of queries.
Can you have a separate schema in his DB? If so, then you could save some connection building/destruction time.
Will you be storing data/relational data in the DB? If not, can you get away with include("file.php")
All that being said, it's not a bad practice to have multiple DBs on a page; you just need a good reason to do it.
I routinely hit an estimate database and a reference database - e.g. the customer lives in Texas and the closest office is 150 miles away.
I want to record number of hits to the pages and I'm thinking either plain file-based or SQLite storage.
File-based option
A file contains only an integer, incremented with each page visit, and every page has a unique file name. If I open a file in write mode, I can write to it, but is it possible not to close it, to save the cost of opening and closing the file each time?
SQLite option
A simple table with columns PageName and Hits. Same question again: is it possible not to close it, to save the cost of opening and closing the DB each time?
Google Analytics. Unless you need to do it in-house.
Rolling your own solution in PHP can be as simple or complicated as you like. You can create a table to store the IP address (not always reliable), the page location, and a date. This will allow you to track unique hits for each day. You may want to schedule a task to reduce the amount of records to a simple row of date, numOfUnique.
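A possible shape for that table and the daily-unique aggregation, assuming MySQL via PDO; all names here are placeholders:

    <?php
    // Assumed schema:
    //   CREATE TABLE hits (ip VARCHAR(45), page VARCHAR(255), hit_date DATE);
    // Record a hit:
    $stmt = $pdo->prepare('INSERT INTO hits (ip, page, hit_date) VALUES (?, ?, CURDATE())');
    $stmt->execute([$_SERVER['REMOTE_ADDR'], $_SERVER['REQUEST_URI']]);

    // Reduce to one row per day (date, numOfUnique), e.g. in a scheduled task:
    $daily = $pdo->query(
        'SELECT hit_date, COUNT(DISTINCT ip) AS numOfUnique FROM hits GROUP BY hit_date'
    )->fetchAll(PDO::FETCH_ASSOC);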
Another method is parsing your log files. You could do this every 24 hours or so as well.
If you really have to do this in-house, you should go with the sqlite method. While there's a little overhead (due to opening the database file), there can also be notable benefits in storing structured data.
You could for example add a date field, and then get daily/hourly/monthly/whatever data for each page.
You could also add the IP address for each visitor, and then extract visit data. That way you could easily extract data about your site users' behaviours.
You could also store your visitors' user-agent and OS, so you know what browsers you should target or not.
All in all, inserting that kind of data into a database is trivial. You can learn a lot of things from this data if you take some time to study it. For that reason, databases are usually the way to go, since they're easy to manipulate.
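A minimal SQLite sketch along those lines, using PDO's sqlite driver; the file path, table, and column names are assumptions:

    <?php
    $db = new PDO('sqlite:' . __DIR__ . '/hits.sqlite');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $db->exec('CREATE TABLE IF NOT EXISTS hits (
                   page TEXT, ip TEXT, user_agent TEXT, hit_at TEXT DEFAULT CURRENT_TIMESTAMP)');

    // Record the current hit.
    $stmt = $db->prepare('INSERT INTO hits (page, ip, user_agent) VALUES (?, ?, ?)');
    $stmt->execute([
        $_SERVER['REQUEST_URI'],
        $_SERVER['REMOTE_ADDR'],
        $_SERVER['HTTP_USER_AGENT'] ?? '',
    ]);

    // Hits per page:
    $perPage = $db->query('SELECT page, COUNT(*) AS hits FROM hits GROUP BY page')
                  ->fetchAll(PDO::FETCH_ASSOC);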
It's not possible in either of your cases. PHP applications run when a user requests something from them; they generate the result and then shut down. So even if you don't close the DB connection or the file, they will be closed automatically. But I don't know why opening a DB connection or a file for writing would be a problem.
It's difficult to give particularly useful answers in the absence of how much traffic you're expecting (other than Jonathan Sampson's comment that you might be better off using Google Analytics).
File-based option:
I don't think it's possible to keep the file open. Also, you'll probably bump into concurrent write problems unless you employ some kind of locking mechanism.
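If you do go the file route, here is a hedged sketch of such a locking mechanism using flock(); the counter file path is an assumption:

    <?php
    $file = __DIR__ . '/counter.txt';          // one counter file per page
    $fp = fopen($file, 'c+');                  // create if missing, don't truncate
    if ($fp !== false && flock($fp, LOCK_EX)) {
        $count = (int) stream_get_contents($fp) + 1;  // read current value
        ftruncate($fp, 0);                            // overwrite with new value
        rewind($fp);
        fwrite($fp, (string) $count);
        flock($fp, LOCK_UN);
        fclose($fp);
    }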
SQLite option:
I think this is probably the way to go, if you've not already got a database open. I doubt that opening/closing the db each time will be a bottleneck - try it and profile.