I have developed a news website in a local language (UTF-8) which serves an average of 28k users a day. The site has recently started to show many errors and slow down. I got a call from the host saying that the DB is using almost 150GB of space. I believe that is far too much for the DB and think something is critically wrong, but I cannot understand what it could be. The site is in Drupal and the DB is MySQL (InnoDB). Can anyone give me directions as to what I should do?
UPDATE: It seems the InnoDB data files are what is using the space. What can be done about it? What is the standard procedure for dealing with this issue?
The question does not have enough information for a specific answer. Maybe your code is writing the same data to the DB multiple times, maybe you are logging to a table and the logs have grown very large, or maybe somebody managed to get access to your site/DB and is misusing it.
You need to log in to your database and check which table is taking the most space. Use SHOW TABLE STATUS, which will tell you the size of each table. Then manually check the data in that table to figure out what is wrong.
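For example, the per-table sizes can also be pulled from information_schema (assuming MySQL 5.0+; this is equivalent to reading the Data_length and Index_length columns of SHOW TABLE STATUS, but comes back sorted):

```sql
-- List the ten largest tables by data + index size, in MB
SELECT table_schema, table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
ORDER BY data_length + index_length DESC
LIMIT 10;
```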
Actually, I have been learning PHP for the last couple of months, and I am now at a stage where I can program small things like a simple login page in PHP and MySQL, or a contact form. I have written a lot of code blocks, like inserting something into a database or selecting something from a database, etc. But I always copy-paste my own code blocks from previous projects while working on a new one. So, I want to know whether this tendency is unique to me, or whether every beginner passes through the same phase during their journey to becoming a developer.
Please bear with me, because I know this isn't really a programming question and may not be worth your time. I tried searching on Google as well, but most of the search results dealt with copy-pasting other people's code, which is not what I am talking about. To save time, I copy-paste my own code blocks almost every time. So, how bad is this habit of mine?
I apologize again for posting a question that may not be worth your time, but I am finding it hard to learn to code by myself without any mentor nearby to clear my doubts (I actually searched for a mentor who could teach PHP before starting all by myself, but found none in my area), and as such the Internet is what I mostly depend upon for learning about anything.
This question probably belongs on https://softwareengineering.stackexchange.com but I'll try to give you a decent answer and some guidance.
People re-use their own code all the time. However, you want to avoid copy/paste where possible. The problem with copy/paste shows up when something is used more than a few times - like a MySQL database connection - and it needs updating. I'd rather modify one file (or one small group of files) and have all of my web apps fixed/updated than have to modify 2 or 3 database calls in 9 different web apps...
For things that I use everywhere/all the time - talking to our course management system's API, authenticating a user against our LDAP server, connecting to a MySQL database and running queries, processing forms that are emailed, etc. - I (or coworkers) have built up sets of functions, classes, etc., which I keep in a single directory and can include as needed.
If you do this, you want your functions/object methods to be as generic as possible. For example, my MySQL query function takes several arguments: an associative array with connection info (since we have several DB servers, chosen by purpose), a query, and an array of parameters. It returns an array with a status code and then the appropriate data - the result set for selects, the ID of the last insert for inserts, the count of rows affected for deletes/updates. This one function handles 50+ queries and connects to 4 different MySQL servers.
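A rough sketch of such a generic helper, using Python's sqlite3 standing in for MySQL (the function name, return shape, and table here are illustrative, not the author's actual code):

```python
import sqlite3

def run_query(conn, query, params=()):
    """Generic query helper: returns a dict with a status code and
    the data appropriate to the statement type."""
    try:
        cur = conn.execute(query, params)
        verb = query.lstrip().upper()
        if verb.startswith("SELECT"):
            return {"status": "ok", "rows": cur.fetchall()}
        conn.commit()
        if verb.startswith("INSERT"):
            return {"status": "ok", "last_id": cur.lastrowid}
        # DELETE/UPDATE (and DDL) report the affected-row count
        return {"status": "ok", "rows_affected": cur.rowcount}
    except sqlite3.Error as exc:
        return {"status": "error", "message": str(exc)}

conn = sqlite3.connect(":memory:")
run_query(conn, "CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
ins = run_query(conn, "INSERT INTO t (name) VALUES (?)", ("alice",))
sel = run_query(conn, "SELECT name FROM t")
```

Every caller then goes through one function, so a change to connection handling or error reporting happens in exactly one place.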
Came across a weird issue today.
Have a Drupal site, been live for a few years now, and this is actually the first time I've had to look at anything with it.
I needed to look at one of the tables in the MySQL database, but any SELECT against it throws a 500 Server Error popup - and it does not do this for all tables.
I thought maybe a repair would help, but of course InnoDB doesn't support REPAIR TABLE, so I am at a loss as to what the issue could be, and how I can get into this table to view/manually modify some records.
Can anyone help?
I am making a comment system. The user will log in with their details on the main page, which has been built, but on the second page, where the comments will be, I want to show each comment in order of time created. Would it be better to store the comments in a MySQL database, or to put the comments into files and read them from the files on the server?
XML [aka a 'flat file' database] may seem preferable for simplicity's sake, but you will find that performance degrades ridiculously fast once you get a reasonable amount of traffic. Say you have a separate XML file storing the comments for each page, and you want to display the last 5 comments, newest first. The server has to read the entire file from start to finish just to figure out which 5 comments are the last.
What happens when you have 10,000 comments on something? What happens when you have 100 posts with 10,000 comments each and 1,000 pageviews per second? Basically, you're putting so much I/O load on your disk that everything else will grind to a halt behind the queued I/O.
The point of an RDBMS like MySQL is that the information is indexed as it is put into the database, and that index is held in memory. That way the application doesn't have to re-examine 100% of the data each time a request is made. A MySQL query is written to consult the index and have the system retrieve only the desired information, i.e. the last 5 comments.
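A minimal sketch of that "last 5 comments" case, using Python's sqlite3 for illustration (the table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE comments (id INTEGER PRIMARY KEY, body TEXT, created INTEGER)"
)
# An index on the sort column lets the engine jump straight to the newest rows
conn.execute("CREATE INDEX idx_comments_created ON comments (created)")

conn.executemany(
    "INSERT INTO comments (body, created) VALUES (?, ?)",
    [(f"comment {i}", i) for i in range(10_000)],
)

# Fetch only the 5 newest comments; with the index this does not
# require scanning all 10,000 rows, unlike re-reading a flat file
last5 = conn.execute(
    "SELECT body FROM comments ORDER BY created DESC LIMIT 5"
).fetchall()
```

The flat-file version of the same request has to parse every comment ever written before it can discard all but five of them.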
Trust me - I worked for a hosting provider, and code like this constantly causes problems of the "it worked fine for months and now my site is slow as hell" variety. You will not regret taking a little extra time to do it right the first time, rather than scrambling to redo the application when it grinds to a halt.
Definitely in MySQL. The actions are easy, and traceability is easy.
Unless you want to go the NoSQL route.
Storing the comments in a database will be the better option. You get more power with a database. By "more power", I mean you can easily do things like:
When a user gets deleted, you can decide whether or not to delete their comments
You can show the top 10 (or so) comments
You can show all comments from a user on some other page
Etc.
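The first of those is a one-line schema decision in SQL. A minimal sketch, using sqlite3 for illustration (table and column names are made up; MySQL/InnoDB supports the same ON DELETE CASCADE foreign keys):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this enabled per-connection
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE comments ("
    "  id INTEGER PRIMARY KEY,"
    "  user_id INTEGER REFERENCES users(id) ON DELETE CASCADE,"
    "  body TEXT)"
)
conn.execute("INSERT INTO users (name) VALUES ('alice')")
conn.execute("INSERT INTO comments (user_id, body) VALUES (1, 'hi')")

# Deleting the user removes their comments automatically via the cascade
conn.execute("DELETE FROM users WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM comments").fetchone()[0]
```

Swap CASCADE for SET NULL (or drop the clause) if you would rather keep the comments when the user goes away.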
I would recommend a database; it's a lot easier to access the data that way. XML is always a pain when it comes to laying out the code for pulling the data out. It might be worth getting a more detailed explanation from someone else though, as I've not had that much experience with XML.
Plus you have more control over moderating the comments.
Hope that helps!
MySQL every time!
It's perfect for doing this and very suitable/flexible/powerful.
I have 1 million rows in my MySQL database, and when I export the whole dataset it gets stuck partway through, showing the download box for a long time. Sometimes it exports without any issues, but if I do multiple table exports, a couple of tables may export and the others get stuck. Why is this happening, and what would be a workaround?
Well, I am using phpMyAdmin to export.
It is most likely due to the data size. The web server could be hitting timeouts or running out of memory when exporting large amounts of data.
I suggest exporting one table at a time with phpMyAdmin (in SQL format; avoid XLS), but if it still fails, you may consider using mysqldump.
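Running mysqldump from the command line side-steps the web server's timeout and memory limits entirely. A typical per-table invocation (host, user, database, and table names here are placeholders):

```shell
# Dump one table at a time; each file can be restored independently
mysqldump -h localhost -u myuser -p mydatabase mytable > mytable.sql
```

The resulting file can later be restored with mysql mydatabase < mytable.sql.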
What you do is: delete phpMyAdmin from your system, write to the developers, and tell them to immediately discontinue development and destroy all copies of the source.
You get everyone who has ever installed phpMyAdmin to delete their copies too, and then - bing! - the world will be a better place...
It is, alas, but a dream.
PHPMyAdmin is a wart on the arse of the universe and should be eliminated; it is a kind of fungus which poisons any data it touches with a painful, lingering death.
Moreover, the developers appear keen to insist that it is actually useful; it has an interface which makes things which fail appear to work, thus fooling the naive user into believing that it has actually DONE what it was asked to do.
Its backups give an overwhelmingly false sense of security; they cannot be considered to be "backups" insofar as one might hope to restore them.
As the title of the question suggests, my question is simple: which one is better in terms of performance, given that I'm on Linux shared hosting (SiteGround)? I'm capable of coding both - I actually coded one that updates the DB - but from reading around, some people suggested inserting rather than updating. Any feedback is much appreciated.
Thank you.
Use a database! Since you will have multiple people accessing your site, writing to one file will mean either blocking or having the count overwritten.
By using a database and inserting one row per visit, you don't have to wait for other clients, and you safely allow concurrent access. You then get the count by doing a SELECT COUNT(*) FROM countTbl.
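A minimal sketch of that insert-then-count pattern, using Python's sqlite3 for illustration (countTbl comes from this answer; the ts column is an assumption):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE countTbl (id INTEGER PRIMARY KEY, ts INTEGER)")

def record_hit(conn):
    # One INSERT per page view; no read-modify-write race as with
    # UPDATE counter SET n = n + 1 done from application code
    conn.execute("INSERT INTO countTbl (ts) VALUES (strftime('%s', 'now'))")
    conn.commit()

for _ in range(3):
    record_hit(conn)

count = conn.execute("SELECT COUNT(*) FROM countTbl").fetchone()[0]
```

As a bonus, the timestamp column gives you hits-per-day for free, which a single running total cannot.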
What are you storing in the database? If it's just that one number (the page counter), I would not use a database, but if you are storing data for each visitor, a database is the way to go.