I'm using the jQuery .ajax function to load a PHP page containing a MySQL query that selects data from the database. My question is: does refreshing a MySQL query via jQuery AJAX like this crash or tire out the database?
Info: I'm refreshing every 1 second using setInterval().
Edit: These are the queries that I refresh.
SELECT * FROM table1 ORDER BY id DESC
SELECT * FROM table1 ORDER BY id ASC LIMIT 10
DELETE FROM table1 WHERE id = 'something'
I would appreciate an answer.
Thanks in advance.
MySQL is capable of serving hundreds of queries per second, even when running on low-end hardware.
However, you could still be "tiring" the database server, especially if you're running a very complex query, or if you do not have the necessary indexes on your tables. You may want to use the EXPLAIN syntax to see how MySQL is executing your query.
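For example, a quick check using the first query from the question (the EXPLAIN output columns vary by MySQL version):
EXPLAIN SELECT * FROM table1 ORDER BY id DESC;
If the Extra column shows "Using filesort", an index on id would help; if id is the primary key, the ORDER BY is already served by the index.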
No, it does not crash the database (how could it?). Of course, if the script running the mysql_query is requested a lot, it could amount to a DoS (but it normally takes a very high number of requests to bring down a server).
EDIT: Your update states you're using setInterval() with 1 second. In this case it depends on how many users have the page open at the same time. Say 1,000 users are on the site: that results in 60,000 requests and queries being fired every minute. If your query is only a simple SELECT, it might not be a problem. If you're running a slow query, check the slow query log to improve it, or alternatively change the behaviour of your script.
What is the mysql query? Is it user specific? If not, it would be much better to cache the value server side once per second and use ajax to fetch this cached file rather than have all of your users simultaneously requesting the same data.
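A minimal sketch of that caching idea, assuming a cache.json file and a get_rows() helper that actually runs the SELECT (both names are hypothetical):
<?php
// cache.php - serves the query result, re-running it at most once per second
$cacheFile = __DIR__ . '/cache.json'; // hypothetical cache location
$maxAge    = 1;                       // seconds

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) >= $maxAge) {
    $rows = get_rows(); // hypothetical: runs the SELECT and returns the rows
    file_put_contents($cacheFile, json_encode($rows), LOCK_EX);
}

header('Content-Type: application/json');
readfile($cacheFile);
With this, 1,000 polling clients trigger at most one query per second instead of 1,000.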
Related
My website's home page takes a long time to load. When I open the site it appears malformed: images are scattered here and there for a while, and only after loading finishes does the page look right.
There are three tables populated from the database (MySQL, PHP), so I have three SELECT queries combined with UNION.
Should I keep the queries on the home page, or should I use jQuery AJAX to populate it?
How can I improve the speed of the app?
I don't need to run the query more than once an hour, so what can I do?
One query is here:
SELECT * FROM ABC
UNION
SELECT * FROM XYZ
Measure which part of the code executes for the longest period of time. You can do this by printing HTML comments with timestamps, then going through the page source. After that you will know where the bottleneck is and which part you need to optimize.
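A minimal sketch of that technique (the checkpoint labels are illustrative):
<?php
$start = microtime(true);

// Print an HTML comment with the elapsed time at each checkpoint.
function checkpoint($label) {
    global $start;
    printf("<!-- %s: %.4f s -->\n", $label, microtime(true) - $start);
}

checkpoint('before queries');
// ... run the queries / render a section ...
checkpoint('after queries');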
A few general tips:
Measure the time of your SQL queries. If a query is slow, make sure you have proper conditions and indexes.
Use server-side paging for grids with many records.
You can use AJAX to load a few parts of the page in parallel.
Caching the results of long-running queries is a good approach.
Caching the preprocessed result of a query is even better (see the sketch after this list).
In some cases the best way is to write something from scratch.
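Since you only need fresh data once an hour, here is a hedged sketch of caching the preprocessed (already rendered) result, assuming a tables.html cache file and a render_tables() helper that runs the UNION query and builds the HTML (both names are hypothetical):
<?php
$cacheFile = __DIR__ . '/tables.html'; // hypothetical cache of the rendered HTML
$maxAge    = 3600;                     // one hour, per the question

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) >= $maxAge) {
    $html = render_tables(); // hypothetical: runs the UNION query, returns HTML
    file_put_contents($cacheFile, $html, LOCK_EX);
}

readfile($cacheFile);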
I have a big customer table, t_customer, with 10,000,000 records.
I run a PHP script that selects data from this table, and I need to execute an action for each customer.
But as I progress through the data, the SQL query runs more and more slowly, and now terminates with "Query execution was interrupted".
My query is:
SELECT id, login FROM t_customer WHERE regdate<1370955715 LIMIT 2600000, 100000;
So the LIMIT no longer seems to have any effect, and I don't know what to do about this.
P.S.
SELECT id, login FROM t_customer WHERE regdate<1370955715 LIMIT 2600000, 10;
the above query takes 30 seconds to execute.
P.P.S.
I get the same result even without the WHERE clause.
So you are selecting 100K records in PHP? That is a bad idea.
Lower your batch size to 1K, paginate through your target set, and then see how it goes (see the sketch below). Make sure you have an index on regdate too. 100K-element arrays in PHP are... complicated.
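One common way to paginate without the ever-growing OFFSET cost is keyset pagination: seek past the last processed id instead of skipping rows. A minimal sketch, assuming a mysqli connection in $db and that id is the indexed primary key (this is a general technique, not your original code):
<?php
// Prepared once; bind_param() binds by reference, so updating $lastId
// before each execute() re-seeks from the last processed row.
$stmt = $db->prepare(
    'SELECT id, login FROM t_customer
     WHERE regdate < 1370955715 AND id > ?
     ORDER BY id
     LIMIT 1000'
);
$lastId = 0;
$stmt->bind_param('i', $lastId);

do {
    $stmt->execute();
    $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC); // get_result() needs mysqlnd

    foreach ($rows as $row) {
        // ... run the per-customer action here ...
        $lastId = $row['id'];
    }
} while (count($rows) === 1000);
Unlike LIMIT 2600000, 100000, which forces MySQL to read and discard 2.6M rows, each batch here starts directly at the index entry for $lastId.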
PHP is a scripting language, it's not really C++ :) That's why I write background heavy-lifting workers in C++.
MySQL offers a really clever feature called partitions. http://dev.mysql.com/doc/refman/5.1/en/partitioning.html
It allows you to automatically split huge data sets into smaller files, giving you a huge improvement when doing operations on your data. Just like RAID, but for SQL :P There was an excellent post on SO about the best configuration for partitions, but I can't find it at the moment.
If you improve the performance of your query, PHP should instantly have more time to smash through your loops and arrays.
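For illustration, a hedged sketch of range-partitioning the table by regdate (the boundary values are made up; note that MySQL requires the partitioning column to be part of every unique key, so the primary key would have to include regdate):
-- Illustrative boundaries: Unix timestamps for 2013-01-01 and 2014-01-01.
ALTER TABLE t_customer
PARTITION BY RANGE (regdate) (
    PARTITION p2012 VALUES LESS THAN (1356998400),
    PARTITION p2013 VALUES LESS THAN (1388534400),
    PARTITION pmax  VALUES LESS THAN MAXVALUE
);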
I'm using a table for login; the table has 2,500 records. When I try to log in, it takes nearly 30-40 seconds to load. I'm using PHP and MySQL. I need to make the SQL query faster at checking the data. Solutions are welcome, thanks in advance, please help me out.
When locating the causes of performance issues, there are many things to consider in your application stack:
Database: Indexes, Joins, Query Formation
Network in between: Routing issues, bandwidth, connection speed
Code: Check that your code structure is not creating unnecessary delays. For example, some people have validations that span both client and server within a single method, which extends the method's lifetime. Try to put the validation's core logic on the database side, for example in stored procedures. Try to use methods with less overhead.
You should have included your query so we could examine it.
Don't pull all your records (e.g. don't use SELECT * FROM users) and then loop through them to find a match.
Instead, use a WHERE clause to limit the records to what you need (i.e. one row). That is:
SELECT col1,col2,.. FROM Users
WHERE username='username' AND password='password'
You can try that and see the performance...
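A minimal sketch of that lookup with mysqli prepared statements, assuming a mysqli connection in $db and a hypothetical password_hash column (in practice you should store a hash made with password_hash() and check it with password_verify(), not compare plain passwords in SQL):
<?php
$stmt = $db->prepare(
    'SELECT id, password_hash FROM Users WHERE username = ? LIMIT 1'
);
$stmt->bind_param('s', $username);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc(); // null if no such user

// Compare the submitted password against the stored hash.
$ok = $row !== null && password_verify($password, $row['password_hash']);
An index on username (ALTER TABLE Users ADD INDEX (username)) keeps this lookup fast as the table grows; with only 2,500 rows the query itself should take milliseconds.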
I just finished programming a platform in PHP that uses MySQL to save the results of a poll. I will be running the poll with a group of 24 users who will submit the form at more or less the same time, so the mysqli_query() function will be executed around 24 times at once.
Is this a problem?
Should I worry about the function blocking the access to the db, or is the mysqli_query() safe to use without having to worry about blocking?
In general, no, unless you are doing something very interesting in the application logic that requires data from another row. For example, if you are using an ID, it should be of the auto-incrementing type; pretty standard stuff. You know you are OK if you are only doing an INSERT with no SELECT beforehand and don't have a crazy stored procedure in the database, which you likely do not.
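For illustration, a hedged sketch of such a table (the name and columns are made up): with an AUTO_INCREMENT primary key, MySQL hands out distinct ids itself, so 24 simultaneous inserts cannot collide.
CREATE TABLE poll_results (
    id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    user_id    INT UNSIGNED NOT NULL,
    answer     VARCHAR(255) NOT NULL,
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);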
MySQL queries are executed in the order in which they arrive at the MySQL server.
You don't need to worry about it.
I am writing a PHP/MySQL application (using CodeIgniter) that uses some jQuery functionality for dragging table rows. I have a table in which the user can drag rows to the desired order (kind of a queue for which I need to preserve the rank of each row). I've been trying to figure out how to (and whether I should) update the database each time the user drops a row, in order to simplify the UI and avoid a "Save" button.
I have the jQuery working and can send a serialized list back to the server onDrop, but is it good design practice to run an update query this often? The table will usually have 30-40 rows max, but if the user drags row 1 far down the list, then potentially all the rows would need to be updated to update the rank field.
I've been wondering whether to send a giant query to the server, to loop through the rows in PHP and update each row with its own Update query, to send a small serialized list to a stored procedure to let the server do all the work, or perhaps a better method I haven't considered. I've read that stored procedures in MySQL are not very efficient and use a separate process for each call. Any advice as to the right solution here? Thanks very much for your help!
Any question that includes "The table will usually have 30-40 rows max" ends with "Do whatever you want to it." I can't imagine an operation, however frequently it's performed, that would have any appreciable performance impact on a table that tiny.
The only real question is what the visitor will be doing while your request is going to and returning from the server. Will they be locked out of making other changes? If not, make sure you have a mechanism to ensure that the most recent change is the one that's really taken effect. (It's possible for requests to reach the server out of order, and you wouldn't want an outdated request to get saved as the final state.)
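A hedged sketch of such a save endpoint, assuming a PDO connection in $pdo, a queue table with id and rank columns, a one-row queue_meta table holding last_seq, and a seq counter the client increments on every drop (all of these names are illustrative, not your actual schema):
<?php
// $order is the serialized list from jQuery, e.g. [7, 3, 9, ...] (ids in new order).
$order = json_decode($_POST['order'], true);
$seq   = (int) $_POST['seq'];

$pdo->beginTransaction();

// Discard requests that arrive after a newer one has already been saved.
$row = $pdo->query('SELECT last_seq FROM queue_meta FOR UPDATE')->fetch();
if ($seq > (int) $row['last_seq']) {
    $stmt = $pdo->prepare('UPDATE queue SET `rank` = ? WHERE id = ?');
    foreach ($order as $rank => $id) {
        $stmt->execute([$rank, $id]);
    }
    $pdo->prepare('UPDATE queue_meta SET last_seq = ?')->execute([$seq]);
}

$pdo->commit();
With 30-40 rows the per-row UPDATE loop is perfectly fine; the FOR UPDATE lock plus the seq check is what guarantees an out-of-order request never overwrites a newer state.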