mysql query timeout - "Default" outputted to browser - php

I have an application written in PHP which contains a function to perform a complex MySQL query to gather statistics and export it as CSV. Usually the process takes a good 20-30 seconds to complete due to the complexity of the query but I can live with this as it's just one query once a week.
The issue I have is that now and again the server just appears to time out, with the word 'Default' output to the browser and nothing else.
I'm sure this isn't being set/printed in the application logic, because I wrote it myself; after looking at the database class I searched every single file in the application for the word 'Default' with no results.
I'm also pretty sure it can't be coming from the MySQL server, because it can't directly print output without going through PHP, can it?
What could be causing this? I'm thinking the only function that could be printing it is my mysql_query() function. Obviously my aim is to optimize the query to stop the timeout, but I'd also like to find out what is outputting the text, as I don't like errors/messages like that being displayed to our users.
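If the string really is passing through PHP, one way to track it down rather than guess is to install an output-buffer callback at the very top of the export script and log every chunk sent to the browser together with where it was emitted. This is only a minimal debugging sketch, not part of the original code; the handler and log destination are assumptions:

// Assumed debugging hook: run before anything else in the export script.
// Every chunk of output is logged with the file/line that emitted it.
ob_start(function ($chunk) {
    if ($chunk !== '') {
        $trace  = debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS);
        $origin = isset($trace[0]['file'])
            ? $trace[0]['file'] . ':' . $trace[0]['line']
            : 'unknown';
        error_log('Output chunk ' . json_encode($chunk) . ' emitted near ' . $origin);
    }
    return $chunk; // pass the output through unchanged
}, 1); // chunk size 1: the handler fires as soon as anything is printed

If 'Default' shows up in the log with a file and line, that is the culprit; if it never appears there, it is being produced outside PHP (for example by the web server or a gateway timeout page).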

Related

Display php browser content successively

I'm currently migrating a web page to a new server, from PHP 5.0.x to PHP 5.6.
I have one page that loops a database query 7 times and loads the results into a table. I know this is not the best way of doing things, but it works quite well and I currently don't have other options, because the results come from different databases. Caching is not an option because I need the content to always be up to date.
From requesting the page until it is displayed takes around three to four seconds. The queries were optimized recently but still take a long time to execute.
To my problem: the old web server shows the results table by table, so the user sees that the page is already working on their request. When the content of Table1 is loaded it gets displayed, and the server moves on to the second result set.
The new web server generates the page all at once, and the content is not shown until all 7 result sets are loaded and the data is mapped to the grids.
Is there an option, maybe in php.ini, with which I can get the same behaviour as on the old web server? I really don't know how to google this, so I'm asking you.
On the other hand, maybe there's a way to run all the queries at once (multithreading?) rather than in sequence?
Sorry for my "improvable" English.
It's probably something related to output buffering. You can force the buffer contents to be written out to the client with the ob_flush function; just add a call to ob_flush after each query's results are printed.
The relevant php.ini setting is output_buffering. Also check the zlib.output_compression setting, as compression can cause problems with flushing the buffer.
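For illustration, a minimal sketch of that suggestion, assuming $queries holds the seven SQL strings, $mysqli is an open connection, and renderTable() is a hypothetical helper that turns one result set into an HTML table:

foreach ($queries as $sql) {
    $result = $mysqli->query($sql);
    echo renderTable($result);   // hypothetical helper that builds one HTML table
    if (ob_get_level() > 0) {
        ob_flush();              // push PHP's own output buffer, if one is active
    }
    flush();                     // ask the SAPI / web server to send it to the client
}

With output_buffering and zlib.output_compression off (or flushed like this), the browser should receive each table as soon as its query finishes, as on the old server.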

Logging mysql results with timings

I'm using MySQL 5.0.37 on my server, and I'm able to get profiling data when I use SET profiling=1 in a mysql session, but that only works for queries executed in that session.
I'm able to log queries without the timings by adding a line similar to "log=/path/to/log" to my my.cnf file.
What I want instead is for MySQL to produce a log file (as queries are executed) that shows each query and the amount of time spent on it, similar to what is displayed in the Duration column when I execute SHOW PROFILES.
The queries are executed via mysqli calls in a PHP program that makes up the back end of a website, which is why I need the timings in a log file.
Does anyone know how I can make mysql produce such a log?
Look into:
The general log -- be cautious; it fills up disk fast. (Or it can go to a table.)
The slow log, but with long_query_time=0 in order to catch everything. Again, it can be a disk hog.
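As a rough sketch of the slow-log route, assuming a connection with the SUPER privilege and placeholder credentials and paths. On MySQL 5.1+ this can be switched on at runtime; on 5.0.x the equivalent settings have to go in my.cnf and long_query_time may not accept values below 1 second:

$mysqli = new mysqli('localhost', 'user', 'pass');                        // placeholder credentials
$mysqli->query("SET GLOBAL slow_query_log = 'ON'");
$mysqli->query("SET GLOBAL slow_query_log_file = '/path/to/mysql-slow.log'");
$mysqli->query("SET GLOBAL long_query_time = 0");                         // log every query along with its Query_time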

MySQL Query Not Inserting All Records (using PHP)

I have a fairly large amount of data that I'm trying to insert into MySQL. It's a data dump from a provider that is about 47,500 records. Right now I'm simply testing the insert method through a PHP script just to get things dialed in.
What I'm seeing is that, first, the inserts continue to happen long after the PHP script "finishes". By the time the browser no longer shows an "X" to cancel the request and shows a "reload" instead (indicating the script is done from the browser's perspective), I can see inserts still occurring for a good 10+ minutes. I assume this is MySQL caching the queries. Is there any way to keep the script "alive" until all queries have completed? I put a 15-minute timeout on my script.
Second, and more disturbing, is that I don't get every insert. Of the 47,500 records I'll get anywhere between 28,000 and 38,000, but never more, and that number is different each time I run the script. Is there anything I can do about that?
Lastly, I have a couple of simple echo statements at the end of my script for debugging; these never fire, leading me to believe a timeout might be happening (although I don't get any errors about timeouts or memory limits). I'm thinking this has something to do with the problem but am not sure.
I tried changing my table to an ARCHIVE table, but not only did that not help, it also means I lose the ability to update the records in the table when I want to, so I did it only as a test.
Right now the insert is in a simple loop: it loops over each record in the JSON data I get from the source, runs an INSERT statement, then moves on to the next iteration. Should I instead be using the loop to build one massive INSERT and run a single statement at the end? My concern is that I would exceed the max_allowed_packet limit that is hard-coded by my hosting provider.
So I guess the real question is: what is the best method to insert nearly 50,000 records into MySQL using PHP, based on what I've explained here?
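Not an authoritative answer, but a sketch of the chunked multi-row INSERT idea from the question, assuming $records is the decoded JSON array and $mysqli is an open connection; the table and column names are placeholders. Chunking keeps each statement well under max_allowed_packet, and checking the return value surfaces rows that would otherwise be lost silently:

$mysqli->autocommit(false);                      // one transaction instead of 47,500

foreach (array_chunk($records, 500) as $chunk) {
    $rows = array();
    foreach ($chunk as $r) {
        $rows[] = sprintf("('%s', '%s', %d)",
            $mysqli->real_escape_string($r['name']),
            $mysqli->real_escape_string($r['code']),
            (int) $r['quantity']);
    }
    $sql = 'INSERT INTO provider_data (name, code, quantity) VALUES ' . implode(',', $rows);
    if (!$mysqli->query($sql)) {
        error_log('Chunk failed: ' . $mysqli->error);   // find out why records go missing
    }
}

$mysqli->commit();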

Best way to deliver real time information?

I currently have three screens showing a web page with information taken from a database. This information gets updated every 10 seconds. The three screens make AJAX calls to a PHP script that runs a MySQL query and prints JSON-encoded results, which is what they use to render the data. The query is a pretty resource-intensive one.
My problem has to do with scalability: if ten people were to open that page, they would each cause the PHP script to run at the same time, every 10 seconds, overloading the server.
I'm considering running this query in the background, writing the results to a text file, and then retrieving that text file via AJAX, but I feel like there must be a better way to do this.
So the question is: how do I deal with repetitive and very slow SQL queries while allowing access for multiple users?
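One possible shape for the cache-file approach described above, purely as a sketch: a single worker script runs the heavy query once per interval and writes the JSON to disk, and the endpoint the screens poll only reads that file, so the expensive query runs once regardless of how many viewers there are. The file paths, credentials, and statistics query below are placeholders:

// refresh_stats.php - run once per interval by cron or a looping worker
$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$result = $mysqli->query('SELECT * FROM statistics');       // placeholder for the heavy query
$rows = array();
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}
file_put_contents('/tmp/dashboard.json.tmp', json_encode($rows));
rename('/tmp/dashboard.json.tmp', '/tmp/dashboard.json');   // atomic swap: readers never see a half-written file

// data.php - what the three screens request via AJAX; no query involved
header('Content-Type: application/json');
readfile('/tmp/dashboard.json');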

How to debug AJAX (PHP) code that calls SQL statements?

I'm not sure if this is a duplicate of another question, but I have a small PHP file that runs some SQL INSERT and DELETE statements for an image tagging system. Most of the time both the inserts and the deletes work, but on some occasions the inserts don't.
Is there a way to see why the SQL statements failed to execute, similar to how SQL libraries in Python or Java tell you why a statement failed (for example: duplicate key insertion, unterminated quote, etc.)?
There are two things I can think of off the top of my head, and one thing that I stole from amitchhajer:
pg_last_error will tell you the last error in your session. This is awesome for obvious reasons, and you're going to want to log the error to a text file on disk in case the issue is something like the DB going down. If you try to store the error in the DB, you might have some HILARIOUS* hi-jinks in the process of figuring out why.
Log every query to this text file, even the successful ones (a sketch is below this list). Find out whether the issue affects identical operations (an issue with your DB or connection, again) or certain queries every time (an issue with your app).
If you have access to the guts of your server (or your shared hosting is good), enable and examine the database's query log. This won't help if there's a network issue between the app and the server, though.
But if I had to guess, I would imagine that when the app fails it's getting weird input. Nine times out of ten the input isn't getting escaped properly or - since you're using PHP, which murders variables as a matter of routine during type conversions - it's being set to FALSE or NULL or something and the system is generating a broken query like INSERT INTO wizards (hats, cloaks, spell_count) VALUES ('Wizard Hat', 'Robes', );
*not actually hilarious
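As an illustration of points 1 and 2, a small hypothetical wrapper around the existing query call that logs every statement and any error to a file on disk; the function name, log path, and mysqli usage are assumptions, not part of the original code:

// Assumed helper: log every statement plus its outcome to a file on disk.
function run_query(mysqli $db, $sql) {
    $result = $db->query($sql);
    $line = date('c') . "\t" . $sql . "\t"
          . ($result ? 'OK' : 'ERROR: ' . $db->error) . "\n";
    file_put_contents('/var/log/app-sql.log', $line, FILE_APPEND);
    return $result;
}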
Start monitoring your SQL queries by enabling the query log. There you can see which queries are fired and any errors.
This tutorial on starting the logger will help.
Depending on which API your PHP file uses (let's hope it's PDO ;) you could check for errors in your current transaction with something like:
$naughtyPdoStatement->execute();
if ($naughtyPdoStatement->errorCode() != '00000') {
    DebuggerOfChoice::log(implode(' ', $naughtyPdoStatement->errorInfo()));
}
When using the legacy APIs there are equivalents like mysql_errno, mysql_error, pg_last_error, etc., which let you do the same. DebuggerOfChoice::log, of course, can be whatever log function you'd like to use.
