Hey everyone, this issue is giving me gray hair. Last week this was all working fine; on Monday, when I got into work and tried running another test, it started failing.
Background:
I am building a utility for work that will allow the video guys to either upload a file from their system or specify the URL of a video, whether it is in Amazon S3 or wherever. The utility then either uploads the video, stores it, and puts an entry in the DB to keep track of it, or moves it from the provided URL into the directory the videos are stored in and records it in MySQL the same way.
The Issue:
It has recently become an issue where, when moving from a URL, I get an error stating that the "MySQL server has gone away". Lots of searching has turned up lots of issues about timeouts, packet size, etc. But the MySQL connection isn't even opened until after the file has been moved into its directory, and when moving a file from a URL I get the error in under 10 seconds. The only tests I can get to go through are ones where I specify the URL of a really small file, like 5 MB.
Last week I was able to run successful tests on files over 500 MB (pulling from a URL). Now the files still move over, but I get the "server has gone away" error.
Here is the code I am using:

file_put_contents('vid_bin/'.$fName, fopen($url, 'r')); // stream the remote file into vid_bin/
$qry = "INSERT INTO videos (".$fields.") VALUES (".$vals.")";
$db = new mydb;
$db->mydb;                   //connects
$db->select_db("thedbname"); //selects db
$db->query($qry);            //runs query
if ($db->error) {
    die($db->error . "\n" . $qry); //oh noes
} else {
    ...
}
Our server guy hasn't been available, so I am left wondering if it's my code or something that changed on the server. I am pretty sure it's not the code, but I wanted outside opinions on what could be causing MySQL to freak out despite only having strings, dates, and ints stored in it.
UPDATE:
If I process the MySQL bits before messing with the file, all works well. The problem is that I have the file-handling code in a try/catch because, if someone provides an invalid URL or something goes haywire, I don't want to have to go back and remove the MySQL record.
Any idea why MySQL would care whether the file is handled before it? I was thinking it might have something to do with packet size in the MySQL config, but I am not storing the file in the DB. It looks like it is going to be an issue for the server admin to handle, unless someone has insight into why this would be happening from the code end of things.
It turns out nothing was changed on the server, so I'm not sure why the code stopped working as intended. I just handle the MySQL stuff first, then the file. A cron job then does some work on the files based on the SQL results, so I added a check that the file exists and, if it doesn't, I remove the record. Not exactly how I wanted it to operate, but it gets the job done.
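In case it helps anyone later, here is a stripped-down sketch of that order of operations (mydb is my own wrapper, and the cron check at the end is paraphrased; the column name there is illustrative):

// Sketch only: $fields, $vals, $fName and $url are built earlier, exactly as before.
$db = new mydb;
$db->mydb;                   // connects
$db->select_db("thedbname");
$db->query("INSERT INTO videos (".$fields.") VALUES (".$vals.")");
if ($db->error) {
    die($db->error);         // nothing to clean up yet, the file hasn't been touched
}

// Record is in; now pull the file over (my real code wraps this in try/catch).
$fh = fopen($url, 'r');
if ($fh !== false) {
    file_put_contents('vid_bin/'.$fName, $fh);
}

// The cron that processes videos later does roughly:
//   if (!file_exists('vid_bin/'.$row['file_name'])) { delete that record }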
Increase max_allowed_packet in MySQL to 128 MB.
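If you want to confirm the current value from PHP before chasing the server admin, here is a quick check (this assumes a mysqli connection in $db; 128 MB is just a suggested ceiling):

// Check the current packet limit; the value is reported in bytes.
$res = $db->query("SHOW VARIABLES LIKE 'max_allowed_packet'");
print_r($res->fetch_assoc());

// Raising it permanently is done in my.cnf under the [mysqld] section:
//   max_allowed_packet = 128M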
Related
I am currently doing a project on website programming with PHP and MySQL. In general, the project allows users to upload their files to the server and later perform searches according to the "file creation date". I use quotation marks here because I mean the date the file was originally created from nothing, not the file-modified date.
I have been looking at a lot of programming forums and reference websites, but most of them just introduce filemtime() and filectime(). I have tried both, and they only return the "file modified time" (i.e. the time when I uploaded the files to the server at this development stage).
Knowing the actual file creation time is very important to me because I will use it to perform a timeline search later, as a requirement of the project. I have gotten stuck here. I really appreciate any help and suggestions.
I am using "xampp" for the web and database servers (the app has the configuration by default) and HTTP+PHP for the front end.
When a user uploads a file in a web browser, what actually happens is that three pieces of information are added to the form submission data:
A filename (which a browser will base on the name it had on the user's system)
A file type (which will be the browser's best guess, usually just based on the filename)
The binary contents of the file
Even if both the user's computer and your server have somewhere to record the creation date of the file, this won't be transmitted with the upload.
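For illustration, this is roughly everything the server-side script ever sees for an upload (the form field name "userfile" is just an example):

// Assumes a form field named "userfile"; the name is illustrative.
if (isset($_FILES['userfile'])) {
    $f = $_FILES['userfile'];
    echo $f['name'];     // filename as reported by the browser
    echo $f['type'];     // MIME type guessed by the browser
    echo $f['size'];     // size in bytes of the uploaded contents
    echo $f['tmp_name']; // where PHP temporarily stored the bytes
    // Notice there is no creation-date entry here; it simply isn't sent.
}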
There's also something very important to bear in mind when building any web-based system: you only have the information the user gives you, and that information could be deliberately or accidentally wrong. If there were a timestamp, it might be wrong because the user's clock is wrong, or it might have been sent by a piece of software that let the user set it manually.
I have a report-generation PHP program that used to work fine. I use two third-party libraries in the program: the Google image chart library (which returns an image if I supply values in the URL) and TCPDF (for PDF generation). I am using the old mysql extension, not mysqli, for queries. There are lots of queries and loops in the page.
It used to take less than 3 minutes to generate the report. I am using an AJAX call to generate the report, which gives a completed message once the file generation is done. The program saves the PDF file in a folder, and I have a link with the same name to download the file.
Recently, when I checked, it was not generating properly.
The error was TCPDF being unable to get the image. This was because the Google chart library was not returning the image properly. When I access the chart URL in the browser it gives me the image without any issue, but if I put it in an image src inside a PHP file, it does not show. So I decided to save the file in a folder using file_get_contents/file_put_contents and link it in the image src. This part is now working correctly and I can see the image.
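Roughly what that caching step looks like now (the chart URL and target path here are placeholders, not my real values):

// Placeholders: swap in the real chart parameters and output folder.
$chartUrl  = 'https://chart.googleapis.com/chart?cht=p&chd=t:60,40&chs=400x200';
$localPath = 'charts/report_chart.png';

$png = file_get_contents($chartUrl);
if ($png !== false) {
    file_put_contents($localPath, $png);
}
// TCPDF then gets $localPath in the image src instead of the remote URL.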
But now the problem is that it takes a lot of time to generate the report, even in a local environment. I tried to generate the report without the chart printing, but even then it takes time. At one point it was around 25 minutes, and now it's close to 10 minutes to generate a 40-page PDF file.
I really don't know why it's taking so much time. All of this was working fine before, and now it isn't. The only thing that changed was the Google image chart library, but even without it (I commented that part out and checked) it's still slow.
How do I speed this up? Is there any way to check which part of the program is slow?
I tried Xdebug, but its output file is more than 400 MB and Webgrind is not able to process it.
Please help.
Your next step is to troubleshoot performance.
Is TCPDF doing a lot of work you don't need done? Presumably you've seen the tips from TCPDF's author on increasing performance, and put them into practice. http://www.tcpdf.org/performances.php
Are some of your MySQL queries inefficient? Obtain an interactive connection to your MySQL server using phpMyAdmin or a command-line client. While your PDF-creation process is running, repeatedly issue this command:
SHOW FULL PROCESSLIST
It presents an Info column showing the active query for each connection, along with each query's elapsed time in the Time column (in seconds). If you have queries that run for multiple seconds, consider using MySQL's EXPLAIN command to analyze them. Often adding an appropriate index to a MySQL table can dramatically speed things up (there is a short sketch of this at the end of this answer).
Is the machine running your PDF program short on RAM? Use a performance monitor like *nix top or Windows perfmon to take a look.
Is your 40-page report, simply put, a huge job to create? If so, you might consider switching to a faster report-generation program than PHP + TCPDF.
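To make the EXPLAIN step concrete, here is roughly what it looks like from PHP. The query, table, and index names are made up, and it assumes a mysqli connection in $db (adapt it if you stay on the old mysql extension):

// Hypothetical slow query lifted from SHOW FULL PROCESSLIST.
$slow = "SELECT title, views FROM report_rows WHERE report_id = 42 ORDER BY views DESC";

foreach ($db->query("EXPLAIN " . $slow) as $row) {
    print_r($row); // key = NULL plus a large "rows" estimate usually means a missing index
}

// If no index is being used, something along these lines often helps:
//   ALTER TABLE report_rows ADD INDEX idx_report_views (report_id, views);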
Sorted out.
The issue is with the database: one of the tables has more than 120,000 records in it. I deleted irrelevant records; it's not a permanent solution, but now it generates the same thing in 2.1 minutes.
I can't do the same thing on my production server, though. I would love to get your input on how to optimize the database.
Thank You
So I'm trying to see if something like this is possible WITHOUT using a database.
A file is uploaded to the server /files/file1.html
PHP tracks the upload time by checking the last update time in the database
If the file (file1.html) has been updated since that last DB time, PHP makes changes; otherwise, no changes are made
Basically, a text simulation game (basketball) outputs HTML files for rosters/stats/standings/etc., and I'd like to be able to insert each team's logo at the top (which the outputted files don't include). Obviously, this would need to be done often, as the outputted files are uploaded to the server daily, and I don't want to have to go through each team's roster manually inserting images at the top.
Don't have an example as the league hasn't started.
I've been thinking of just creating a button on the league's website (not created yet) that when pushed would update the pages, but I'm hoping to have PHP do it by itself.
Yes, you could simply let PHP check the file's modification date (the point in time the file was last written on the server, not when the picture itself was made). Check http://php.net/manual/en/function.filemtime.php and you should be done within 30 minutes ;)
sexy quick & dirty unproven code:
$filename     = 'somefile.txt';
$last_checked = $last_db_time; // timestamp of the last run, e.g. the "last update time" you keep in the DB

if (filemtime($filename) > $last_checked) {
    // the file was modified since the last run: make your changes here
    // (maybe check for an existing file etc. first)
}
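And if you want to avoid the database completely, you could keep the last-run timestamp in a small flat file instead; a rough sketch (the file names are placeholders):

// File names are placeholders; point them at the league's real output files.
$page      = 'files/file1.html';
$stateFile = 'files/.last_processed';

$lastRun = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

if (filemtime($page) > $lastRun) {
    // file1.html changed since the last run: insert the team logo here
    file_put_contents($stateFile, time()); // remember when we processed it
}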
Let's say I'm using SimpleXML to parse weather data from a remote server, and then the remote server crashes so I can no longer retrieve its live feeds, but I don't want my users to get an error message either. How would I go about caching, and continuing to display, the last piece of data I got from the server before it crashed?
Let's say my XML looks like this:
<weather>
    <temperature c="25" f="77" />
</weather>
How would I go about displaying the values "25" and "77" until I'm able to reestablish a connection with the remote server?
Apologies if my question isn't entirely clear... my knowledge of server-side technologies is very limited.
First: You do not want to fetch the remote data live when the user requests your site. That works for small sites with few visitors when no problems occur, but as soon as the remote server hangs, your site will also hang until the connection timeout occurs.
What we mostly do is the following:
Create a script that fetches the remote file and stores it locally in some temporary folder. If the remote file cannot be fetched, do not overwrite the old one. This is very important, and @Drazisil's code does exactly this; a short sketch follows at the end of this answer.
Call that script with a cron job, or at the end of your normal script every x minutes
Use the local file when creating your normal HTML output instead of the remote one.
In the end, your pages will always be delivered fast and will not crash when the remote server is down.
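A minimal sketch of such a fetch script, assuming file_get_contents is allowed to open URLs on your host (the feed URL and cache path are placeholders):

// Run this from cron every few minutes; URL and cache path are placeholders.
$remote = 'http://example.com/weather.xml';
$cache  = '/tmp/weather_cache.xml';

$xml = @file_get_contents($remote);              // false if the remote server is down
if ($xml !== false && @simplexml_load_string($xml) !== false) {
    file_put_contents($cache, $xml);             // only overwrite the cache on a good, well-formed fetch
}

// In the page itself, always read the local copy:
$weather = simplexml_load_file($cache);
echo (string) $weather->temperature['c'];        // "25"
echo (string) $weather->temperature['f'];        // "77"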
This isn't the best way, but here is one way you could do it:
To save the information:
$file = 'temp_cache.php';
// Build a small PHP snippet holding the cached values
$content = '<?php $c="25"; $f="77"; ?>';
// Write the contents to the file
file_put_contents($file, $content);
To load it:
include_once('temp_cache.php');
By including the file, your $c and $f variables will be set unless you overwrite them.
Store the values locally and display that information to your users. Update whenever you want, in such a way that the local copy is only overwritten when the update succeeds; if it fails, you will still have your local copy.
I happen to have a database with pictures stored as blob fields. Can't help it, it was the previous developer's choice.
Now I need that data in a new site, and the provider won't let me copy the data the easy way (the dump file has grown to 11 MB; that won't upload and I don't have shell access).
So I thought I'd write a script that opens a connection in db1, selects all the records, and then copies each into a table in the new db2.
It all works fine if I exclude the blobs. If I try to copy them too, the insert fails.
Anyone had something similar before?
Should I treat the blobs differently when it comes to inserting?
Thanks for any ideas or help.
11 MB isn't a huge file; I'm surprised your host has such a low maximum upload size.
Have you thought about exporting as SQL, splitting the file in two (in Notepad++ or something) then uploading it in smaller sections? Wouldn't take long.
Perhaps check whether you can increase the max_allowed_packet setting on your MySQL database. I'm not sure if it affects inserts, but I remember having to adjust this setting when I worked on a web app that allowed users to download 3-5 MB binaries from blob fields in the DB.
This link may be helpful, from a quick google search: http://www.astahost.com/info.php/max_allowed_packet-mysql_t2725.html
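Beyond the packet size, it's also worth checking how the INSERT is built on the PHP side: raw blob bytes concatenated straight into a query string will usually break it. A rough sketch using prepared statements (hosts, credentials, and table/column names are all made up):

// All connection details and names below are illustrative.
$src = new mysqli('host1', 'user', 'pass', 'db1');
$dst = new mysqli('host2', 'user', 'pass', 'db2');

$stmt = $dst->prepare("INSERT INTO pictures (id, name, image) VALUES (?, ?, ?)");
$rows = $src->query("SELECT id, name, image FROM pictures");

while ($row = $rows->fetch_assoc()) {
    $null = null;                                // 'b' params are streamed, not bound directly
    $stmt->bind_param('isb', $row['id'], $row['name'], $null);
    $stmt->send_long_data(2, $row['image']);     // send the blob so it is never string-escaped
    $stmt->execute();
}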