phpMyAdmin hangs and only imports about 50% of the database? - php

I am using Linux with phpMyAdmin and trying to import a database. However, it hangs about halfway through and just keeps loading indefinitely. I actually have to reset my computer, and when I go check the tables, I get to about the letter "F" in the tables list, each and every time. No errors, just constant loading and an incomplete imported database.
I have already gone into the PHP config files and updated the post sizes, upload sizes, etc., but I am still getting this issue. Any help?

phpMyAdmin is limited by the PHP resource limits: memory, execution timeout, and so on. Depending on your situation, your file upload may take so long that the timeout is reached, or uncompressing the file may exceed the memory limit.
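For reference, these are the php.ini directives usually involved; the values below are only examples and need tuning for your dump size:

; example values only; tune them to your import
upload_max_filesize = 128M
post_max_size = 128M
memory_limit = 256M
max_execution_time = 600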
You can try a few things:
Uncompress the file locally, then upload the .sql file
Use the "UploadDir" feature to place the file in a folder on your web server, which phpMyAdmin can then access locally.
Upload the .sql file and use the command-line client, as suggested by @dognose. I tend to use the 'source' syntax: after connecting with the mysql command-line client, just run source database.sql. Depending on the .sql file, you may first have to create and/or select the database.
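For example, assuming the dump was uploaded as database.sql and the target database is named mydb (both names are placeholders):

$ mysql -u username -p
mysql> CREATE DATABASE mydb;
mysql> USE mydb;
mysql> source database.sql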

Related

Getting images onto server from separate web host

I have GoDaddy shared webspace with FTP access that contains a folder with images. These images change every day.
I'm looking for advice on what I need to do to get these images onto my server in the workplace, maybe every hour or so. The server doesn't have IIS installed; is there any other way to do this?
Am I able to write a PHP script that can put all the images onto the server using the IP or something?
I had the same issue: I had two images on a remote server which I needed to copy to my local server at a predefined time each day, and this is the code I came up with...
try {
    // Suppress the copy() warning; on failure we throw an exception instead
    if (@copy('url/to/source/image.ext', 'local/absolute/path/on/server/' . date("d-m-Y") . ".gif")) {
        // Copy succeeded; nothing more to do
    } else {
        $errors = error_get_last();
        throw new Exception($errors['type'] . " - " . $errors['message']);
    }
} catch (Exception $e) {
    // An error occurred downloading the requested image;
    // place exception handling code here (logging, email, etc.)
}
I then created a cron job which ran this file on a daily basis, but you would be able to run it as frequently as you need, based on how often the source image changes on the source server.
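For reference, a daily crontab entry for this might look like the following; the script path is hypothetical:

# run the image copy script every day at 02:00
0 2 * * * /usr/bin/php /path/to/copy-image.php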
A few points to explain the code...
As the code runs in the background, I use the @ operator to suppress the warning the copy command would otherwise print, since it wouldn't be viewable anyway. Instead, if copy fails it returns false, which triggers throwing a new exception carrying the error type and message. That exception can then be handled in the catch block however you need, such as writing a local log entry or sending an error email.
In the second parameter of the copy command you will see that the path names the file based on the current date. The name can be anything, but it needs to be unique in the destination for this to work. If you plan on overwriting the image each time the copy is done, you can hard-code the same name; but if you want to maintain a history, you will need a file-naming scheme on your local server to ensure the files don't get overwritten each time the cron runs.
The first parameter of the copy statement assumes the source image file is named the same each time. If the name changes, you will need to identify how the naming works and build that name in a variable to use as the source filename.
This code does not alter the format of the source image file, so to avoid corruption and ensure the image still displays after the copy, make sure the source file and the local copy have the same file extension: if the source image is a .gif, the extension in the second copy parameter must also be .gif.
I'll try to answer here instead of continuing the comment-spam ;)
You've got your webspace with FTP access; let's just call it webspace.
Then you've got your server at your workplace; let's just call it workplace.
Finally, you need one server (which can also be the webspace, for example) where you are able to run PHP; let's call it php-server.
step 1
At the workplace, set up an FTP server, for example FileZilla. Configure it so that you can connect to it: create an account and give it access to the folder(s) where you want to save the images. Also make sure it is reachable from outside your workplace (firewall settings etc.).
step 2
If you can run PHP scripts on your webspace, this is the easiest way: you can directly access the image files, establish a connection to your FTP server at the workplace, and upload the files.
If the PHP server is somewhere else, you have to establish a connection from the php-server to your webspace, download the files to the php-server, and then upload them to the FTP server at your workplace.
It will be a bit of work, as you can see, but it should be possible to do.
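A minimal sketch of the direct case (PHP running on the webspace, pushing an image to the workplace FTP server); the host, credentials, and paths are all placeholders:

// connect to the FTP server at the workplace
$conn = ftp_connect('workplace.example.com');
if ($conn === false) {
    die('Could not connect to the workplace FTP server');
}

if (!ftp_login($conn, 'username', 'password')) {
    die('FTP login failed');
}

// passive mode usually plays better with firewalls
ftp_pasv($conn, true);

// upload one local image into the image folder on the workplace server
if (!ftp_put($conn, '/images/today.gif', '/path/on/webspace/today.gif', FTP_BINARY)) {
    // handle the failed upload: log it, retry, send a mail, etc.
}

ftp_close($conn);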
Create a .bat file that uses the ftp command line functions to get the data you want from the server. Save that .bat file somewhere and create a Scheduled Task to run the script every hour.
You can even store the actual sequence of ftp commands in a separate file (e.g. ftpcmd.dat) and call them from the script
ftp -n -s:ftpcmd.dat SERVERNAME.COM
Example ftp command file:
user MyUserName
Password
bin
cd \your\path\images
mget * C:\Temp\
quit

Cannot download using fpassthru

I'm trying to serve a file for download to a user, and I'm having trouble with fpassthru. The function I'm using to download a file is:
http://pastebin.com/eXDpgUqq
Note that the file is successfully created from a blob, and is in fact the file I want the user to download. The script exits successfully, and reports no errors, but the file is not downloaded. I can't for the life of me think what's wrong.
EDIT: I removed the error suppression from fopen(), but it still reports no error. Somehow the data in the output buffer is never being told to be downloaded by the browser.
I tried your code (without the blob part), and it worked fine; I can download a binary file. Based on my experience, here are some things to check:
Has the file been completely saved before you initiate the reading? Check the return value of file_put_contents.
How large is the file? fpassthru reads the whole file into memory. If the file is too large, memory might be insufficient. Please refer to http://board.phpbuilder.com/showthread.php?10330609-RESOLVED-php-driven-file-download-using-fpassthru for more information.
Instead of downloading the file to the local server (reading the whole file into the server's memory and letting the client download it from there), you can create an SAS URL and simply redirect the browser to it; Azure will then handle the download automatically. You may want to refer to http://blogs.msdn.com/b/azureossds/archive/2015/05/12/generating-shared-access-signature-sas-using-php.aspx for a sample.
I was able to download the file by passing a stream obtained with the Azure API directly to fpassthru, without creating a file. Unfortunately, I can't show the code because it belongs to a project I have finished working on, and the code is no longer available to me.
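For reference, a minimal sketch of the usual fpassthru download pattern; the path and filename are hypothetical, and note that nothing (not even whitespace) may be sent to the browser before the headers:

$path = '/tmp/report.pdf'; // hypothetical file to serve

$fp = fopen($path, 'rb');
if ($fp === false) {
    die('Could not open file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($path));

fpassthru($fp); // stream the file to the client
fclose($fp);
exit;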

Get PHP to wait until a file is done transferring before moving it

I have a PHP script that moves files out of a specific folder on the server (an IBM AS/400). The problem I am running into is that sometimes the script runs while a file is still in the process of being moved into the folder.
Poor logic on my part assumed that if a file was "in use", PHP wouldn't attempt to move it, but it does, which results in a corrupted file.
I thought I could do this:
$oldModifyTime = filemtime('thefile.pdf');
sleep(2);
if ($oldModifyTime === filemtime('thefile.pdf')) {
    rename('thefile.pdf', '/folder2/thefile.pdf');
}
But the filemtime functions come up with the same value even while the file is being written. I have also tried fileatime with the same results.
If I do Right Click->Properties in Windows the Modified Date and Access Date are constantly changing as the file is being written.
Any ideas how to determine if a file is finished transferring before doing anything to it?
From the PHP manual entry for filemtime():
Note: The results of this function are cached. See clearstatcache() for more details.
I would also suggest that 2 seconds is a bit short to detect whether the file transfer is complete due to network congestion, buffering, etc.
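Putting both points together, a sketch like this should behave better; the 10-second window is an assumption to tune for your network, and clearstatcache() addresses the caching noted above:

$file = 'thefile.pdf';

do {
    $before = filemtime($file);
    sleep(10); // 2 seconds is often too short on a congested network

    // filemtime() results are cached, so clear the cache before re-checking
    clearstatcache(true, $file);
    $after = filemtime($file);
} while ($before !== $after);

rename($file, '/folder2/thefile.pdf');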
Transfer it as a temporary name or to a different folder, then rename/copy it to the correct folder after the transfer is complete.

Detect with PHP if a file is being uploaded, or is open

I have a PHP script that opens a local directory in order to copy and process some files. But these files may be incomplete, because they are being uploaded by a slow FTP process, and I do not want to copy or process any files which have not been completely uploaded yet.
Is it possible in PHP to find out if a file is still being copied (that is, read from) or written to?
I need my script to process only those files that have been completely uploaded.
The FTP process now uploads files in parallel, and it takes more than one second for each file's size to change, so this trick is not working for me anymore. Any other methods to suggest?
Do you have script control over the FTP process? If so, have the script that's doing the uploading upload a [FILENAME].complete file (a blank text file) after the primary upload completes, so the processing script knows the file is complete when there's a matching *.complete file there as well.
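A minimal sketch of that check on the processing side; the upload directory, file extension, and processFile() helper are all hypothetical:

foreach (glob('/path/to/uploads/*.pdf') as $file) {
    // only touch files whose matching .complete marker has arrived
    if (file_exists($file . '.complete')) {
        processFile($file);          // hypothetical processing function
        unlink($file . '.complete'); // consume the marker
    }
}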
+1 to @MidnightLightning for his excellent suggestion. If you don't have control over the process, you have a couple of options:
If you know what the final size of the file should be then use filesize() to compare the current size to the known size. Keep checking until they match.
If you don't know what the final size should be, it gets a little trickier. You could use filesize() to check the size of the file, wait a second or two, and check it again. If the size hasn't changed, the upload should be complete. The problem with this second method is that if your file upload stalls for whatever reason, it could give you a false positive; so the time to wait is key.
You don't specify what kind of OS you're on, but if it's a Unix-type box, you should have fuser and/or lsof available. fuser will report on who's using a particular file, and lsof will list all open files (including sockets, fifos, .so's, etc...). Either of those could most likely be used to monitor your directory.
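For example (the paths are hypothetical):

# who still has this particular file open?
fuser -v /path/to/uploads/incoming.pdf

# list every open file under the upload directory
lsof +D /path/to/uploads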
On the Windows end, there are a few free tools from Sysinternals that do the same thing; handle might do the trick.

PHP: MySQL query skipped/ignored after large file uploads?

Weird... Originally I thought that PHP was not correctly handling large file uploads (800 MB-2 GB).
Now I've figured out that the file is moved correctly, but the MySQL query that enters the file info into the DB seems to be skipped when large files are uploaded.
The MySQL query is executed as it should be when small files are uploaded; this problem only seems to arise with larger files.
Also, the MySQL queries run before the file is moved seem to work fine.
Process:
Wait for uploaded file,
check file size,
get md5 of file,
move file from temp folder to uploads folder,
if moving the file is successful, then run the MySQL query.
The file is where it should be, but the query isn't executed.
Should I put a 10-second delay between when the file is moved and when the MySQL query is called?
If the file move is working, then the problem is most likely in the last step: the MySQL query. Max out the error level with error_reporting(E_ALL) and set up a PHP error log; this will record any MySQL warnings and any other problems. Log the SQL query you're trying to execute. Does it work from a MySQL client?
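A sketch of that setup; the connection details, log path, table, and columns are hypothetical placeholders:

error_reporting(E_ALL);
ini_set('log_errors', '1');
ini_set('error_log', '/path/to/php-error.log'); // hypothetical log path

$db = new mysqli('localhost', 'user', 'pass', 'uploads');
if ($db->connect_error) {
    error_log('Connect failed: ' . $db->connect_error);
    exit;
}

$name = $_FILES['upload']['name'];             // hypothetical form field
$md5  = md5_file('/path/to/uploads/' . $name); // hypothetical upload path

$stmt = $db->prepare('INSERT INTO files (name, md5) VALUES (?, ?)');
if ($stmt === false) {
    error_log('Prepare failed: ' . $db->error);
} else {
    $stmt->bind_param('ss', $name, $md5);
    if (!$stmt->execute()) {
        error_log('Query failed: ' . $stmt->error);
    }
}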
It sounds like large files are not being moved from the temp folder successfully. Can you confirm this by seeing if the file is still there in /tmp, or wherever your temp folder is?
