Exception Received Retry Command Error:Unexpected response (): - php

We have a cron job that runs a PHP script that processes XML files, including processing images (pulling them from a web address, resizing them, and then uploading them to Cloud Files).
We are finding that after 220 or so images we get an error: Exception Received Retry Command Error:Unexpected response ():
We have coded the script to try 5 times to upload the image (unfortunately it still fails) and then move on to the NEXT IMAGE.
Unfortunately it then fails on the next image, and so on.
The container we are uploading to is not full, and we only upload 1 image at a time, so we are below the 100/sec restriction. The files are not large, for example: http://images.realestateview.com.au/pics/543/10157543ao.jpg
We then tried running the script again on our server with the image that failed, and it uploaded successfully along with the other images.
We have no idea why this is happening. Rackspace advise it is an issue with the script or the cron, but we are not convinced.
Happy to post the script if it helps.

Are you doing the 5 retries with any backoff time, or just as fast as possible? If not, add exponential backoff to the retry attempts.
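For example, here is a minimal sketch of a retry loop with exponential backoff; upload_to_cloudfiles() is a hypothetical stand-in for whatever call your script actually makes to push the image to Cloud Files:

// Minimal sketch: retry an upload with exponential backoff.
function upload_with_backoff($localPath, $remoteName, $maxAttempts = 5)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            upload_to_cloudfiles($localPath, $remoteName); // hypothetical helper
            return true;                                   // success, stop retrying
        } catch (Exception $e) {
            if ($attempt === $maxAttempts) {
                error_log("Giving up on {$remoteName}: " . $e->getMessage());
                return false;                              // let the caller move to the next image
            }
            sleep(pow(2, $attempt - 1));                   // wait 1s, 2s, 4s, 8s before retrying
        }
    }
    return false;
}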

Related

PHP: how to keep a script running after the connection is cancelled

I'm having a problem with file upload code.
The user uploads files through the site (large files, like WeTransfer).
The upload percentage is shown with Ajax.
When the upload completes, a notice is shown.
But the problem starts here.
Because the files are huge, it takes time to move them to the appropriate folder and zip them.
If the user closes the browser during this process, the process cannot be completed.
How do I ensure that the operation continues even if the user closes the browser?
I tried ignore_user_abort, but I was not successful.
Send a response to the browser saying that you are moving the file, and either put the work in a queue and execute it as a background job, or just do it in your script. That should help: https://stackoverflow.com/a/5997140/4099089
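A minimal sketch of the background-job approach, assuming a hypothetical worker script move_and_zip.php that does the moving and zipping:

// Answer the Ajax request right away and hand the slow move/zip work
// to a detached background process that outlives the browser connection.
$uploadedFile = '/tmp/upload_abc123';   // example path where PHP stored the upload

$cmd = 'php ' . escapeshellarg(__DIR__ . '/move_and_zip.php')   // hypothetical worker script
     . ' ' . escapeshellarg($uploadedFile)
     . ' > /dev/null 2>&1 &';           // redirect output and detach with &
exec($cmd);

echo json_encode(['status' => 'processing']);   // the user can now close the browser safely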

Upload to AWS S3 limits

I'm working with Amazon S3 and the AWS SDK for PHP. Is there a file size limit for uploading files? Is there a limit on simultaneous file uploads?
It gives me a lot of these errors when I try to send 20 files of 200MB each at a time to my bucket:
RequestTimeTooSkewedException: AWS Error Code: RequestTimeTooSkewed, Status Code: 403, AWS Request ID: 0CE24AEDE4162AC9, AWS Error Type: client, AWS Error Message: The difference between the request time and the current time is too large.

RequestTimeoutException: AWS Error Code: RequestTimeout, Status Code: 400, AWS Request ID: 913367E51F2BC5AD, AWS Error Type: client, AWS Error Message: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.
Or is the problem in my code or in PHP?
It seems like your network connection may not be fast enough to upload a file of that size in one shot. RequestTimeTooSkewedException errors can happen due to out-of-sync clocks, as helloV mentioned. However, if the upload takes too long, the time the request was signed and the time the request completes may be more than 15 minutes apart, which would also cause this error. I suspect this may be happening, because the second error, RequestTimeoutException, most likely means your connection to S3 is too slow.
You either need a better connection or you should consider using the multipart upload API (see the sketch below). There are helpers for this: http://docs.aws.amazon.com/aws-sdk-php/guide/latest/service-s3.html#uploading-large-files-using-multipart-uploads
As of now the size limit is 5 TB and your 200MB is well within the max size.
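For reference, a minimal sketch of a multipart upload using the MultipartUploader helper from the newer AWS SDK for PHP v3 (the guide linked above covers the SDK 2 equivalent); the region, bucket, key, and file path are placeholders:

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// Placeholder client configuration.
$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// Upload the file in parts so a single slow request cannot time out the whole transfer.
$uploader = new MultipartUploader($s3, '/path/to/200mb-file.bin', [
    'bucket' => 'my-bucket',
    'key'    => 'my-key',
]);

try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo "Upload failed: " . $e->getMessage() . "\n";
}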
The problem is your local box's system clock is out of sync. Sync up with an NTP server or set it manually and the problem will go away.
For the second error, it is possible you are specifying a file size that is greater than the actual size. If your file is 200MB, you may be passing a value greater than 200MB to the API.
Is it giving this error?

<Error>
  <Code>RequestTimeTooSkewed</Code>
  <Message>The difference between the request time and the current time is too large.</Message>
  <RequestTime>Wed, 08 Apr 2015 11:50:58 GMT</RequestTime>
  <ServerTime>2015-04-07T11:51:03Z</ServerTime>
  <MaxAllowedSkewMilliseconds>900000</MaxAllowedSkewMilliseconds>
  <RequestId>255BFB2CF3361F0D</RequestId>
  <HostId>DlpwnpVVH9aQku8qD8sAaO6tFCBlrfr9P2Sl3jbBd7FKXbzsJ1SJAwqROsNf+2qdSSKElK3mbus=</HostId>
</Error>

In the error you can see that the request time and the server time are different; they must be the same. So please check your local PC's time and date.

PHP script stops suddenly without any error

I have a PHP script that downloads files with direct link from a remote server that I own. Sometimes large files (~500-600 MB) and sometimes small files (~50-100 MB).
Some code from the script:
$links[0] = "file_1";
$links[1] = "file_2";
$links[2] = "file_3";

for ($i = 0; $i < count($links); $i++) {
    // download_file() downloads the file with cURL and returns the path
    // to the downloaded file on the local server
    $file_link = download_file($links[$i]);
    echo "Download complete";

    // move the downloaded file to some other location
    rename($file_link, "some other_path/...");
    echo "Downloaded file moved";

    echo "Download complete";
}
My problem is that if I download a large file and run the script from a web browser, it takes up to 5-10 minutes to complete; the script echoes up to "Download complete" and then dies completely. I always find that the file that was being downloaded before the script died is 100% downloaded.
On the other hand, if I download small files of 50-100MB from a web browser, or run the script from a command shell, this problem does not occur at all and the script completes fully.
I am using my own VPS for this and have not set any time limit on the server. There is no fatal error or memory overload problem.
I also used ssh2_sftp to copy files from the remote server, but the same problem occurs when I run it from a web browser. It always downloads the file, executes the next line, and then dies! Very strange!
What should I do to get over this problem?
To make sure you can download larger files, you will have to make sure that:
there is enough memory available for PHP
the maximum execution time limit is set high enough.
Judging from what you said about ssh2_sftp (I assume you are running it via PHP), your problem is the second one. Check your error logs to find out if that truly is your error. If so, simply increase the maximum execution time in your settings/php.ini and that should fix it (see the sketch below).
Note: I would encourage you not to let PHP handle these large files. Call some program (via system() or exec()) that will do the download for you as PHP still has garbage collection issues.
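A minimal sketch of both suggestions (raising the limits, or shelling out to an external downloader); the values, URL, and paths are placeholders:

// Option 1: raise the limits for this script.
ini_set('memory_limit', '512M');   // example value: enough memory for the transfer
set_time_limit(0);                 // 0 = no maximum execution time

// Option 2: let an external program do the transfer instead of PHP/cURL.
$url  = 'http://example.com/files/file_1.bin';   // placeholder link
$dest = '/tmp/file_1.bin';                       // placeholder local path
exec('wget -q ' . escapeshellarg($url) . ' -O ' . escapeshellarg($dest), $output, $exitCode);
if ($exitCode !== 0) {
    echo "External download failed with exit code {$exitCode}";
}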

PHP script on Amazon EC2 giving response 324 on browser

We have a script that downloads a CSV file. When we run this script on the command line from the EC2 console it runs fine; it downloads the file and sends a success message to the user.
But if we run it through a browser we get:
error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
When we check on the back end for the downloaded file, it is there, but the success message sent after the download is not received by the browser.
We are using cURL to download from a remote location with authentication. The user group and ownership of the folder is "ec2-user", and the folder has full rights, i.e. 777.
To summarize: the file is downloaded but at the browser end we are not getting any data or success message which we print.
P.S.: The problem occurs when the downloaded file size is 8-9MB; with a smaller file, say 1MB, it works. So either script execution time, download file size, or some EC2 instance config is blocking it from giving the browser a response. The same script works perfectly fine on our GoDaddy Linux VPS. We have already changed the max execution time for the script.
Sadly, this is a known problem without a good solution. There's a very long thread on the amazon forum here: https://forums.aws.amazon.com/thread.jspa?threadID=33427. The solution offered there is to send a keep-alive message to keep the connection from dying after 60 seconds. Not a great solution, but I don't think there's a better one unless Amazon fixes the problem, which doesn't seem likely given that the thread has been open for 3 years.
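A minimal sketch of the keep-alive workaround, assuming a hypothetical fetch_csv_chunk() helper that downloads one chunk per call and returns true when finished:

// Emit a byte of output periodically during the long download so the
// connection never looks idle for 60 seconds.
ignore_user_abort(true);
set_time_limit(0);

$done = false;
while (!$done) {
    $done = fetch_csv_chunk();   // hypothetical helper
    echo ' ';                    // keep-alive byte
    if (ob_get_level() > 0) {
        ob_flush();              // flush PHP's output buffer first, if one is active
    }
    flush();                     // then push the byte to the browser
}
echo 'File downloaded successfully';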

Using PHP to save remote image files to a local server, but not completely successfully

I know some people have already asked this, but my problem is with downloading remote image files (each file is smaller than 200KB). Some files won't be completely saved: some can't be saved at all, and some are saved but not 100%, so I see a grey shadow in the image. The worst part is that the error output is different every time. (Is it an internet problem?)
I tried the following methods to save the files:
file_get_contents
curl/GD
copy
They all can work, but I can't find a reliable method that saves the whole files.
These are the error messages I get:

failed to open stream: HTTP request failed! HTTP/1.0 408 Request Time-out (at the line with "copy")

Maximum execution time of 60 seconds exceeded (I increased the time)

My PHP program:
set_time_limit(60);
$imageArray = array(/* image array............ */);

for ($k = 0; $k < count($imageArray); $k++) {
    echo '<img src="' . $imageArray[$k] . '"><br/>';

    // copy the remote image into the local photo/ directory
    $isok = copy($imageArray[$k], dirname(__FILE__) . '/photo/item_' . ($k + 1) . '.jpg');
    if ($isok == true) {
        echo ' success!';
    } else {
        echo ' Fail';
    }
}
Most likely it's a network problem. Do the images load fine in the browser when you try? If they do, you can try running the code on your own machine and see if that helps.
But the most probable reason is the remote site you are downloading from: it can throttle you based on connections per time interval. Try sleeping between images, for example 5-6 seconds, and see if this helps.
Also try downloading smaller batches of images, 1-2 at a time, to see if that works.
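A minimal sketch of that idea, reusing the $imageArray and photo/ directory from the question; the 3 tries and 5-second pauses are example values only:

set_time_limit(0);   // copying many remote files can easily exceed 60 seconds

foreach ($imageArray as $k => $url) {
    $dest = dirname(__FILE__) . '/photo/item_' . ($k + 1) . '.jpg';

    $saved = false;
    for ($try = 0; $try < 3 && !$saved; $try++) {
        $saved = @copy($url, $dest);   // returns false on failure instead of stopping the script
        if (!$saved) {
            sleep(5);                  // back off before retrying the same image
        }
    }

    echo $saved ? ' success!' : ' Fail';
    sleep(5);                          // pause between images to avoid being throttled
}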
I noticed in your copy() that you have hardcoded .jpg into the destination path. Are you always downloading .jpg files? If you are downloading a .png or .gif and forcing it to .jpg, you might be causing issues there. Just a thought, to be honest.
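If that is the case, a minimal sketch of deriving the extension from the URL instead of forcing .jpg (falling back to jpg when the URL has no extension):

// Keep the source file's extension rather than hardcoding .jpg.
$path = parse_url($imageArray[$k], PHP_URL_PATH);      // drop any query string
$ext  = strtolower(pathinfo($path, PATHINFO_EXTENSION));
$ext  = $ext !== '' ? $ext : 'jpg';                    // default when the URL has no extension
$dest = dirname(__FILE__) . '/photo/item_' . ($k + 1) . '.' . $ext;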
