I am using the Zend Framework's Zend_Service_Amazon_S3 class to download files from Amazon S3 within a CodeIgniter application.
My script first checks that the user is authorised to download the file, then attempts to force-download it from the S3 servers. I have it working when the download is a small test zip (661 bytes), but when I try to download one of the large video zips (150 MB+) I just get a blank screen.
The code is as follows:
$this->load->helper('download');
$s3 = new Zend_Service_Amazon_S3();
force_download($video->filename(), $s3->getObject($video->s3_object_path()));
Where $video is an instance of a model I have representing the videos.
Size seems to be the issue here; that is the only difference I see between the two files.
Ideally I would like the data from S3 to bypass the server the application is hosted on. The force_download function in CodeIgniter sets the "Content-Disposition: attachment;" header. From reading other related posts on here, this should bypass the server, right?
You should turn on error reporting and see what the actual error is, or check your error logs. My guess is that the script may be timing out due to the large file. Try adding set_time_limit(0); right before the call to force_download().
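For reference, a minimal sketch of that suggestion applied to the code above (the memory_limit bump is my own assumption, since getObject() returns the whole object as a string and a 150 MB+ file may also hit the memory limit):
set_time_limit(0);                // prevent the script from timing out on large downloads
ini_set('memory_limit', '512M');  // assumption: getObject() loads the whole file into memory

$this->load->helper('download');
$s3 = new Zend_Service_Amazon_S3();
force_download($video->filename(), $s3->getObject($video->s3_object_path()));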
I wish to upload a video file stored on the mobile device on which the app is installed to a folder on my server (run with PHP). I am trying to use Alamofire to do this, and its documentation says I should NOT use multipart form data for this as it could be too slow/memory intensive. Instead, as recommended in the Alamofire documentation, I am using this code to upload the video file directly to the server:
AF.upload(videoURL, to: "[string of url of my server folder]")
Here, videoURL is the local url of the video (an mp4 file) on the phone.
Now, as for the URL to my server: if I just put in the path to the folder, the return string I get just seems to scrape the HTML code of the web page you get when you paste the link into a browser, and the file is not uploaded.
Then, if I write a path to a new file within the folder that doesn't already exist, the error message just says:
The requested URL was not found on this server.
Lastly, I have tried passing in the URL of an mp4 file that already exists in the server folder, hoping that when I upload to this URL the existing video will be replaced by my video file. However, what gets returned from the server is a very (very) long string (making Xcode very laggy) that I believe is some representation of the video already at that URL, and on the server the video has not been replaced.
So, I have tried these three different possibilities, but none of them seem to get the file uploaded to the server. Do you have any idea what the problem is? Do I need to do something server side, with PHP code, or is it fine as it is just trying to upload the video like this? According to what I could find in the documentation, this seems to me like it should be ok.
Link to documentation: https://github.com/Alamofire/Alamofire/blob/master/Documentation/Usage.md
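For context, this is roughly what I imagine the receiving script would have to look like if the raw request body is the file itself (a hypothetical upload.php that I do not actually have yet; the uploads/ folder name is made up and assumed to exist and be writable):
<?php
// Hypothetical upload.php: AF.upload(fileURL, to:) sends the raw file as the request body
// (no multipart envelope), so the server reads php://input rather than $_FILES.
$target = __DIR__ . '/uploads/' . uniqid('video_', true) . '.mp4';

$in  = fopen('php://input', 'rb');
$out = fopen($target, 'wb');
stream_copy_to_stream($in, $out);   // stream to disk without loading the whole file into memory
fclose($in);
fclose($out);

http_response_code(201);
echo json_encode(['stored_as' => basename($target)]);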
I want to generate a thumbnail image for a PDF file stored in a Google Cloud Storage bucket using Imagick and PHP.
I deploy my application on the Google App Engine (GAE) standard environment.
The problem is that I keep getting this error:
Fatal error: Uncaught exception 'ImagickException' with message
'UnableToWriteBlob `magick--1noB3XBwJhgfn': Read-only file system
I know that the filesystem the application is deployed to is not writable, but I need a way to achieve this. This is my code:
<?php
putenv('MAGICK_TEMPORARY_PATH=' . sys_get_temp_dir());
$imagick = new Imagick();
// get the content of the PDF from its URL
$imagenblob = file_get_contents('http://www.pdf995.com/samples/pdf.pdf');
// read the content and load it into the Imagick instance
$imagick->readImageBlob($imagenblob);
// point to the first page, because I want a thumbnail of the first page
$imagick->setIteratorIndex(0);
// set the image format
$imagick->setImageFormat('png');
// resize the image
$imagick->resizeImage(320, 320, Imagick::FILTER_LANCZOS, 1, true);
// return the result
header("Content-Type: image/png");
$base64 = 'data:image/png;base64,' . base64_encode($imagick);
exit($base64);
Maybe I could change the directory that Imagick uses for writing, but I was not able to achieve this.
ImageMagick requires a temp directory to write artifacts during the delegate decoding/encoding process. I'm not familiar with Google App Engine, but the documentation suggests tempnam() and sys_get_temp_dir() should be used.
putenv('MAGICK_TEMPORARY_PATH='.sys_get_temp_dir());
$imagick = new Imagick();
// ...
Also note that Ghostscript is used by ImageMagick for PDF decoding. Verify that the gs binary is installed and available when working with PDF files.
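One quick way to sanity-check the delegate (a sketch, not something the original poster has necessarily run):
// If the Imagick build reports no PDF format support, the Ghostscript delegate is
// missing or not on the PATH.
if (count(\Imagick::queryFormats('PDF')) === 0) {
    error_log('No PDF support in this Imagick build; check that Ghostscript (gs) is installed.');
}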
You're hitting one of the restrictions of the standard environment sandbox:
An App Engine application cannot:
write to the filesystem. PHP applications can use Google Cloud Storage for storing persistent files. Reading from the filesystem is allowed, and all application files uploaded with the application are available.
One option (more costly) would be to use the flexible environment, which doesn't have such a restriction.
Another approach would be to try to configure Imagick not to use intermediate/temporary files (no clue if it supports that) or to use in-memory file emulation (I don't know if PHP supports that, I'm a Python user) and specify that via Imagick::setFilename.
Another thing to try would be using setFormat instead of setImageFormat; the notes on setImageFormat suggest that it might be possible to make the transformations before/without writing to a file:
To set the format of the entire object, use the Imagick::setFormat method. E.g. load TIFF files, then use setFormat('pdf') on the Imagick object, then writeImagesFile('foo.pdf') or getImagesBlob().
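A sketch of what that suggestion might look like against the sample PDF from the question (whether this actually avoids the temporary-file write on GAE standard is exactly what would need testing):
$imagick = new Imagick();
$imagick->readImageBlob(file_get_contents('http://www.pdf995.com/samples/pdf.pdf'));
$imagick->setIteratorIndex(0);       // first page only
$imagick->setFormat('png');          // format on the whole object, per the quoted docs
$imagick->resizeImage(320, 320, Imagick::FILTER_LANCZOS, 1, true);
header('Content-Type: image/png');
echo $imagick->getImageBlob();       // stay in memory; never call writeImage()/writeImages()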
There's no real way to make this work due to the two opposing forces at play:
1) Google App Engine Standard's sandbox limitations
2) Imagick apparently needing local file system access (at least temporarily) in order to work.
So if you can't change how Imagick works, then the only remaining solution is to not use GAE Standard. You can either use GAE Flex or Google Compute Engine. I don't know why Flex is not an option for you; you don't need to move the whole project over. You can just port this part over as a microservice on GAE Flex in the same project. The sole function of the service would be to process the image and produce the thumbnail. You can then place the thumbnail in a Google Cloud Storage bucket for the rest of your app (in GAE Standard) to use.
To anyone facing the same problem: I solved mine by creating a new Google Compute Engine instance with PHP installed on it. I created a function that converts the PDF to an image using Imagick and returns the response as a base64 stream.
Imagick does not work well with PDF files on GAE.
I'm trying to archive a big file using PHP and send it to the browser for download. The problem is the file is located on a remote machine and the only way to get it is via HTTP. So imagine this is my file: https://dropboxcontent.com/user333/3yjdsgf/video1.mp4
It's a direct link and I can download the file using wget, curl, anything. When a user wants to download it, I first fetch the file to the server, then zip it up and then send it to the user. Well, if the file is really large, the user has to sit there waiting for the server to download it before they see the download dialog box in their browser. Is there a way for me to start the download of the file https://dropboxcontent.com/user333/3yjdsgf/video1.mp4 (let's say I'm downloading it into a local /tmp/video.mp4) and simultaneously start putting it into an archive and streaming it into the user's browser?
I'm using this library to zip it up: https://github.com/barracudanetworks/ArchiveStream-php, which works great, but the bottleneck is still fetching the file to the server's local filesystem.
Here is my code:
$f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4");
$zip->add_file('big/hello.mp4', $f);
The problem is line $f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4"); takes too long if the file is really big.
As suggested in the comments of the original post by Touch Cat Digital Inc, I found the answer here: https://stackoverflow.com/a/6914986/1927991
A chunked stream of the remote file was the answer. Very clever.
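For anyone landing here, the core of that idea looks roughly like this (a simplified sketch of the linked answer; in my real code the chunks are handed to the zip library rather than echoed straight to the browser, and allow_url_fopen must be enabled):
$remote = fopen('https://dropboxcontent.com/user333/3yjdsgf/video1.mp4', 'rb');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="video1.mp4"');
while (!feof($remote)) {
    echo fread($remote, 8192);   // pass the file through in 8 KB chunks
    flush();                     // push each chunk to the client immediately
}
fclose($remote);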
I'm trying to serve a file for download to a user, and I'm having trouble with fpassthru. The function I'm using to download a file is:
http://pastebin.com/eXDpgUqq
Note that the file is successfully created from a blob, and is in fact the file I want the user to download. The script exits successfully, and reports no errors, but the file is not downloaded. I can't for the life of me think what's wrong.
EDIT: I removed the error suppression from fopen(), but it still reports no error. Somehow the data in the output buffer is never being told to be downloaded by the browser.
I tried your code (without the blob part), and it worked fine. I can download a binary file. Based on my experience, here are some things to check:
Has the file been completely saved before you initiate the reading? Check the return value of file_put_contents.
How large is the file? fpassthru reads the whole file into memory. If the file is too large, memory might be insufficient. Please refer to http://board.phpbuilder.com/showthread.php?10330609-RESOLVED-php-driven-file-download-using-fpassthru for more information.
Instead of downloading the file to the local server (reading the whole file into the server's memory and letting the client download the file from the server), you can create an SAS URL and simply redirect the browser to it. Azure will take care of the download automatically. You may want to refer to http://blogs.msdn.com/b/azureossds/archive/2015/05/12/generating-shared-access-signature-sas-using-php.aspx for a sample.
I was able to download the file by passing a stream obtained with the Azure API directly to fpassthru, without creating a file. Unfortunately, I can't show the code because it belongs to a project that I have finished working on and the code is no longer available to me.
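From memory it was roughly along these lines (the Azure SDK class and method names here are my best recollection and should be treated as assumptions, not the original code):
use MicrosoftAzure\Storage\Blob\BlobRestProxy;

$connectionString = getenv('AZURE_STORAGE_CONNECTION_STRING');  // assumption: connection string from env
$blobClient = BlobRestProxy::createBlobService($connectionString);
$blob       = $blobClient->getBlob('my-container', 'my-file.bin');

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="my-file.bin"');
fpassthru($blob->getContentStream());   // stream the blob straight to the client, no temp file
exit;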
Solved using this post: http://www.designedbyaturtle.co.uk/2013/direct-upload-to-s3-with-a-little-help-from-jquery/
Edit: I'm looking for a way to directly upload a big zip file from the client side to Amazon S3 without first uploading it to my server.
I'm using a small server for my web application with the Laravel framework.
Some users are trying to upload big files to my server (around 300-400 MB), and my server is too weak and my bandwidth too low for them to be able to finish uploading.
I want the file to be uploaded directly to Amazon S3; from browsing around, I don't think this can be done with the Laravel SDK for Amazon.
I installed the official SDK and I'm trying to do it as suggested in this post:
uploading posted file to amazon s3
But I'm not really sure how to actually send my file to Amazon S3. What should I put instead of 'STRING OF YOUR FILE HERE'?
$result = $client->upload(
'BUCKET HERE',
'OBJECT KEY HERE',
'STRING OF YOUR FILE HERE',
'public-read' // public access ACL
);
I'm getting my file like this:
$myFile = Input::file('file');
Putting $myFile instead of 'STRING OF YOUR FILE HERE' doesn't work.
I solved the problem using this post: http://www.designedbyaturtle.co.uk/2013/direct-upload-to-s3-with-a-little-help-from-jquery/
You can upload files directly from the client side to S3, but it's a security risk as you can't validate the file, plus giving anyone access to your S3 buckets is just generally a bad idea. I would still look at uploading via your Laravel application.
All files uploaded to PHP are stored on your server, most likely in the /tmp directory on Linux and Mac machines. There's no way to avoid that; the file needs to be stored somewhere. You obtain an object representing the file using Input::file('file').
The file is an instance of Symfony\Component\HttpFoundation\File\UploadedFile, which exposes a getRealPath() method. We can then use fopen() to open the file as a stream, which the SDK's upload() method accepts in place of a string. So to upload it to S3, use:
// $client is an Aws\S3\S3Client instance and $file is the result of Input::file('file')
$client->upload(
    'bucket',
    'key',
    fopen($file->getRealPath(), 'r'),
    'public-read'
);
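For completeness, a fuller sketch of the surrounding context, assuming AWS SDK for PHP v3 with credentials resolved from the environment; the bucket name, region, and object key below are placeholders:
use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',                         // your bucket's region
]);

$file = Input::file('file');                          // Symfony UploadedFile instance

$result = $client->upload(
    'your-bucket-name',
    'uploads/' . $file->getClientOriginalName(),      // object key
    fopen($file->getRealPath(), 'r'),                 // stream the file instead of a string
    'public-read'
);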