After uploading a file to a GCS bucket that is publicly readable, when I access the file directly (http://storage.googleapis.com//image.png) the browser (Chrome) downloads the file instead of displaying it.
In the Inspector, I can see it is being served with a content type of binary/octet-stream.
How do I make sure the file has the correct content type when it is retrieved?
When using the GCS PHP Filesystem Library, you can set the Content-Type of the uploaded object using the file options parameter. There is an example of it in the documentation:
$options = ['gs' => ['Content-Type' => 'text/plain']];
$ctx = stream_context_create($options);
file_put_contents('gs://my_bucket/hello.txt', 'Hello', 0, $ctx);
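For the case in the question (an image in a public bucket), the same stream context approach applies with the image's MIME type. A minimal sketch; the bucket, object, and local file names here are placeholders, not taken from the question:
$options = ['gs' => ['Content-Type' => 'image/png']];
$ctx = stream_context_create($options);
// Placeholder names: upload the PNG with an explicit image/png content type.
file_put_contents('gs://my_bucket/image.png', file_get_contents('/path/to/image.png'), 0, $ctx);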
I'm trying to download files from a Google Drive link directly from Google's servers to my web server, to avoid the 100 max size limit on PHP POST.
<?php
$link = $_POST["linkToUpload"];
$upload = file_put_contents("uploads/test".rand().".txt", fopen($link, 'r'));
header("Location: index.php");
?>
Inserting a normal link like http://example.com/text.txt works fine. The problem comes when linking Google Drive: https://drive.google.com/uc?authuser=0&id=000&export=download. This is a direct link from Google Drive, but it doesn't work. So I tried the link I obtained by downloading the file locally, https://doc-08-bo-docs.googleusercontent.com/docs/securesc/000/000/000/000/000/000?e=download, and it's still not working. Do you think Google is trying to prevent server-to-server copies? Or is there another method to do it?
If you want to fetch files with your own application, you should use the API (Application Programming Interface) to get them.
Have a look at the file download documentation for Google Drive
Example download snippet in PHP:
$fileId = '0BwwA4oUTeiV1UVNwOHItT0xfa2M';
$response = $driveService->files->get($fileId, array(
'alt' => 'media'));
$content = $response->getBody()->getContents();
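For completeness, $driveService in that snippet needs to be an authenticated Google_Service_Drive client. Here is a minimal sketch assuming the google/apiclient library and a service account; the credentials file name and the destination path are placeholders:
require 'vendor/autoload.php'; // google/apiclient

// Build an authenticated Drive client (credentials file name is an assumption).
$client = new Google_Client();
$client->setAuthConfig('service-account.json');
$client->addScope(Google_Service_Drive::DRIVE_READONLY);
$driveService = new Google_Service_Drive($client);

// Download the file contents and write them to the web server, as in the question.
$fileId = '0BwwA4oUTeiV1UVNwOHItT0xfa2M';
$response = $driveService->files->get($fileId, array('alt' => 'media'));
file_put_contents('uploads/test' . rand() . '.txt', $response->getBody()->getContents());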
I need to serve a file stored on an FTP server to my web portal's end user.
I create a local temp file, download the FTP file into that temporary file, and when it finishes I send it with Symfony's file HTTP response:
$tempPath = tempnam(sys_get_temp_dir(), 'ftp');
$tempFile = fopen($tempPath, 'a+');
$log = "myLog";
$pass = "myPwd";
$conn_id = ftp_connect('my.ftp.server');
$ftp_login = ftp_login($conn_id, $log, $pass);
// $path is the remote file path on the FTP server
ftp_fget($conn_id, $tempFile, $path, FTP_BINARY, 0); // FTP_BINARY: the file is a zip, not text
ftp_close($conn_id);
fclose($tempFile);
return $app->sendFile($tempPath, 200, array('Content-Type' => 'application/force-download', 'Content-disposition' => "attachment; filename=myFile.zip"));
It works, but I would like to do better (ensure temp file deletion, improve performance, ...).
I see that Symfony provides an HTTP streamed response helper, so I imagine sending the FTP file without storing it on the web server's hard drive.
To do that, I think I need to connect the ftp_fget() function (or another FTP function) to a PHP function that can output to php://output or standard output (I don't really know this area of PHP).
I found the readfile() function, but it takes a filename as argument, not a resource like ftp_fget() needs... Is there another function compatible with ftp_fget, or another way to do it?
Your problem consists of two parts:
Reading an FTP file as a stream (see an example with fread(): "PHP: How do I read a .txt file from FTP server into a variable?")
Streaming a Response in Symfony2
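Putting the two together, here is a minimal sketch that streams the FTP file straight to php://output inside a Symfony StreamedResponse, so no temp file is written to disk. The host, credentials, and remote path are placeholders based on the question:
use Symfony\Component\HttpFoundation\StreamedResponse;

return new StreamedResponse(
    function () {
        $out = fopen('php://output', 'wb');        // the HTTP response body
        $conn_id = ftp_connect('my.ftp.server');   // placeholder host
        ftp_login($conn_id, 'myLog', 'myPwd');     // placeholder credentials
        ftp_fget($conn_id, $out, '/remote/myFile.zip', FTP_BINARY); // write FTP data straight to the output
        ftp_close($conn_id);
        fclose($out);
    },
    200,
    array(
        'Content-Type' => 'application/zip',
        'Content-Disposition' => 'attachment; filename="myFile.zip"'
    )
);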
I have written a Symfony2 (PHP MVC framework) script to download a zip file from the server, but the download stops midway. I have increased max_execution_time in the Apache configuration, but the problem persists.
Does anyone have a quick fix for this?
Thanks in advance.
It seems like you may have an issue with a large file (downloading an archive of videos). You should use a StreamedResponse. This way you don't have to hold the entire contents of the file in memory; it just streams to the client. The way you are currently doing it loads the whole file into memory before the download can start, and you can see why that could be a problem. Here is a simple example of how you can stream a file to the client:
use Symfony\Component\HttpFoundation\StreamedResponse;

$path = "/usr/www/users/jjdqlo/Wellness/web/yoga_videos/archive.zip";
return new StreamedResponse(
function () use ($path) { // first param is a callback, where you do the readfile()
readfile($path);
},
200, // second param is the http status code
array( // third param is an array of header settings
'Content-Disposition' => 'attachment;filename="archive.zip"',
'Content-Type' => 'application/zip'
)
);
Give this a shot. Assuming the problem is because of file size, this should solve the issue.
I just can't get this code to work. I'm getting an image from a URL and storing it in a temporary folder so I can upload it to a bucket on Google Cloud Storage. This is a CodeIgniter project. This function is within a controller and is able to get the image and store it in the project root's 'tmp/entries' folder.
Am I missing something? The file just doesn't upload to Google Cloud Storage. I went to the Blobstore Viewer in my local App Engine dev server and noticed that there is a file, but it's empty. I want to be able to upload this file to Google's servers from my dev server as well. The dev server seems to override this option and save all files locally. Please help.
public function get()
{
    // Fetch the source image and save a local copy under the project root.
    $filenameIn = 'http://upload.wikimedia.org/wikipedia/commons/1/16/HDRI_Sample_Scene_Balls_(JPEG-HDR).jpg';
    $filenameOut = FCPATH . '/tmp/entries/1.jpg';
    $contentOrFalseOnFailure = file_get_contents($filenameIn);
    $byteCountOrFalseOnFailure = file_put_contents($filenameOut, $contentOrFalseOnFailure);

    // Write the same contents to GCS with an explicit Content-Type.
    $options = [ "gs" => [ "Content-Type" => "image/jpeg" ]];
    $ctx = stream_context_create($options);
    file_put_contents("gs://my-storage/entries/2.jpg", $contentOrFalseOnFailure, 0, $ctx);
    echo 'Saved the Image';
}
As you noticed, the dev app server emulates Cloud Storage locally. So, this is the intended behaviour-- and it lets you test without modifying your production storage.
If you run the deployed app you should see the writes actually going to your GCS bucket.
I am using the Google Visualization API for my PHP-based website. I am planning to use TCPDF to provide a PDF version of the same images that I am generating with the Google Visualization API.
Is there any way to save these visualizations on my server so that I can use them directly in TCPDF?
If not, can I automatically take screenshots of the images and use them in TCPDF?
The reason I am not manually taking screenshots and saving them is that all the values in the visualizations are dynamic and change every day, so I am looking for something automatic. Suggestions please?
The result of the Google Visualization API can be an image or Flash (for interactive charts). If your result is an image (the most common case), you can download it to a folder on your server and add it to TCPDF from that local path.
You can use cURL, for example:
$ch = curl_init($remoteURL); // get the image from this url
$fp = fopen($localFile, "wb"); //put the image in this server path
// set URL and other appropriate options
$options = array(CURLOPT_FILE => $fp,
CURLOPT_HEADER => 0,
CURLOPT_FOLLOWLOCATION => 1,
CURLOPT_TIMEOUT => 60); // 1 minute timeout (should be enough)
curl_setopt_array($ch, $options);
// Execute curl
curl_exec($ch);
curl_close($ch);
// close the file
fclose($fp);
Once the image is in a temporary path on your server, you can add it to the TCPDF document using the "Image" method and the path to the file you saved on your server.
The server path must be writable by the web server. This is especially important in *nix environments.
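Once downloaded, adding the image to the PDF is a single call. A minimal TCPDF sketch; the library path, coordinates, and output filename are arbitrary, and $localFile is the path the cURL snippet above wrote to:
require_once 'tcpdf/tcpdf.php'; // path to the TCPDF library is an assumption

$pdf = new TCPDF();
$pdf->AddPage();
$pdf->Image($localFile, 15, 30, 120); // x, y, width in user units
$pdf->Output('charts.pdf', 'I');      // 'I' sends the PDF inline to the browser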
Here are more ways to do the file download:
http://www.php-mysql-tutorial.com/wikis/php-tutorial/reading-a-remote-file-using-php.aspx