WOPI Host in PHP - Not able to view file

I have implemented the APIs required for a WOPI host. Since I am only trying to solve the document-viewing problem right now, I implemented the CheckFileInfo and GetFile endpoints only.
/wopi/files/{fileId}

$filePath = public_path('sample.docx');
$handle   = fopen($filePath, "r");
$size     = filesize($filePath);
$contents = fread($handle, $size);

return [
    "BaseFileName" => "sample.docx",
    "Size"         => $size,
    "OwnerId"      => 1,
    "UserId"       => 1,
    "Version"      => rand(),
    "FileUrl"      => url("sample.docx"),
];
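One detail worth double-checking when Office Online refuses to render the file: the Size reported by CheckFileInfo must match the exact byte count GetFile later serves, and Version should stay stable while the file is unchanged (rand() produces a new version on every request). A minimal sketch of a response builder along those lines; the string IDs and the filemtime()-based version are assumptions, not something from the question:

```php
<?php
// Sketch: build a CheckFileInfo response whose Size matches the bytes
// GetFile will serve, and whose Version only changes when the file does.
function checkFileInfo(string $filePath, string $baseFileName): array
{
    $size = filesize($filePath);

    return [
        "BaseFileName" => $baseFileName,
        "Size"         => $size,                         // must equal GetFile's byte count
        "OwnerId"      => "1",
        "UserId"       => "1",
        "Version"      => (string) filemtime($filePath), // stable until the file changes
    ];
}
```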
/wopi/files/{fileId}/contents

$file = public_path('sample.docx');
return new BinaryFileResponse($file);
BinaryFileResponse is from Symfony\Component\HttpFoundation\BinaryFileResponse.
I have tried multiple implementations to return "the full binary contents of the file" (To satisfy this: https://wopi.readthedocs.io/projects/wopirest/en/latest/files/GetFile.html)
But I always end up with the following error.
Side Note: We registered with Cloud Storage Program of Microsoft, with domain www.**world.com, and I hosted the app at test-wopi.**world.com

Related

Read the CSV file content

I want to read the contents of a CSV file using PHP and the Google Drive API v3.
I have the file ID and file name, but I am not sure how to read the file's contents.
$service = new Drive($client);
$results = $service->files->listFiles();

$fileId = "1I****************";
$file   = $service->files->get($fileId);
The Google Drive API is a file storage API: it lets you upload, download, and manage the storage of files.
It does not give you direct access to the contents of a file.
To read them, you need to download the file and open it locally.
Alternatively, since it's a CSV file, you could convert it to a Google Sheet and then use the Google Sheets API to access the data within the file programmatically.
Code for downloading a file with the Google Drive API would look something like this; the full sample can be found in large-file-download.php.
// If this is a POST, download the file
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
    // Determine the file's size and ID
    $fileId   = $files[0]->id;
    $fileSize = intval($files[0]->size);

    // Get the authorized Guzzle HTTP client
    $http = $client->authorize();

    // Open a file for writing
    $fp = fopen('Big File (downloaded)', 'w');

    // Download in 1 MB chunks
    $chunkSizeBytes = 1 * 1024 * 1024;
    $chunkStart = 0;

    // Iterate over each chunk and write it to our file
    while ($chunkStart < $fileSize) {
        $chunkEnd = $chunkStart + $chunkSizeBytes;
        $response = $http->request(
            'GET',
            sprintf('/drive/v3/files/%s', $fileId),
            [
                'query'   => ['alt' => 'media'],
                'headers' => [
                    'Range' => sprintf('bytes=%s-%s', $chunkStart, $chunkEnd)
                ]
            ]
        );
        $chunkStart = $chunkEnd + 1;
        fwrite($fp, $response->getBody()->getContents());
    }

    // Close the file pointer
    fclose($fp);
}

How to download the latest (most recently added) file from AWS S3 using PHP (Yii2)

I have a non-versioned S3 bucket (VersionId is null for all files), and the files have different names.
My current code is:
$path = $this->key . '/primary/pdfs/' . $id . '/';
$result = $this->s3->listObjects(['Bucket' => $this->bucket, 'Prefix' => $path])->toArray();

// get the last object from s3
$object = end($result['Contents']);
$key = $object['Key'];

$file = $this->s3->getObject([
    'Bucket' => $this->bucket,
    'Key'    => $key
]);

// download the file
header('Content-Type: application/pdf');
echo $file['Body'];
The above is incorrect: it returns the last object in the listing, which is not the most recently added file.
Do I need to use the API call below instead? If so, how do I use it?
$result = $this->s3->listObjectVersions(['Bucket' => $this->bucket,"Prefix" => $path])->toArray();
Since the VersionId of all files is null, there should be only one version of each file in the bucket.
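listObjectVersions won't help on an unversioned bucket; what the listing does expose is each object's LastModified timestamp, and sorting on it yields the newest object. A sketch of that idea; the array shape mirrors what listObjects returns in `Contents`, and the helper name is made up:

```php
<?php
// Sketch: pick the most recently modified object from a listObjects result.
// listObjects orders keys lexicographically, so sort by LastModified instead.
function latestObject(array $contents): array
{
    usort($contents, function ($a, $b) {
        return strtotime($a['LastModified']) <=> strtotime($b['LastModified']);
    });

    return end($contents); // newest object after the ascending sort
}
```

Its `Key` can then be fed to the existing getObject call in place of `end($result['Contents'])`.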

phpleague flysystem read and write to large file on server

I am using Flysystem with the IRON.io queue and am attempting to run a DB query that will return ~1.8 million records, processing 5000 at a time. Here is the error message I receive once file sizes reach 50+ MB:
PHP Fatal error: Allowed memory size of ########## bytes exhausted
Here are the steps I would like to take:
1) Get the data
2) Turn it into a CSV-appropriate string (i.e. implode(',', $dataArray) . "\r\n")
3) Get the file from the server (in this case S3)
4) Read that file's contents, append the new string, and re-write that content to the S3 file
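Step 2 on its own is cheap; a sketch of turning one 5000-row batch into a CSV chunk (the helper name is made up, and note that implode() does no quoting, so this assumes fields contain no commas or quotes):

```php
<?php
// Sketch of step 2: turn one batch of rows into a CSV string chunk.
// implode() does no quoting; use fputcsv() if fields may contain commas.
function rowsToCsvChunk(array $rows): string
{
    $chunk = '';
    foreach ($rows as $row) {
        $chunk .= implode(',', $row) . "\r\n";
    }
    return $chunk;
}
```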
Here is a brief run down of the code I have:
public function fire($job, $data)
{
    // First set the headers and write the initial file to the server
    $this->filesystem->write($this->filename, implode(',', $this->setHeaders($parameters)) . "\r\n", [
        'visibility' => 'public',
        'mimetype'   => 'text/csv',
    ]);

    // Loop to get new sets of data
    $offset = 0;
    while ($this->exportResult) {
        $this->exportResult = $this->getData($parameters, $offset);
        if ($this->exportResult) {
            $this->writeToFile($this->exportResult);
            $offset += 5000;
        }
    }
}

private function writeToFile($contentToBeAdded = '')
{
    $content = $this->filesystem->read($this->filename);

    // Append new data
    $content .= $contentToBeAdded;

    $this->filesystem->update($this->filename, $content, [
        'visibility' => 'public'
    ]);
}
I'm assuming this is NOT the most efficient approach, since every batch reads the whole file back into memory and rewrites it. I am going off of these docs:
PHPLeague Flysystem
If anyone can point me in a more appropriate direction, that would be awesome!
Flysystem supports read/write/update streams.
Please check the latest API: https://flysystem.thephpleague.com/api/
$stream = fopen('/path/to/database.backup', 'r+');
$filesystem->writeStream('backups/' . strftime('%G-%m-%d') . '.backup', $stream);

// Using writeStream you can also directly set the visibility
$filesystem->writeStream('backups/' . strftime('%G-%m-%d') . '.backup', $stream, [
    'visibility' => AdapterInterface::VISIBILITY_PRIVATE
]);

if (is_resource($stream)) {
    fclose($stream);
}

// Or update a file with stream contents
$filesystem->updateStream('backups/' . strftime('%G-%m-%d') . '.backup', $stream);

// Retrieve a read-stream
$stream = $filesystem->readStream('something/is/here.ext');
$contents = stream_get_contents($stream);
fclose($stream);

// Create or overwrite using a stream
$putStream = tmpfile();
fwrite($putStream, $contents);
rewind($putStream);
$filesystem->putStream('somewhere/here.txt', $putStream);

if (is_resource($putStream)) {
    fclose($putStream);
}
If you are working with S3, I would use the AWS SDK for PHP directly to solve this particular problem. Appending to a file is actually very easy using the SDK's S3 stream wrapper, and it doesn't force you to read the entire file into memory.
$s3 = \Aws\S3\S3Client::factory($clientConfig);
$s3->registerStreamWrapper();
$appendHandle = fopen("s3://{$bucket}/{$key}", 'a');
fwrite($appendHandle, $data);
fclose($appendHandle);

Download file from Amazon S3 with Laravel

I'm a little unsure as to how to launch a download of a file from Amazon S3 with Laravel 4. I'm using the AWS SDK for PHP.
$result = $s3->getObject(array(
    'Bucket' => $bucket,
    'Key'    => 'data.txt',
));

// temp file
$file = tempnam('../uploads', 'download_');
file_put_contents($file, $result['Body']);

$response = Response::download($file, 'test-file.txt');
//unlink($file);
return $response;
The above works, but I'm stuck with saving the file locally. How can I use the result from S3 correctly with Response::download()?
Thanks!
EDIT: I've found I can use $s3->getObjectUrl($bucket, $file, $expiration) to generate an access URL. This could work, but it still doesn't solve the problem above completely.
EDIT2:
$result = $s3->getObject(array(
    'Bucket' => $bucket,
    'Key'    => 'data.txt',
));

header('Content-Type: ' . $result['ContentType']);
header('Content-Disposition: attachment; filename="' . $fileName . '"');
header('Content-Length: ' . $result['ContentLength']);
echo $result['Body'];
Still don't think it's ideal, though?
The S3Client::getObject() method allows you to specify headers that S3 should use when it sends the response. The getObjectUrl() method uses the GetObject operation to generate the URL, and can accept any valid GetObject parameters in its last argument. You should be able to do a direct S3-to-user download with your desired headers using a pre-signed URL by doing something like this:
$downloadUrl = $s3->getObjectUrl($bucket, 'data.txt', '+5 minutes', array(
'ResponseContentDisposition' => 'attachment; filename="' . $fileName . '"',
));
If you want to stream an S3 object from your server, then you should check out the Streaming Amazon S3 Objects From a Web Server article on the AWS Developer Guide
This question is not answered fully. Initially it asked how to save a file locally on the server itself from S3 in order to make use of it.
For that, you can use the SaveAs option with the getObject method. You can also specify a version ID if you are using versioning on your bucket and want to make use of it.
$result = $this->client->getObject(array(
    'Bucket'    => $bucket_name,
    'Key'       => $file_name,
    'SaveAs'    => $to_file,
    'VersionId' => $version_id,
));
The accepted answer is somewhat outdated now. The following works with the v3 SDK.
$client->registerStreamWrapper();

$result = $client->headObject([
    'Bucket' => $bucket,
    'Key'    => $key
]);
$headers = $result->toArray();

header('Content-Type: ' . $headers['ContentType']);
header('Content-Disposition: attachment');

// Stop output buffering
if (ob_get_level()) {
    ob_end_flush();
}
flush();

// Stream the output
readfile("s3://{$bucket}/{$key}");

File from $_POST and Amazon S3

I have a problem with the Amazon S3 service and a web service written in PHP.
This web service receives a base64-encoded file from $_POST. I need to take this string and save it to an Amazon S3 bucket.
I haven't found the right solution to do that, and after a week of work I'm looking for help here.
//$file = 'kdK9IWUAAAAdaVRYdENvbW1lbnQAAAAAAENyZWF0ZWQgd'
$file = $_POST['some_file'];

$opt = array(
    'fileUpload' => base64_decode($file),
    'acl'        => AmazonS3::ACL_PUBLIC
);

$s3 = new AmazonS3(AWS_KEY, AWS_SECRET_KEY);
$response = $s3->create_object($bucket, $filename, $opt);
Thanks
Per the docs, fileUpload expects a URL or path, or an fopen resource.
fileUpload - string|resource - Required; Conditional - The URL/path for the file to upload, or an open resource. Either this parameter or body is required.
You should pass the decoded file data via the body parameter instead:
body - string - Required; Conditional - The data to be stored in the object. Either this parameter or fileUpload must be specified.
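Putting the two quoted doc entries together, a sketch of the corrected upload might look like this; the strict-mode decode helper is an addition of mine, and the create_object call (shown as a comment, matching the question's SDK) is an assumption about how it would be wired up:

```php
<?php
// Sketch: strictly decode the POSTed base64 payload, then pass it to S3
// via the 'body' option instead of 'fileUpload'.
function decodeUpload(string $encoded): string
{
    $decoded = base64_decode($encoded, true); // strict mode rejects malformed input
    if ($decoded === false) {
        throw new InvalidArgumentException('Payload is not valid base64');
    }
    return $decoded;
}

// Usage with the question's SDK (not executed here):
// $response = $s3->create_object($bucket, $filename, array(
//     'body' => decodeUpload($_POST['some_file']),
//     'acl'  => AmazonS3::ACL_PUBLIC,
// ));
```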
$imagestring is the base64-encoded string that was passed in the POST data. You can decode this image and write it to a file anywhere (in my case /photos). I used this on a school server, but if the Amazon server is similar enough, it could also work there.
$values = array();
foreach ($_POST as $k => $v) {
    $values[$k] = $v;
}

$imagestring = $values['stringEncodedImage'];
$img = imagecreatefromstring(base64_decode($imagestring));

if ($img != false) {
    echo 'making the file!';
    imagejpeg($img, 'photos/' . $file_name);
}
