I've written a REST interface for my ownCloud app. I have a method getFileFromRemote($path) which should return a JSON object with the file content.
Unfortunately this only works when the file specified in $path is a plain-text file. When I call the method for an image or a PDF, the status code is 200 but the response is empty. I use file_get_contents to read the file content.
Note: I know ownCloud has a WebDAV interface, but I want to solve this with REST only.
EDIT
This is the server-side code (ownCloud):
public function synchroniseDown($path)
{
    $this->_syncService->download(array($path)); // get latest version
    $content = file_get_contents($this->_homeFolder . urldecode($path));
    return new DataResponse(['path' => $path, 'fileContent' => $content]);
}
The first line downloads the content on the ownCloud server and works correctly.
You probably have to base64_encode your file content so that json_encode/decode can handle it properly:
return new DataResponse([
    'path' => $path,
    'fileContent' => base64_encode($content) // convert binary data to alphanumeric text
]);
and then, on the receiving side, you will always have to do:
$fileContent = base64_decode($response['fileContent']);
It's not the only way, but it is one of the easiest ways to handle this. By the way, sooner or later you will find that including the MIME type in the response is useful.
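For example, a minimal sketch of such a response that also carries the MIME type (the mime_content_type() call and the $filePath variable are illustrative assumptions, not part of the original code):

$filePath = $this->_homeFolder . urldecode($path); // assumed absolute path on the server
$content  = file_get_contents($filePath);

return new DataResponse([
    'path'        => $path,
    'mimeType'    => mime_content_type($filePath), // e.g. "application/pdf"
    'fileContent' => base64_encode($content)       // binary-safe for JSON
]);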
Related
I'm currently learning PHP and programming in general, and I'm writing my first website where I have a simple login system and users can enter some data in plain text. Upon entry, PHP encrypts this data.
I want to add a feature where users can then download an unencrypted copy of the data and I want to do it in such a way that the files on disk remain encrypted the entire time and an unencrypted copy is not stored anywhere.
At the moment I should be able to decrypt the data line by line and store it in an array, but I don't know what I could do once I get to that point. Is there a way for PHP to treat an array as if it was just a text file and then ask the client to download it? Or maybe I could just somehow stream the array to a file on the client, line by line?
I have no code to show at the moment as I'm still trying to work out in my head how it will all be structured.
The solution must also be portable between Windows and Linux servers.
First, I'm assuming the encrypted file is plain text. Second, I'm writing pseudocode, because I do not know your encryption setup.
You need to decrypt the data into something that can be echoed to the user, or more generally, something that is represented as a string.
I am doing something similar with PDF files, so I am showing the same approach with a TXT file:
$encryptedData = $this->getEncryptedData(); // load what needs to be decrypted (pseudocode)
$filename = 'download.txt';                 // file name offered to the browser
$contents = $this->decrypt($encryptedData); // $contents should be a string; decrypt() is pseudocode
header("Content-Type: text/plain");                              // set header as TXT file
header("Content-Disposition: attachment; filename={$filename}"); // force a download prompt from the browser
echo $contents; // print the decrypted data into the TXT download
exit;           // stop the script after the download
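Since the question describes decrypting line by line into an array, here is a minimal sketch of streaming that array straight to the client with no unencrypted copy written to disk (the $encryptedLines array and the decryptLine() helper are hypothetical placeholders for your own code):

// Hypothetical: $encryptedLines holds the encrypted lines read from disk,
// decryptLine() is whatever per-line decryption routine the application uses.
header("Content-Type: text/plain");
header("Content-Disposition: attachment; filename=download.txt");

foreach ($encryptedLines as $line) {
    echo decryptLine($line) . "\n"; // decrypt and stream each line directly to the client
}
exit;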
I have an automated PHP script which connects to a mailbox, reads emails and processes them to create tickets. Some of these emails contain various types of file attachments. My script uses the following code to save files directly to a PostgreSQL database. I'm using CodeIgniter.
public function saveFiles($filename, $fileurl, $jobid) {
    $filedata = array();
    $filedata['filename'] = $filename;
    $filedata['filedescription'] = 'Incoming attachment.';
    $filedata['fileargid'] = $jobid;
    $filedata['fileaddedon'] = date('Y-m-d H:i:s P');
    $filedata['filedata'] = pg_escape_bytea(base64_encode(file_get_contents($fileurl)));
    $results = $this->db->insert('file', $filedata);
    if ($results) {
        return $this->db->insert_id();
    } else {
        return FALSE;
    }
}
Most of the files are saved without any issue. My problem is that some PDF files get corrupted once the script is deployed. The script also saves the files to the local disk before encoding them to base64, and all of those copies are healthy. So I suspect something is happening during pg_escape_bytea(base64_encode(file_get_contents($fileurl))).
I developed this script with PHP 5.5.9 on Ubuntu on my local PC and none of the files get corrupted there. But the script is deployed on an Ubuntu server with PHP 5.3.10, and files get corrupted there.
I tried to find out what is causing this, but no luck so far. Is this because of the different PHP versions?
It looks like one of two things:
you're writing to the database in the "escape" format and reading from it in hex format, or
you need to cast: "when the client and backend character encoding does not match, there may be a multi-byte stream error. The user must then cast to bytea to avoid this error" (from the pg_escape_bytea documentation; the same applies to unescaping).
Check section 8.1 here.
If that isn't the problem, I'd save bin2hex output to the field directly.
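For example, a minimal sketch of the bin2hex approach, reusing the column names from the question (note that hex2bin() needs PHP 5.4+, so on the 5.3 server you would use pack('H*', ...) instead):

// store the raw bytes as a hex string instead of an escaped bytea literal
$filedata['filedata'] = bin2hex(file_get_contents($fileurl));
$this->db->insert('file', $filedata);

// when reading a row back from the file table, turn the hex string into binary again
$binary = hex2bin($row->filedata);    // PHP >= 5.4
$binary = pack('H*', $row->filedata); // equivalent on PHP 5.3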
I got the docs from a third party that is sending me a file over HTTP, and I need to write a script that will successfully receive the sent file. The Content-Type is set to application/gzip, so I can't pick up the uploaded file through the $_FILES variable as would be easy with multipart/form-data.
This link gave me a hint: http://php.net/manual/en/features.file-upload.post-method.php
Note:
Be sure your file upload form has attribute enctype="multipart/form-data" otherwise the file upload will not work.
I tried to reproduce their "client" side to test my server using the example at the following URL: http://blog.derakkilgo.com/2009/06/07/send-a-file-via-post-with-curl-and-php/
And to ensure cross-domain posting is available, I used a function posted and explained by @slashingweapon in "CORS with php headers".
There must be a way to do it - Halp!
Hi, from what I understand you just need to download a file that has been uploaded to a server. Try getting the file in binary mode using the code below:
ftp_get($conn_id, $local_file, $server_file, FTP_BINARY); // download $server_file into $local_file in binary mode
Ref: http://php.net/manual/en/function.ftp-get.php
The connection details must be shared with you by the 3rd party who is sharing the file.
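A minimal sketch of a full FTP download under that assumption (the host, credentials and file names are placeholders that the third party would have to supply):

// placeholders: the host, credentials and file names come from the third party
$conn_id = ftp_connect('ftp.example.com');
ftp_login($conn_id, 'username', 'password');
ftp_pasv($conn_id, true); // passive mode is often required behind firewalls/NAT

// download remote.gz into local.gz in binary mode
if (ftp_get($conn_id, 'local.gz', 'remote.gz', FTP_BINARY)) {
    echo "File downloaded successfully\n";
}
ftp_close($conn_id);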
Thanks to @fusion3k who pointed me in the right direction.
$rawData = file_get_contents("php://input") gave me the raw data from which I had to parse the file. It was painful to extract because it is a binary file. My steps were as follows (see the sketch after this list):
Explode the data with "\n" as the delimiter: $rawData = explode("\n", $rawData)
Skip the first few lines (for me it was 3 lines of header data) and take the rest into $valuableData
Convert the last line of data to its hex representation using $convertedLine = unpack("H*", $valuableData[count($valuableData)-1])
Cut the last byte of data off the last row: $convertedLine[1] = substr($convertedLine[1], 0, -2) (I have no idea why unpack returns an array)
Convert it back with $valuableData[count($valuableData)-1] = pack("H*", $convertedLine[1])
implode("\n", $valuableData) and write that to a file
I am trying to make a download action that downloads a Word doc generated in the 'download' controller using PHPDOCX. So far PHPDOCX is able to save the desired .docx file in the correct folder, but something goes wrong when I try to download it. Since Media Views were deprecated, I must use the CakeResponse file method as suggested in the CakePHP 2.x Cookbook:
// In the controller:
$this->response->file($file['path'], array('download' => true, 'name' => $filename));
return $this->response;
I was able to use this method to export an RTF with no problem (the RTF was generated using PHPRTFLite), but when I use the method for a .docx file generated with PHPDOCX I receive the following error in Firefox:
The character encoding declaration of the HTML document was not found
when prescanning the first 1024 bytes of the file. When viewed in a
differently-configured browser, this page will reload automatically.
The encoding declaration needs to be moved to be within the first 1024
bytes of the file.
I would like to use a document generator that accepts HTML, which is why I chose PHPDOCX. Considering the above error, I set out to define the headers and content type using the following method:
$this->response->header(array('Content-type'=>'application/vnd.openxmlformats-officedocument.wordprocessingml.document'));
But I still receive the same error in CakePHP:
The requested file APP/files/info_sheets/filename.docx was not found or not readable
One thing I was thinking is that PHPDOCX emits many errors when it generates the document and this is interfering with the MIME type or encoding. But according to the 2.x Cookbook:
Headers are not sent when CakeResponse::header() is called either.
They are just buffered until the response is actually sent.
Another idea is that I need to set the character encoding in the header right after the content-type:
$this->response->header(array('Content-type'=>'application/vnd.openxmlformats-officedocument.wordprocessingml.document;charset=utf-8'));
But this results in garbled text.
Does anyone have any ideas how to resolve this? The "download.ctp" view file is currently blank. Please let me know if you need additional information about this issue.
Thanks!
Chris
First of all, you might try disabling autoRender, otherwise CakePHP might still try to render your view and layout:
$this->autoRender = false;
Also, I haven't tested it, but have you tried this to set the content type:
// register the type
$this->response->type(array('docx' => 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'));
// set the type for the response
$this->response->type('docx');
See the documentation:
http://book.cakephp.org/2.0/en/controllers/request-response.html#dealing-with-content-types
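Putting it together, a minimal sketch of the whole download action (untested; $file['path'] and $filename are the variables from the question):

public function download() {
    $this->autoRender = false; // don't render a view or layout for a file download

    // register the .docx MIME type and use it for this response
    $this->response->type(array(
        'docx' => 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
    ));
    $this->response->type('docx');

    // stream the generated file as an attachment
    $this->response->file($file['path'], array('download' => true, 'name' => $filename));
    return $this->response;
}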
You can modify the media.php file in the core of the framework and add the MIME type to the array that holds the types.
Eg:
'docx' => 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
I've got a webservice which expects a parameter of type "xs:base64Binary" - this is a file to store in the database.
I'm trying to consume the service using PHP 5's native webservice classes. I've tried a few things:
// Get the posted file
$file = file_get_contents($_FILES['Filedata']['tmp_name']);
// Add the file, encoding it as a base64
$parameters = array("fileBytes" => base64_encode($file));
// Call the webservice
$response = $client->attachFile($parameters);
The result is an error saying "Bad Request." If the file is a text file and I don't base64_encode it, it works fine. The problem occurs when posting a binary file such as an image.
Anyone know the trick here?
EDIT 1
Also problematic: if I encode the text file, it seems to work, but of course it's encoded and ends up being junk once downloaded and viewed again (i.e., the text is encoded but doesn't seem to get decoded by the server).
As far as I know, base64_encode() should be doing the job.
Are you 100% sure $file contains something? Have you made a dump?
OK, so it seems there is no need to use base64_encode; the raw output of file_get_contents is already in the required format.
Additionally, the problem was that I had the server-side config setting for maxArrayLength set too low.
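For reference, a minimal sketch of the working call under that conclusion (the WSDL URL is a placeholder; attachFile and fileBytes are the names from the question). In WSDL mode, SoapClient base64-encodes xs:base64Binary parameters itself, so the raw bytes are passed straight through:

// the WSDL URL is a placeholder; attachFile/fileBytes come from the question
$client = new SoapClient('https://example.com/service?wsdl');

$parameters = array(
    'fileBytes' => file_get_contents($_FILES['Filedata']['tmp_name']) // raw bytes, no manual base64
);
$response = $client->attachFile($parameters);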