Decompress GZIP data from PHP

I've been trying to decompress GZIP data from a web server.
Currently using:
EasyPHP
CodeIgniter
PHP
My problem
I haven't been able to decompress the data. It is a compressed string, not a file, and everything I've found online only gives file examples.
Some of my output: I don't see any errors, but I get this output:
org.apache.http.entity.StringEntity#3e583da8
Is this an error on my side? I searched for this string and found that it comes from Java.
Here is my PHP method:
$gzdata = gzopen($chunkData, "r");
$uncompressed_file = fopen($chunkData, "w");
while ($line = gzgets($gzdata, 4096))
{
    fwrite($uncompressed_file, $line);
}
log_message('debug', 'decompressed data ' . $var);
fclose($uncompressed_file);
gzclose($gzdata);
return $uncompressed_file;
What I was trying to point out is that the method above is what I used to try to decompress the gzip data from the web, and it turns out that it doesn't work. Do I need to configure something in PHP or Apache, or do I need to install gzip libraries to make it work? I need your professional advice and suggestions. Thank you.
Here is another piece of code that I'm currently using:
$chunkDecoded = base64_decode($data);
//$decodeToJson = http_chunked_decode($chunkDecoded);
//$zipCode = gzinflate(substr($chunkDecoded, 10, -8));
$zipCode = gzdecode($chunkDecoded);
$decoded = json_decode($zipCode);
log_message('debug', 'gzuncompress ' . $decoded);
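For reference, here is a minimal sketch of decompressing a gzip string in place, assuming (as the second snippet suggests) that $data arrives base64-encoded and contains JSON; log_message() is CodeIgniter's logger, as in the question:

$raw = base64_decode($data, true);
if ($raw === false) {
    log_message('error', 'payload is not valid base64');
    return FALSE;
}
// gzdecode() (PHP >= 5.4) understands the full gzip format:
// header, DEFLATE body, and CRC trailer.
$json = gzdecode($raw);
if ($json === false) {
    // Fallback for older PHP: strip the 10-byte gzip header and
    // 8-byte trailer, then inflate the raw DEFLATE stream
    // (same idea as the commented gzinflate line above).
    $json = gzinflate(substr($raw, 10, -8));
}
$decoded = json_decode($json, true);
log_message('debug', 'decompressed data ' . print_r($decoded, true));

No extra gzip library is needed for this; zlib support is compiled into most PHP builds, including EasyPHP's.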

Related

How to set Content-Type in PHP on Mac?

I have a problem. Here is my PHP code:
include "connection.php";
$patternJPG = '/.pdf$/i';
$gfs = $db->getGridFS();
$collection=$db->Children;
$selected = array('_id'=> new MongoId($_GET['id']));
$cursor=$collection->find($selected);
foreach($cursor as $b){
$id1 = $b["photoID"];
$criteria = array("_id" => New MongoId ($id1));
$nameOriginal = $gfs->findOne($criteria)->file['filename'];
if(preg_match($patternJPG, $nameOriginal)){
$row = $gfs->findOne($criteria)->getBytes();
$mimeType = 'Content-type: application/pdf';
header("Content-Length: " . strlen($row));
header($mimeType);
ob_end_clean();
flush();
echo($row);
}
}
This code works on Windows. When I try to run it on a Mac, it doesn't work. I would expect the code to work on every platform; until now, code that worked on Windows has always worked on Mac as well. Please help me fix this issue.
There is one post on a similar issue:
Well, Chrome is meant to be a basic browser that is customizable to your needs with extensions. Our Web App Store features a bunch of free, handy dandy extensions that will let you view PDFs right in your browser.
Please try installing an extension to view PDF files in Chrome on macOS:
https://chrome.google.com/webstore/search/pdf%20viewer?hl=en
Also, one way to make sure your server-side code is correct is to save the output as a file with a .pdf extension and try to open it, to confirm it is a valid PDF.
If you need to force Chrome to download the file instead of viewing it, so that the file can be opened by the system PDF viewer, you can use the Content-Disposition: attachment header, as per this question.
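As a rough sketch of that download variant (the filename is illustrative; $row holds the PDF bytes fetched from GridFS as above):

// Force the browser to download rather than render the PDF inline.
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="document.pdf"');
header('Content-Length: ' . strlen($row));
echo $row;
exit;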

PHP base64_encode PDF File Corrupted

I have an automated PHP script which connects to a mailbox, reads emails, and processes them to create tickets. Some of these emails contain various types of file attachments. My script uses the following code to save files directly to a PostgreSQL database. I'm using CodeIgniter.
public function saveFiles($filename, $fileurl, $jobid) {
    $filedata = array();
    $filedata['filename'] = $filename;
    $filedata['filedescription'] = 'Incoming attachment.';
    $filedata['fileargid'] = $jobid;
    $filedata['fileaddedon'] = date('Y-m-d H:i:s P');
    $filedata['filedata'] = pg_escape_bytea(base64_encode(file_get_contents($fileurl)));
    $results = $this->db->insert('file', $filedata);
    if ($results)
        return $this->db->insert_id();
    else
        return FALSE;
}
Most of the files are saved without any issue, but some PDF files get corrupted once the script is deployed. The script saves files to the local disk before encoding them to base64, and all of those files are healthy. I suspect something is happening during pg_escape_bytea(base64_encode(file_get_contents($fileurl))).
I developed this script with PHP 5.5.9 on Ubuntu on my local PC, and none of the files get corrupted there. But the script is deployed on an Ubuntu server with PHP 5.3.10, and files get corrupted there.
I have tried to find out what is causing this, but no luck so far. Is this because of the different PHP versions?
It looks like one of these:
You're encoding into the database in the "escape" format and reading it back in hex format.
You need a cast: "when the client and backend character encoding does not match, there may be a multi-byte stream error. User must then cast to bytea to avoid this error" (from the pg_escape_bytea documentation; the same applies to unescaping).
Check section 8.1 here.
If that isn't a problem, I'd save the bin2hex output to the field directly.
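A minimal sketch of a symmetric round trip, assuming $connection is an open pg_connect() handle, the column is of type bytea, and the table and column names are taken from the question:

$binary = file_get_contents($fileurl);

// Write: escape the raw bytes for a bytea column (no base64 layer needed).
$escaped = pg_escape_bytea($connection, $binary);
pg_query($connection, "INSERT INTO file (filedata) VALUES ('$escaped')");

// Read: always unescape what comes back; with libpq >= 9.0 this also
// handles the server's hex bytea_output format.
$result = pg_query($connection, "SELECT filedata FROM file WHERE fileargid = " . (int)$jobid);
$row = pg_fetch_assoc($result);
$restored = pg_unescape_bytea($row['filedata']);

Keeping the escape and unescape steps paired like this avoids the escape-vs-hex mismatch between the PHP 5.3 and 5.5 environments.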

Amazon S3 StreamWrapper fread Issue In PHP

I am using the Amazon S3 API and setting the client to read as a stream. Using file_get_contents("s3://{bucket}/{key}") works fine for me and reads the full file data (I am using a video file and testing on my local system). However, I am trying to optimize the memory used by the script, and am therefore trying to read and return the data in chunks, as below:
$stream = @fopen("s3://{bucket}/{key}", 'r');
$buffer = 1024;
while (!feof($stream)) {
    echo @fread($stream, $buffer);
    flush();
}
This is not working on my local system. I am just wondering what the issue with this technique might be. From searching, I found that this is a very widely used technique, so if anybody can suggest what might be wrong here, or another approach I should try, it would be very helpful. Thanks.
OK, I finally got the solution. Somehow, some other output was being added to the buffer. I had to put:
ob_get_clean();
header('Content-Type: video/quicktime');
before the loop to clean out anything that had been added. Now it's working fine.
Thanks Mark Baker for your valuable support through the debugging process.
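Putting that together, a sketch of the fixed loop (assuming the AWS SDK's s3:// stream wrapper is already registered, with the {bucket}/{key} placeholders as in the question):

ob_get_clean();                              // discard anything already buffered
header('Content-Type: video/quicktime');     // send headers before the first body byte

$stream = fopen("s3://{bucket}/{key}", 'r');
$buffer = 1024;
while (!feof($stream)) {
    echo fread($stream, $buffer);
    flush();
}
fclose($stream);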

Universal replacement for file_get_contents? (PHP)

I'm having a bit of a problem reading an XML file (an RSS feed). I'm using file_get_contents to read the data, which works fine locally on XAMPP but won't work on my server.
I don't want to have to edit my php.ini file; I'm after an out-of-the-box solution if there is one (I want the code to be as portable as possible).
I am currently doing...
// Load Correct Feed
$feed_url = $this->selectFeed($instance);
$content = file_get_contents($feed_url);
$x = new SimpleXmlElement($content);
I've had a look for solutions (e.g. cURL) but can't find a way that will be supported universally.
Please could someone point me in the right direction (if it is even possible)?
What about:
$xml = simplexml_load_file($file);
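Note that simplexml_load_file() still relies on allow_url_fopen for remote URLs, so a more portable approach might check for it and fall back to the cURL extension. A sketch, where load_feed is a hypothetical helper name:

function load_feed($feed_url) {
    if (ini_get('allow_url_fopen')) {
        $content = file_get_contents($feed_url);
    } else {
        // Fall back to cURL, which is bundled with most PHP builds.
        $ch = curl_init($feed_url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $content = curl_exec($ch);
        curl_close($ch);
    }
    return ($content === false) ? false : new SimpleXmlElement($content);
}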

PHP to consume webservice that needs xs:base64Binary

I've got a web service which expects a parameter of type "xs:base64Binary"; this is a file to store in the database.
I'm trying to consume the service using PHP 5's native webservice classes. I've tried a few things:
// Get the posted file
$file = file_get_contents($_FILES['Filedata']['tmp_name']);
// Add the file, encoding it as a base64
$parameters = array("fileBytes" => base64_encode($file));
// Call the webservice
$response = $client->attachFile($parameters);
The result is an error saying "Bad Request." If the file is a text file and I don't base64_encode it, it works fine. The problem occurs when posting a binary file such as an image.
Anyone know the trick here?
EDIT 1
Also problematic: if I encode the text file, the call seems to work, but of course the content is encoded and ends up being junk once downloaded and viewed again (i.e., the text is encoded and doesn't seem to get decoded by the server).
As far as I know, base64_encode() should be doing the job.
Are you 100% sure $file contains something? Have you made a dump?
OK, so it seems there is no need to use base64_encode; file_get_contents already puts the data into the required format.
Additionally, the problem was that I had the server-side config setting for maxArrayLength set too low.
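In other words, a sketch of the corrected call, with $client and attachFile taken from the question:

// Pass the raw bytes; SoapClient applies the base64 encoding declared
// by the xs:base64Binary type in the WSDL. Encoding it yourself leads
// to the double-encoded "junk" described in the edit above.
$file = file_get_contents($_FILES['Filedata']['tmp_name']);
$parameters = array("fileBytes" => $file);
$response = $client->attachFile($parameters);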
