I am using Gregwar's PHP Captcha library. On localhost it works fine: it creates the images and sends them to the browser. But when the same code runs on the remote server, the captcha image is not displayed client side. Since in both cases each image request is answered with around 20 kB of data, I am sure the image is being downloaded, and from the looks of it, it seems to be valid base64 of an image. The only difference I have noticed is that the base64 string logged on the remote server is usually shorter, around 8 kB, while the base64 from localhost is about as long as the real image, ~20 kB.
But with my knowledge, I don't know how to use this information. Any guidelines?
snippet of code:
$captcha = new Gregwar\Captcha\CaptchaBuilder;
$captcha->build(
    Config::get('CAPTCHA_WIDTH'),
    Config::get('CAPTCHA_HEIGHT')
);
header('Content-type: image/jpeg');
$captcha->output();
Further Info:
I used WAMP on my Windows 7 machine, but installed Apache 2 and PHP 5 separately on the remote Ubuntu 14.04 server.
Localhost: Apache 2.4.9, PHP 5.5.12
Remote server: Apache 2.4.7, PHP 5.5.9
In both cases, no error is generated whatsoever.
a working base64 from localhost:
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD//gA7Q1JFQVRPU ... iiigAooooAKKKKACiiigD//2Q==
which is usually around 20 kB long.
a not-working base64 from remote server:
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEAYABgAAD//gA7Q1JFQVRPU ... I0vQ7A
which is usually around 8 kB long.
Edit:
I have fixed it with a workaround, but I am still not sure about the cause of the problem. Here it goes:
In registrationView.php I had an <img> tag whose 'src' attribute was set to a non-existing URL that triggers the CaptchaBuilder code. So maybe this method does not work on the remote configuration because of the connection delay?
What I did was replace the 'src' approach with an AJAX request that WAITS for success and only then loads the base64 data, as in the sketch below. If anybody could give a confident comment on this suspicion, hopefully we get to the answer.
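For reference, a minimal sketch of the endpoint side of that workaround (the file name is made up; it relies on CaptchaBuilder's inline() helper, if your version of the library provides it, which returns a data URI the AJAX success handler can assign to the image's src):
<?php
// captchaData.php -- hypothetical endpoint called via AJAX instead of <img src="...">.
// It returns the captcha as a base64 data URI, so the client only swaps it into
// the image once the whole response has arrived.
require 'vendor/autoload.php';

use Gregwar\Captcha\CaptchaBuilder;

session_start();

$captcha = new CaptchaBuilder();
$captcha->build(Config::get('CAPTCHA_WIDTH'), Config::get('CAPTCHA_HEIGHT'));

// Keep the phrase around so the form submission can be validated later.
$_SESSION['captcha_phrase'] = $captcha->getPhrase();

header('Content-Type: text/plain');
echo $captcha->inline(); // "data:image/jpeg;base64,..."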
Related
The following script is trivial and works on Apache without issue:
header('Content-Type: image/jpeg');
echo file_get_contents('./photo.jpg');
On NGINX/PHP-FPM I get a blank page. I have tried two different virtual servers: one I created, and the homestead_improved box (https://github.com/Swader/homestead_improved), which is based on Laravel Homestead.
Error reporting is on, there are no errors. If I remove the header and just use:
echo file_get_contents('./photo.jpg');
I get the binary converted to ASCII and see the strange characters; the file is being loaded correctly.
I thought the issue might be a missing header, so I tried content length:
header('Content-Type: image/jpeg');
$contents = file_get_contents('./photo.jpg');
header('Content-length: ' . strlen($contents));
echo $contents;
This gives a different result: The page never loads, as if the browser never receives all the bytes it's expecting.
If I print strlen($contents) it displays the file size in bytes. PHP is loading the image correctly, but it's never reaching the browser.
The script works on an Apache server so the issue seems to be NGINX or PHP-FPM.
I have tried different images (one 80 kB, one 2.2 MB); the result is the same. I've also tried readfile instead of file_get_contents.
Update
In Chrome developer tools, the full image is downloaded and shown in the Network tab. The browser is getting the data but it's not displayed.
Your problem lies in process memory. PHP uses a different configuration file when running under PHP-FPM than when running under Apache, for instance.
The problem with file_get_contents is that it reads the entire file into memory. In the case of a large image file, memory hits its limit and the HTTP response never completes.
To fix the problem, you can either stream the image using fopen or increase the memory limit in PHP-FPM's php.ini.
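A minimal sketch of the streaming approach (the path is illustrative):
<?php
// Stream the file in fixed-size chunks so peak memory use stays small,
// regardless of the PHP-FPM memory_limit.
$path = './photo.jpg';

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));

$fh = fopen($path, 'rb');
if ($fh !== false) {
    while (!feof($fh)) {
        echo fread($fh, 8192); // 8 kB at a time instead of the whole file at once
    }
    fclose($fh);
}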
Like Scriptonomy said, memory can be an issue. You can also use readfile:
https://secure.php.net/manual/en/function.readfile.php
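For example (same illustrative path as above); readfile copies the file straight to the output without holding it all in memory:
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize('./photo.jpg'));
readfile('./photo.jpg');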
I have installed the wonderful software wkhtmltopdf on our production Debian server to be used for creating PDFs from URLs. We stream (I hope that's the right term) the resulting PDF to the client; we have no interest in storing it server side.
We only generate PDFs from local URLs (on our own server, that is). However, the URL is still based on the domain and not the local IP, since there are multiple sites on the server.
Our problem is that for some local pages, all we get back is an entirely blank page (not even a PDF). The response code is 200, but the content type is text/html. For those pages where a PDF is successfully streamed to the client, the content type is application/pdf.
I thought that maybe something goes wrong in the generation of my PDF, so I ran exactly the same commands that PHP executes, and then a PDF was generated successfully.
I am using the library found on this page to make PHP connect with wkhtmltopdf.
$pdf = new WkHtmlToPdf(array(
    'margin-left' => 0,
    'margin-right' => 0,
    'margin-top' => 0,
    'margin-bottom' => 0,
    'print-media-type',
    'disable-smart-shrinking'
));
$url = "http://myserver.se/$url";
$pdf->addPage($url);
$pdf->send();
Why do I get blank pages back for some URLs?
It turned out the problem was in the library I was using. I can't tell exactly what the problem was, but proc_close in the send method of the WkHtmlToPdf class was returning 2 instead of the expected 0 when a PDF was successfully created. This led the library to think that no PDF was created, so it simply returned false, meaning nothing was output to the client. I solved it by instead checking whether the file existed, using PHP's file_exists function.
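For illustration, a hedged sketch of that workaround ($process and $tmpPdfPath are hypothetical names; the real library internals differ): instead of trusting the exit code, check whether the temporary PDF was actually written:
$exitCode = proc_close($process); // can be 2 (a warning) even when the PDF is fine
if (file_exists($tmpPdfPath) && filesize($tmpPdfPath) > 0) {
    header('Content-Type: application/pdf');
    readfile($tmpPdfPath); // stream the PDF despite the non-zero exit code
} else {
    // wkhtmltopdf genuinely produced nothing; report an error instead
}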
I am currently working on an Android app made with the PhoneGap framework. The app captures a video, which is uploaded to a server. I had set chunkedMode to false, and for files smaller than 10 MB everything works fine. For files bigger than 10 MB I get an out-of-memory exception. To fix that I changed to chunkedMode = true, but now I always get a connection error (error code 3).
Do I have to change something on the server to accept chunked mode?
The server is a CentOS server with Apache 2/HTTP 1.1. I use the following line in the PHP (server-side) script:
move_uploaded_file($_FILES['file']['tmp_name'], $target);
I really hope someone can help me with this problem.
Thanks!
First thing to check is what your max upload size is in your php.ini file:
upload_max_filesize = 10M
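Note that post_max_size caps the entire POST body, so it must be at least as large as upload_max_filesize for big uploads to arrive; the values below are only illustrative:
upload_max_filesize = 64M
post_max_size = 64M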
If you are not using PHP 5 you will need to upgrade, as PHP 4 does not support chunked mode.
As far as I know, most Apache web servers already have chunked mode turned on. One dev was using Nginx as a web server and had to install this module:
http://wiki.nginx.org/HttpChunkinModule
Mind you, their wiki seems to be down right now.
We have a web app using Andrew Valums' AJAX file uploader. If we kick off 5-10 image uploads at once, more often than not at least 2 or 3 will result in the same GD error, "Corrupt JPEG data":
Warning: imagecreatefromjpeg() [function.imagecreatefromjpeg]:
gd-jpeg, libjpeg: recoverable error: Corrupt JPEG data:
47 extraneous bytes before marker 0xd9 in ....
However, this did not happen on our old test server or our local development boxes, only on our new production server.
The file size on the server is the same as the original on my local machine, so the upload completes, but I think the data is being corrupted by the server.
I can "fix" the broken files by deleting them and uploading them again, or by uploading manually via FTP.
We had a shared host on GoDaddy and have just started having this issue on a new box (that I set up, so that probably explains a lot :)): CentOS 5.5+, Apache 2.2.3, PHP 5.2.10.
You can see some example good and bad pictures here: http://174.127.115.220/temp/pics.zip
When I BinDiffed them I saw a consistent pattern: the corruption always comes in 64-byte blocks, and while the distance between corrupted blocks is not constant, the number 4356 comes up a lot.
I really think we can rule out the Internet, as error checking and retransmission with TCP is pretty reliable; furthermore, there seems to be no difference between browser versions, or if I turn anti-virus and firewalls off.
So my pick is the configuration of Apache/PHP?
Some cameras will append some data inside the file that will get interpreted incorrectly (most likely due to character encoding within the headers).
A solution I found was to read the file in binary mode, like so:
$fh = fopen('test.jpg', 'rb'); // 'rb' = read in binary mode, no line-ending translation
$str = '';
while ($fh !== false && !feof($fh)) {
    $str .= fread($fh, 1024);
}
if ($fh !== false) {
    fclose($fh);
}
$test = @imagecreatefromstring($str); // @ suppresses the "recoverable error" warning
imagepng($test, 'save.png');
Well, I think the problem is the JPEG header data, and as far as I know there is nothing PHP can do about it. I think the problem is your file uploader; maybe there is some configuration for it that you are missing.
Hmm, a 64-byte corruption? Or did you mean 64-bit?
I'm going to suggest that the issue is in fact a result of the PHP script. The problem that regularly comes up here is that the script inserts CRLFs into the data stream being uploaded, caused by differences between the Windows/*nix line-ending standards.
The solution is to force the PHP script to handle the upload in binary mode (use the 'b' flag for ALL fopen() calls in the PHP upload code, as sketched below). It is safe to handle a text file in binary mode, since at least you can still see the data.
Read here for more information on this issue:
http://us2.php.net/manual/en/function.fopen.php
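A minimal sketch of what that looks like when copying an upload by hand (paths are illustrative; move_uploaded_file itself moves the bytes verbatim and avoids the issue):
// Open both ends in binary mode ('rb'/'wb') so Windows builds of PHP
// never translate line endings inside the image data.
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
$out = fopen('/var/uploads/photo.jpg', 'wb');
if ($in !== false && $out !== false) {
    stream_copy_to_stream($in, $out);
}
if ($in !== false) { fclose($in); }
if ($out !== false) { fclose($out); }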
This can be solved with:
ini_set('gd.jpeg_ignore_warning', 1);
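In context, that would look something like this (the path variable is illustrative):
ini_set('gd.jpeg_ignore_warning', 1);
$img = imagecreatefromjpeg($uploadedPath); // no warning for recoverable corruption
if ($img === false) {
    // the file is genuinely unreadable, not just slightly damaged
}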
I had this problem with GoDaddy hosting.
I had created the database on GoDaddy using their cPanel interface. It was created with latin collation (or something like that). The database on the development server was UTF-8. I had tried all the solutions on this page, to no avail. Then I converted the database to UTF-8, and it worked.
Database encoding shouldn't affect BLOB data (or so I would think); BLOB stands for Binary Large OBject, to my knowledge!
Also, strangely, the data that was copied from the dev to the production server while the database was still latin was not corrupted at all. It's only when inserting new images that the problem appeared. So I guess the image data was being fed to MySQL as text data; I think there is a way (when using SQL) of inserting binary data, and I did not follow it.
Edit: I just took a look at the MySQL export script; here it is:
INSERT INTO ... VALUES (..., _binary 0xFFD8FF ...
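For illustration, a hedged sketch of inserting binary data from PHP via a prepared statement (PDO; the table and column names are made up), which keeps the bytes out of the text-encoding path entirely:
// A LOB parameter sends the bytes as-is, so no charset conversion can corrupt them.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO images (data) VALUES (?)');
$blob = file_get_contents('photo.jpg');
$stmt->bindParam(1, $blob, PDO::PARAM_LOB);
$stmt->execute();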
Anyway, hope this will help someone. The OP did not indicate what solved his problem...
I have the following setup:
Plain server: delivers PHP files as plain text.
Proxy server: asks the plain server for the PHP file and parses it.
Now my question: how do I configure the proxy server (a fully configurable Apache 2.2 with PHP 5.3) to interpret the plain PHP files from the plain server?
Example: given a small PHP script "hello.php" on the plain server (accessible through http://plainserver/hello.php):
<?php
echo "Hello World";
?>
The plain server only outputs it as plain text, with no parsing of the PHP code.
On the proxy server the file "hello.php" does not exist. But when hello.php is requested from the proxy server, it should fetch hello.php from the plain server with mod_proxy (reverse proxy). It should also parse and execute the PHP, outputting only "Hello World".
The reverse proxy is already running, but the execution of PHP code is not working. I tried mod_filter, but couldn't make it work. Any ideas how to do that?
You might instead consider sharing the PHP files from your source server to your target server via an NFS mount or something similar. Tricking the proxy server into doing what you ask seems like the long way around the barn.
I totally agree with jskaggz. You could build some awful tricks, such as an app that fetches the remote page, downloads it into a local file, and then redirects the user to that page so it can be executed, but there are a million security issues and things that might go wrong with that.
Can't you just convert the 'plain server' to a PHP-executing server and do some traditional reverse proxying on your 'proxy server', maybe using mod_proxy?
http://www.apachetutor.org/admin/reverseproxies
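A minimal sketch of that traditional setup (hostnames are placeholders): the backend executes the PHP itself and the front end just relays requests with mod_proxy:
<VirtualHost *:80>
    ServerName proxyserver.example
    # Relay everything to the backend, which now parses PHP itself
    ProxyPass        / http://plainserver/
    ProxyPassReverse / http://plainserver/
</VirtualHost>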
Answered this on the ServerFault version of this thread: https://serverfault.com/a/399671/48061