Site running under MAMP dies when ob_get_clean is called - php

I'm running MAMP with PHP version 5.5.10. PHP and Apache are both working, except on pages that call ob_start() and ob_get_clean():
<?php
if (array_key_exists('DOCUMENT_ROOT', $_SERVER)) {
    include("{$_SERVER['DOCUMENT_ROOT']}/php-libs/setup.php");
} else {
    // use include path - under CGI
    include("php-libs/setup.php");
}
$page = $site->page();
$page_info = array(
    'title'         => 'Welcome!',
    'page_title'    => '',
    'page_subtitle' => '',
    'page_type'     => 'homepage',
    'body_class'    => 'home full'
);
$page->setup($page_info);
ob_start();
?>
<p>Hello World!</p>
<?php
$page->setContent(ob_get_clean());
$page->display();
The result is that I get a 200 response but no page content and no errors. Nothing shows up in the PHP or Apache error log, so I'm at a complete loss. I've tried multiple different PHP versions and it doesn't seem to matter.
If I comment out the $page->setContent(ob_get_clean()); line then the page loads but the included files aren't included.
Here is my phpinfo output: http://jsfiddle.net/LeyLcr5f/embedded/result/
Also, a colleague of mine is using the same repo on his machine with MAMP PRO without an issue (we're both running OS X Mavericks).

This looks like a library I've used in the past. Try making sure that the web server has write access to the smarty/templates_c directory.
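If the page dies silently like this, it can also help to turn error display on and confirm the compile directory really is writable before digging further. A minimal debugging sketch; the templates_c path is an assumption, adjust it to wherever your setup keeps it:
<?php
// Temporary debugging aid: surface errors that the server may be hiding.
error_reporting(E_ALL);
ini_set('display_errors', '1');

// Hypothetical path - point this at your actual Smarty compile directory.
$templates_c = "{$_SERVER['DOCUMENT_ROOT']}/php-libs/smarty/templates_c";

if (!is_dir($templates_c)) {
    echo "Missing directory: $templates_c\n";
} elseif (!is_writable($templates_c)) {
    echo "Web server cannot write to: $templates_c\n";
} else {
    echo "templates_c looks writable.\n";
}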


Setting environment variables within PHP works for one user but not another

A user has been helping me with a problem (How to force a curl request in a PHP method to fail for a unit test). They suggested doing putenv('all_proxy=localhost:5678'); so I can force curl to fail dynamically in a unit test (I changed http_proxy / https_proxy to all_proxy because it does all protocols).
This works perfectly on their Ubuntu box, but I can't get it to work on either my Windows 10 box or Ubuntu box. If I set all_proxy from the command prompt, the curl requests always fail, so it is taking notice of the variable when it can find it. I changed their script slightly and that seems to have got it working on Ubuntu.
Is there some setting in php.ini that controls whether putenv() can override variables from the environment? Why does the dynamic environment variable work on Ubuntu but not on Windows?
Test script
<?php
function search() {
    $url = 'x3m.dev';
    $curl = curl_init();
    curl_setopt_array($curl, [
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_URL => $url,
    ]);
    $data = curl_exec($curl);
    if (!$data) {
        throw new Exception('An error occurred while trying to process the request.');
    }
    return $data;
}

function do_curl_request() {
    echo getenv('all_proxy') . "\n\n";
    try {
        echo search();
    }
    catch (Exception $e) {
        echo $e->getMessage();
    }
    echo "\n\n";
}

echo "========== first run without proxy\n";
do_curl_request();

putenv('all_proxy=localhost:5678');

echo "========== second run with proxy override\n";
do_curl_request();
It should work the first time and throw an exception the second time. On Windows it works both times if all_proxy is not set as a Windows environment variable, and throws an exception both times if all_proxy is set.
Windows (incorrect)
========== first run without proxy
<html>
<head></head>
<body>.</body>
</html>
========== second run with proxy override
localhost:5678
<html>
<head></head>
<body>.</body>
</html>
Ubuntu (correct)
========== first run without proxy
<html>
<head></head>
<body>.</body>
</html>
========== second run with proxy override
localhost:5678
An error occurred while trying to process the request.
It's important to recognize that PHP does not import anything from the user's environment. It has its own self-contained environment which is per-request (i.e. it's cleaned up with every RSHUTDOWN event in the interpreter). Anything you do within PHP (e.g. putenv()) lives strictly within that request. What you do in your shell will have no effect on PHP's environment.
putenv
Adds setting to the server environment. The environment variable will only exist for the duration of the current request. At the end of the request the environment is restored to its original state.
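As a small illustration of that per-request scoping (a standalone sketch, run from the CLI or the browser):
<?php
// Whatever the server environment already provides (often false).
var_dump(getenv('all_proxy'));

// Override it for the remainder of this request only.
putenv('all_proxy=localhost:5678');
var_dump(getenv('all_proxy')); // string(14) "localhost:5678"

// Once the request ends the environment is restored, so a second request
// to this script starts again from the original server environment.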
TL;DR
It was a bug that was patched in PHP 7.3.17 and 7.4.5 (April 2020). There is no known workaround for earlier versions.
Are you sitting comfortably? Then let's begin!
This bug only exists on Windows
I created a PHP bug report
This bug was previously noticed and fixed in PHP 5, but only for non-thread-safe versions:
PHP 5.4.36 non-thread-safe (17 December 2014)
PHP 5.5.20 non-thread-safe (26 November 2014)
PHP 5.6.4 non-thread-safe (27 November 2014)
The bug was marked as "partially fixed", but got accidentally closed anyway, preventing further work on it as it fell off people's radar.
There are two possible internal calls to get/set environment variables:
SetEnvironmentVariable() / GetEnvironmentVariable() (thread-safe)
putenv() / getenv() (non-thread-safe)
cURL uses getenv() which can't see changes made by SetEnvironmentVariable()
A cURL bug report has now been created
A fix for cURL was merged on 12 February 2020. This was released in cURL 7.69 on 4 March 2020.
A note on the PHP bug report I created says:
libcurl 7.69.1 has been released which fixes this issue, and will
be used for the PHP 7.3.17 and 7.4.5 Windows builds, so I'm
closing this ticket.

laravel-snappy: file was not created

I'm trying to trace where this issue arose from, given that nothing major was changed.
I use laravel-snappy to generate PDFs, and I hadn't had an issue until now, when I suddenly started receiving the following error:
The file 'C:\Users\ADMINI~1\AppData\Local\Temp\knp_snappy5a7d3011c11883.41249127.pdf' was not created (command: "C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf" --lowquality --images --enable-javascript --javascript-delay "10" "C:\Users\ADMINI~1\AppData\Local\Temp\knp_snappy5a7d3011b9a179.91650543.html" "C:\Users\ADMINI~1\AppData\Local\Temp\knp_snappy5a7d3011c11883.41249127.pdf").
Unfortunately, it doesn't tell me why the file wasn't created. The error handler points to the specific block that throws this error:
if (!$this->fileExists($output)) {
    throw new \RuntimeException(sprintf(
        'The file \'%s\' was not created (command: %s).',
        $output, $command
    ));
}
This line comes from this file: vendor\knplabs\knp-snappy\src\Knp\Snappy\AbstractGenerator.php
My wkhtmltopdf binary is located in the correct place, and nothing has changed with respect to the setup of these files. And yes, these files are hosted and served on a Windows Server platform.
My config for the snappy:
<?php
return array(
    'pdf' => array(
        'enabled' => true,
        'binary'  => '"C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf"',
        'timeout' => false,
        'options' => array(),
        'env'     => array(),
    ),
    'image' => array(
        'enabled' => true,
        'binary'  => '"C:\Program Files\wkhtmltopdf\bin\wkhtmltoimage"',
        'timeout' => false,
        'options' => array(),
        'env'     => array(),
    ),
);
My files are being generated as such through my controller:
public function downloadPDF(Shipment $shipment) {
    $shipment_details = $shipment->shipment_details;

    $shipment->print_date = Carbon::now();
    $shipment->save();

    $pdf = PDF::loadView('shipments.pdf', compact('shipment', 'shipment_details'))
        ->setOption('images', true)
        ->setOption('enable-javascript', true)
        ->setOption('javascript-delay', 10);

    return $pdf->download('shipment' . $shipment->uuid . '.pdf');

    // Note: these last two lines sit after the return statement and are never reached.
    $shipment->print_date = Carbon::now();
    $shipment->save();
}
Posting this in case someone else googling has the same problem and doesn't like the accepted answer of "just do it in Linux".
For me it was because Visual C++ 2013 wasn't installed - running the binary on the command line gave me errors about missing DLLs that were included in the redist.
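One way to see those errors from PHP itself, rather than the command line, is to run the exact failing command and capture stderr, where wkhtmltopdf writes its diagnostics. A rough sketch; the binary path matches the config above, the input/output paths are placeholders:
<?php
$cmd = '"C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf" --lowquality '
     . 'C:\temp\test.html C:\temp\test.pdf'; // placeholder input/output paths

$descriptors = array(
    1 => array('pipe', 'w'), // stdout
    2 => array('pipe', 'w'), // stderr
);

$process = proc_open($cmd, $descriptors, $pipes);
if (is_resource($process)) {
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exit_code = proc_close($process);

    // Missing DLLs, permission problems, etc. usually show up here.
    echo "exit code: $exit_code\n";
    echo "stderr:\n$stderr\n";
}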
The easiest way to get around this is to exec a raw command. wkhtmltopdf does not have the same command-line parameters on Linux and Windows, which means the snappy wrapper only works with amd64 and fails when it's used with a 64-bit Windows executable.
exec("C:/path/to/wkhtmltopdf.exe path/to/my.html destination/for/my.pdf");
Since this solution is terrible and wkhtmltopdf's functionality is limited on Windows, I strongly recommend deploying with Docker, or just developing under Linux. Otherwise you won't be able to use features like PDF footers, UTF-8 PDF encoding and much more...
Here's a tutorial on how to use docker compose for laravel!

php on windows server

I am trying to create my first dynamic site on Windows Server 2008 R2. I have created other dynamic sites successfully on Linux-based systems, but I am having a little bit of trouble with this.
So I have PHP, MySQL and phpMyAdmin installed on the server.
I have a page (index.php) with the following:
<?php
include 'php/index.php';
mysql_set_charset('utf8');
?>
This works fine and does not show when I do 'view source' on the displayed page in a web browser.
Further down my page I have the following:
<?=$obj->get_email()?>
This statement is meant to pull information from a database. When I view source, it shows the get_email() command as seen above instead of the value it should be replaced with.
Is this a PHP issue? Is this type of command not suitable on Windows Server, or am I just doing something completely stupid?
Try it without short tags:
<?php echo $obj->get_email(); ?>
You should check php.ini for the short_open_tag setting and change it to 1 if you wish to be able to use <? to jump into PHP mode.
Barring that, just changing it to <?php echo $obj->get_email(); ?> will fix your problem.
Reference
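A quick way to confirm whether short tags are the issue on a given server is to check the setting from a normal <?php block (the php.ini directive itself is short_open_tag = On):
<?php
// Prints "1" or "On" when short tags are enabled, "" or "0" when they are not.
var_dump(ini_get('short_open_tag'));

// Note: since PHP 5.4 the <?= echo tag works regardless of short_open_tag,
// so this setting only matters here on older PHP versions.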

passing multiple parameters from php to shell

I want to run an .exe (C++) file from a PHP script. I tried many different combinations of the exec command, but my program still returned -2 (argv < 6) or an array (?), so now I'm trying shell_exec:
<?php
$params = array('nnn.jpg', 'fff.jp2', '300', '300', '50');
$params_string = implode(" ", $params);
shell_exec('demo.exe ' . $params_string);
echo 'demo.exe ' . $params_string;
?>
but it is not working either... I echoed the string I used, and it looks fine:
demo.exe nnn.jpg fff.jp2 300 300 50
any ideas?
I got it to send the parameters properly, but the program exits with an error caused by a write problem. I've changed all permissions in the target folder to "full control". Maybe there is something with the PHP settings? (XAMPP on Win7 x64)
ERROR: Exception: demo.exe: no decode delegate for this image format `kush.jpg' # error/constitute.c/ReadImage/532-5
but as I said before, all goes well through cmd...
The problem was more of a server issue. dar7yl was almost correct: the problem was that Apache didn't have access to the ImageMagick lib located in Program Files... I had to change the Apache user to my account, and now all works fine ;)
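For anyone passing parameters this way, a safer pattern (just a sketch; demo.exe and the file names are the same placeholders as above) is to escape each argument and capture both the output and the exit code, which also surfaces errors like the ImageMagick one directly from PHP:
<?php
$params = array('nnn.jpg', 'fff.jp2', '300', '300', '50');

// Escape every argument individually so spaces or special characters
// can't break the command line.
$escaped = array_map('escapeshellarg', $params);
$command = 'demo.exe ' . implode(' ', $escaped) . ' 2>&1'; // merge stderr into stdout

exec($command, $output, $exit_code);

echo "exit code: $exit_code\n";
echo implode("\n", $output), "\n";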

Codeigniter cron job from CLI throws memcached errors

I'm trying to set up the cron jobs for my CodeIgniter application; however, when I run the cron, it throws memcached errors:
PHP Fatal error: Call to a member function get() on a non-object in /var/www/domain.com/www/dev/system/libraries/Cache/drivers/Cache_memcached.php on line 50
Fatal error: Call to a member function get() on a non-object in /var/www/domain.com/www/dev/system/libraries/Cache/drivers/Cache_memcached.php on line 50
I have no idea why this is being thrown. I can't find any errors in my cron job file, nor can I work out how to solve the problem, because I don't know where this is being called from. I looked into my autoloaded libraries and helpers, and none of them seem to be wrong.
I can also confirm that memcached is installed; if I visit my site, memcached indeed works.
I tried suppressing the get() call in Cache_memcached.php by commenting it out with a #, but this didn't help because no output is shown (and there is supposed to be output).
The command I run for the cron (user: www-data) is:
/usr/bin/php -q /var/www/domain.com/www/dev/index.php cron run cron
I'm running Ubuntu 11.10 x86_64.
This is my cron file:
<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');
class Cron extends CI_Controller {

    var $current_cron_tasks = array('cron');

    public function run($mode)
    {
        if ($this->input->is_cli_request())
        {
            if (isset($mode) || !empty($mode))
            {
                if (in_array($mode, $this->current_cron_tasks))
                {
                    $this->benchmark->mark('cron_start');

                    if ($mode == 'cron')
                    {
                        if ($this->cache->memcached->get('currency_cache'))
                        {
                            if ($this->cache->memcached->delete('currency_cache'))
                            {
                                $this->load->library('convert');
                                $this->convert->get_cache(true);
                            }
                        }

                        echo $mode . ' executed successfully';
                    }

                    $this->benchmark->mark('cron_end');
                    $elapsed_time = $this->benchmark->elapsed_time('cron_start', 'cron_end');
                    echo $elapsed_time;
                }
            }
        }
    }
}
The first thing to try would be the following to determine if memcached is supported.
var_dump($this->cache->memcached->is_supported());
The second thing to ensure is that you've got a memcached.php file in application/config/
It should contain a multidimensional array of memcached hosts with the following keys:
host
port
weight
The following example defines two servers. The array keys server_1 and server_2 are irrelevant; they can be named however you like.
$config = array(
    'server_1' => array(
        'host'   => '127.0.0.1',
        'port'   => 11211,
        'weight' => 1
    ),
    'server_2' => array(
        'host'   => '127.0.0.2',
        'port'   => 11211,
        'weight' => 1
    )
);
The next thing I'd try is to check whether the controller can be run in a web browser, as opposed to the CLI, or whether you get the same error there.
Also, explicitly loading the memcached driver might be worthwhile trying. The following will load the memcached driver, and failing that call upon the file cache driver.
$this->load->driver('cache', array('adapter' => 'memcached', 'backup' => 'file'));
Using this method allows you to call $this->cache->get(); to take into account the fallback too.
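A rough sketch of how the cron method could use that driver with the fallback (currency_cache is taken from the question; the save() call and its 300-second TTL are purely illustrative):
// Load the cache driver with memcached as the adapter and file as the backup.
$this->load->driver('cache', array('adapter' => 'memcached', 'backup' => 'file'));

// $this->cache->get() transparently uses whichever backend was initialised
// and returns FALSE on a cache miss.
if ($this->cache->get('currency_cache') !== FALSE)
{
    $this->cache->delete('currency_cache');
}

// Illustrative only: repopulate the cache with a 300-second TTL.
$this->cache->save('currency_cache', array('refreshed_at' => time()), 300);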
Another thing to check is whether you're using separate php.ini files for the web server and the CLI.
On Ubuntu it's located in
/etc/php5/cli/php.ini
And you should ensure that the following line is present, and not commented out
extension=memcache.so
Alternatively, you can create a file /etc/php5/conf.d/memcache.ini with the same contents.
Don't forget to restart services after changing configuration files.
You can check memcached is indeed set up correctly using the CLI by executing the following
php -i | grep memcache
The problem is that $this->cache->memcached is NULL (or otherwise uninitialized), meaning that the cache hasn't been initialized.
An easy fix would be to simply create the memcache object yourself. The proper fix, however, would be to look through the source and trace how the memcache object normally gets instantiated (look for new Memcache and set a debug_print_backtrace() there. Trace the debug stack back and compare it with what your cron does - look where it goes wrong then correct it). This is basic debugging btw, sorry.
Also, make sure you do load the drivers. If your cron uses a different bootstrap function than your normal index (never used CI, is that even possible?) then make sure that the memcache init is placed in the right location.
-edit-
$this->cache->memcached probably isn't actually NULL, but the actual connection to the Memcache server definitely wasn't made before you started calling get().
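If you want to rule out basic connectivity from the CLI before tracing the driver, a minimal "create the object yourself" sketch (host and port assumed to match the config shown earlier):
<?php
// Quick CLI check that memcached is reachable with the same host/port
// the CodeIgniter config uses (assumed to be 127.0.0.1:11211).
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$memcached->set('cli_test', 'hello', 60);
var_dump($memcached->get('cli_test')); // string(5) "hello" if the connection works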
