CodeIgniter with PHPABTest

I'm building a CodeIgniter site and attempting to use phpABTest in the controller.
I saved the phpabtest.php file as phpabtest_helper.php in the "helpers" folder and loaded it in the controller. It's initialized in the controller logic as follows:
public function view($name)
{
    $this->load->helper('phpab');

    $testpopup = new phpab('test_popup');
    $testpopup->add_variation("popup");

    $type = $this->Types->getTypeBySlug($name);
    $data['type'] = $type;
    $data['items'] = $this->Items->getItemsByType($type->id);

    $alltypes = $this->Types->getAll();
    $headerdata['alltypes'] = $alltypes;
    $headerdata['current'] = $type->id;

    $this->load->view('header');
    $this->load->view('typeheader', $headerdata);

    if ($testpopup->get_user_segment() == 'popup') {
        $this->load->view('type_new', $data);
    } else {
        $this->load->view('type', $data);
    }

    $this->load->view('footer');
}
It works fine on my localhost, but when I upload it to the server it breaks, displaying only a blank page. I've isolated the problem to the construction of the new phpab object. In the helper, it calls ob_start(array($this, 'execute'));, and this line seems to be what is breaking the code.
What server settings should I be looking at to get it to work? I'm assuming it's a server-setting issue because it works fine on my localhost. If I'm wrong and it's some other issue, how do I fix this?

You might want to check your PHP settings. There is a setting called output_buffering in your php.ini file that might be set to Off.
Excerpt from php.ini:
; Output buffering allows you to send header lines (including cookies) even
; after you send body content, at the price of slowing PHP's output layer a
; bit. You can enable output buffering during runtime by calling the output
; buffering functions. You can also enable output buffering for all files by
; setting this directive to On. If you wish to limit the size of the buffer
; to a certain size - you can use a maximum number of bytes instead of 'On', as
; a value for this directive (e.g., output_buffering=4096).
output_buffering = 4096
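A quick way to compare your localhost and the server is to dump the effective value at runtime; a minimal check, assuming you can drop a throwaway script on both machines:
<?php
// Prints the effective output_buffering value for this environment.
// A number such as "4096" means buffering is on; "" or "0" means off.
var_dump(ini_get('output_buffering'));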

PHP replace a row in csv works fine on my localhost but does not replace the row when uploaded to cpanel?

Hello, I am relatively new to PHP and was trying to replace a row in a CSV file. I didn't find an optimal solution, so I concocted a workaround script that suits my needs for the time being, until I get a better grasp of PHP.
I tested it on my localhost using XAMPP and everything was working fine; it replaced the row as intended. But when I uploaded the files to my cPanel host, it stopped replacing and instead just writes the row on a new line.
This is my code:
$fileName = 'Usecase.csv'; // This is the CSV file
$tempName = 'temp.csv';

$inFile  = fopen($fileName, 'r');
$outFile = fopen($tempName, 'w');

while (($line = fgetcsv($inFile)) !== FALSE) {
    // If the first column matches $fin, replace the whole row with the
    // new values from $tempstr10 before it is written back out.
    if ($line[0] == "$fin") {
        $line = explode(",", "$tempstr10");
        $asd = $asd + 1; // $asd is defined and assigned 0 at the top of the script
    }
    fputcsv($outFile, $line);
}

fclose($inFile);
fclose($outFile);

unlink($fileName);
rename($tempName, $fileName);

// If $asd is still 0, the loop above never matched, meaning the value
// wasn't present in the file; append it instead. This avoids writing
// the same string into the file twice.
if ($asd == 0 && filesize("Usecase.csv") > 0) {
    file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX);
}
if ($asd == 0 && filesize("Usecase.csv") == 0) {
    file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX);
}
As I mentioned above, it's working on localhost but not on cPanel. Can someone point out if something is wrong with the code, or if it's something else?
Thank you.
The most likely problem is that your local version of PHP or your local configuration of PHP is different from what is on the server.
For example, fopen is a feature that can be disabled on some shared servers.
You can check this by creating a PHP file with the following contents:
<?php phpinfo();
Then visit that PHP file in your browser. Do this for both your local dev environment and your cPanel server to compare the configuration to identify the differences that may be contributing to the differing behavior.
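If you only want to check whether a particular function such as fopen has been disabled, you can also do it from code; a minimal sketch:
<?php
// Lists the functions disabled via the disable_functions directive;
// an empty string means nothing has been disabled.
var_dump(ini_get('disable_functions'));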
You should also check the error logs. They can be found in multiple different places depending on how your hosting provider has things configured. If you can't find them, you'll need to ask your hosting provider to know for sure where the error logs are.
Typical locations are:
The "Errors" icon in cPanel
A file named "error_log" in one of the folders of your site. Via ssh or the Terminal icon in cPanel you can use this command to find those files: find $PWD -name error_log
If your server is configured to use PHP-FPM, the php error log is located at ~/logs/yourdomain_tld.php.error.log
You should also consider turning on error reporting for the script by putting this at the very top. Please note that this should only be used temporarily while you are actively debugging the application. Leaving this kind of debugging output on could expose details about your application that may invite additional security risks.
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
... Your code here ...
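If you would rather not print errors to the page on a live site, you can log them to a file instead; a minimal sketch, where the log path is a hypothetical example:
<?php
ini_set('log_errors', 1);
ini_set('error_log', __DIR__ . '/php-errors.log'); // hypothetical path, pick your own
error_reporting(E_ALL);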

Laravel 5.8 multi-upload inputs

I want to be able to upload multiple (dynamic) files (pdf).
So I have the following layout:
As you can see, the form has 4 input fields for files, but it also has 2 text fields and, for every file-upload row, a checkbox. The "flow" is the following:
Add a title and year
Check the classes (Initiatie, Recreatie, Toerisme, and Sport) you want to enable (and upload a PDF for)
Upload 1 PDF file per class.
The files are PDF files (1 per class). I tried the following PHP code to upload the files, but I can only upload one, sometimes two, files at a time, depending on how large the files are.
public function postGapersritAddResults(Request $request): RedirectResponse
{
    // Handle upload
    $path = 'documents/gapersrit/'.$request->get('year').'/results';

    foreach (['initiatie', 'recreatie', 'toerisme', 'sport'] as $item) {
        if ($request->hasFile('file_'.$item)) {
            $request->file('file_'.$item)->storeAs(
                $path,
                $item.'.'.$request->file('file_'.$item)->getClientOriginalExtension(),
                'webdav'
            );
        }
    }

    // Handle database
    $result = new SiteGapersritResults();
    $result->title = $request->get('title');
    $result->year = $request->get('year');
    $result->initiatie = filter_var($request->get('active_initiatie'), FILTER_VALIDATE_BOOLEAN);
    $result->recreatie = filter_var($request->get('active_recreatie'), FILTER_VALIDATE_BOOLEAN);
    $result->toerisme = filter_var($request->get('active_toerisme'), FILTER_VALIDATE_BOOLEAN);
    $result->sport = filter_var($request->get('active_sport'), FILTER_VALIDATE_BOOLEAN);
    $result->save();

    toastr()->success('Saved the results for year '.$result->year.'.', 'Success', ['timeOut' => 5000]);

    return redirect()->to('admin/gapersrit/results');
}
If someone has a better idea of how I could do this, please help me out.
Ideally, I want to select all the files and upload them one by one (like in my code), but for some reason this doesn't work and most of the time throws a "too large" error, even though I'm uploading one file at a time?
Edit
The limit for upload sizes is 100M in both php.ini and my Nginx configuration.
Edit 2
I get the following error with my current code:
curl_exec(): CURLOPT_INFILE resource has gone away, resetting to default
Full trace: https://pastebin.com/rqUeEhGa
This might be because the upload size is limited by the php.ini config of your server.
If you have access to the file or the php settings, try changing these values:
upload_max_filesize = 20M
post_max_size = 20M
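To confirm which limits are actually in effect on the server, you can print them from PHP; a minimal check:
<?php
// Compare these against the size of the files being uploaded.
echo ini_get('upload_max_filesize'), "\n";
echo ini_get('post_max_size'), "\n";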
Edit: also see https://stackoverflow.com/a/23686617/7584725
As you said, the error is curl_exec(): CURLOPT_INFILE resource has gone away, resetting to default.
The code snippet you posted does not contain anything related to curl, so the problem is the "webdav" disk used in the storeAs call.
Looking at the trace, this seems like a problem in the League\Flysystem\WebDAV package, maybe
https://github.com/thephpleague/flysystem-webdav/issues/49 or
https://github.com/thephpleague/flysystem-webdav/issues/50
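If the stream handling in the WebDAV adapter is indeed the culprit, one workaround is to hand the disk the file contents as a string instead of a stream; a sketch, assuming Laravel's Storage facade and the same "webdav" disk from the question:
use Illuminate\Support\Facades\Storage;

// Reading the upload into memory and writing it as a string avoids
// passing a PHP stream through the WebDAV cURL client. Fine for small
// PDFs, but memory-hungry for large files.
$file = $request->file('file_'.$item);
Storage::disk('webdav')->put(
    $path.'/'.$item.'.'.$file->getClientOriginalExtension(),
    file_get_contents($file->getRealPath())
);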

How to parse file without using eval - PHP?

Is there any other way to parse the file without eval()? I'm trying to render the PHP code without using PHP tags inside index.gs, and so far I can only do it with eval(). The problem is not only parsing variables, but also custom template characters.
Here is a sample of the code:
$render = file_get_contents($this->file);
$render = $this->parse_extends($render);
$render = $this->parse_assets($render);
$render = $this->parse_vars($render);
$render = $this->parse_vars_skip($render);

try {
    ob_start();
    eval('?>' . $render);
    $render = ob_get_contents();
} finally {
    ob_end_clean(); // always discard the buffer, even if eval() throws
}

return $render;
The returned $render goes back to the View::class code for the response.
If the allow_url_include directive is enabled in php.ini, then it's possible to execute this code using
include "data://text/plain;base64," . base64_encode($render);
but this setting is disabled by default and cannot be changed from user code, only by editing php.ini; so unless it is explicitly enabled there (and there normally isn't any good reason why it should be), it isn't really an option.
An alternative is to create a temporary file, write the code there, and then execute it using include:
$tempFilename = tempnam("/tmp", "MyTemplate");
file_put_contents($tempFilename, $render);
try {
    include $tempFilename; // wrap in ob_start()/ob_get_clean() as above to capture the output
} finally {
    unlink($tempFilename); // remove the temp file even if the include throws
}
But both have similar issues and dangers to eval().

PHP fwrite and file_put_contents create partially damaged tif and pdf files [duplicate]

I am writing a PHP script that goes through a table and extracts the varbinary(max) blob data from each record into an external file. The code is working perfectly (I used virtually the same code to go through some images), except when a file is over 4096 bytes: the data is truncated at exactly 4096.
I've modified the values for mssql.textlimit, mssql.textsize, and odbc.defaultlrl without any success.
Am I missing something here?
<?php
ini_set("mssql.textlimit", "2147483647");
ini_set("mssql.textsize", "2147483647");
ini_set("odbc.defaultlrl", "0");

include_once('common.php'); // Connection to DB takes place here.

$id = $_REQUEST['i'];
$q = odbc_exec($connect, "Select id,filename,documentBin from Projectdocuments where id = $id");

if (odbc_fetch_row($q)) {
    echo "Trying $filename ... ";
    $fileName = "projectPhotos/docs/".odbc_result($q, "filename");
    if (file_exists($fileName)) {
        unlink($fileName);
    }
    if ($fh = fopen($fileName, "wb")) {
        $binData = odbc_result($q, "documentBin");
        fwrite($fh, $binData);
        fclose($fh);
        $size = filesize($fileName);
        echo ("$fileName<br />Done ($size)<br><br>");
    } else {
        echo ("$fileName Failed<br>");
    }
}
?>
OUTPUT
Trying ... projectPhotos/docs/file1.pdf Done (4096)
Trying ... projectPhotos/docs/file2.zip Done (4096)
Trying ... projectPhotos/docsv3.pdf Done (4096)
etc.
Instead of setting odbc.defaultlrl to 0, try setting it to an actual value instead:
ini_set("odbc.defaultlrl", "100K");
If you're using mssql (freetds), look in /etc/freetds.conf for a setting called "text size". Mine was set to 64512 and that's exactly what my images were being truncated to. I set it to 5MB (5242880) and it's working like a charm now.
text size = 5242880
According to this comment in the manual, you must set the INI settings before connecting, which does not seem to be your case.
I know this is ancient, but I solved this differently. ini_set does not work for mssql.textlimit or mssql.textsize; this is documented on php.net.
Both of these default to 4096 bytes (4k) in the php.ini file. Reset them to a higher value there and it will work fine.
Don't forget to restart the httpd service afterwards.
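For example, in php.ini (the values below are illustrative; pick limits that fit your data):
; php.ini — these cannot be raised via ini_set() at runtime
mssql.textlimit = 2147483647
mssql.textsize = 2147483647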

PHP 'copy' not working

Can anybody tell me why this function isn't copying the file at all?
$pluginfile = get_bloginfo('template_url') . '/wp-content/plugins/supersqueeze/supersqueeze.php';
$urlparts = get_bloginfo('template_url');
$homeurl = home_url();
$urlstrip = str_replace($homeurl, '..', $urlparts);
$urldest = $urlstrip . '/supersqueeze.php';

function copyemz() {
    global $pluginfile;
    global $urldest;

    if (!@copy($pluginfile, $urldest)) {
        $errors = error_get_last();
    }
}
This file is run from /public_html/wp-admin/plugins.php
I need it to copy the file at ($pluginfile) /public_html/wp-content/plugins/supersqueeze/supersqueeze.php
to ($urldest) /public_html/wp-content/themes/[active wordpress theme] - of course replacing [active wordpress theme] with the directory of the theme.
You need to ensure that you have write permissions to /public_html/wp-content/themes/[active wordpress theme] as well as any other files you may be overwriting.
The second parameter to copy() must be a local file. Make sure it is also a writable destination (chmod), as webbiedave said.
$desturl = "./supersqueeze.php";
The reason is two-fold. First, PHP's HTTP stream wrappers don't support POSTing or PUTting files, which a write-to action would require. Second, your webserver probably wouldn't support HTTP PUT either. (Though a small request-handler script could handle that.)
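Putting both answers together: build the source and destination as filesystem paths rather than URLs. A minimal sketch, assuming a standard WordPress install (WP_PLUGIN_DIR and get_template_directory() are stock WordPress APIs):
// Filesystem paths, not URLs, so copy() writes straight to local disk.
$pluginfile = WP_PLUGIN_DIR . '/supersqueeze/supersqueeze.php';
$urldest    = get_template_directory() . '/supersqueeze.php'; // active theme directory

if (!copy($pluginfile, $urldest)) {
    $error = error_get_last();
    // Inspect $error['message']; this is usually a permissions problem on the theme directory.
}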
