I'm using PHP 7.1.11 on a machine that runs on Windows 10 Home Single Language edition.
I'm using XAMPP server on my machine.
I'm using the following browsers on this machine:
Google Chrome (Version 62.0.3202.94 (Official Build) (64-bit))
Firefox Quantum (57.0.1 (64-bit))
Opera (Version 49.0.2725.47)
Microsoft Edge (41.16299.15.0)
I know the details of header() function and how it works.
But the following program is behaving in a really weird way in all four of the above browsers. Even after output has been sent to the client, the header() function is still working.
How can this be possible?
Below is my code (it gets redirected to the URL I mentioned):
<!DOCTYPE html>
<html>
<body>
<p>Welcome to my website!</p><br />
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
if ($test) {
    echo 'Welcome to my website<br />You\'re in!';
} else {
    header('Location: http://www.espncricinfo.com');
}
?>
</body>
</html>
I was expecting to get the warning 'Cannot modify header information - headers already sent by' but surprisingly it's redirecting me to the URL http://www.espncricinfo.com/?
Why?
If you are somehow using output buffering, either manually or because your PHP is configured to do it automatically, you are still allowed to add headers, as long as the initial buffer has not been flushed.
Output buffering does what the name hints at: it puts output in a buffer to be sent at a later stage, instead of outputting data immediately. Because of this, as long as no body data of the HTTP response message has been sent, you are still able to send header data.
To check whether PHP is configured to automatically buffer output, you can do one of the following:
check the php.ini configuration file and search for the string output_buffering
var_dump( ini_get( 'output_buffering' ) );
phpinfo(); (dumps the whole configuration) and search for the string output_buffering
If output buffering is enabled, the value will either be the buffer size in bytes (if configured as such), or "On" or true, or something to that effect.
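For illustration, a minimal sketch of the effect (assuming output buffering is active, whether via ob_start() or the output_buffering ini setting):
<?php
ob_start(); // start buffering manually (or rely on output_buffering in php.ini)

echo 'Welcome to my website!'; // goes into the buffer, not to the client

// No body bytes have reached the client yet, so this still succeeds:
header('Location: http://www.espncricinfo.com');

// When the script ends, the headers are sent first, then the buffered body.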
I'm in the process of making an app I write for my company more professional.
Until now, I always displayed all errors and simply tested the application so thoroughly that no errors occurred anymore. If one still did, people would be able to tell me the error message.
As the client base is growing a lot (affiliated companies also want to use the app), I cannot do this any longer. So I looked for a way to log error messages instead of displaying them.
This is what I'm using:
error_reporting(E_ALL);
ini_set('display_errors', 1);
ini_set("log_errors", 1);
ini_set("error_log", "logs/php-error.log");
1) undefined_function(); // resulting error doesn't get logged
2) error_log( "user call without parameters" ); // doesn't get logged
3) error_log( "user call with explicit parameters", 3, "logs/php-error.log" ); // gets logged
4) error_log( "user call with explicit parameters", 1, "admin#company.com" ); // sends mail as expected
Does anyone have a clue why 1) and 2) don't work, but 3) and 4) do?
I made sure the folder and file have 777 permissions on the server.
It is an nginx server by the way, and I don't have direct access to its configuration. With the correct hints, I may convince the admin to make the changes though.
Thanks in advance.
NEWS
I just found this line in phpinfo:
error_log: /var/log/php-fpm/helpdesk-error.log
This means the server overrides my value or simply ignores it, right? How do I change this behaviour?
Hello, I need some help with this code in install.php, which has to run before the rest of the program. It brings up an error pointing at fflush() and I don't know what to do. Please help.
<?php
fflush();
authTableCreate();
announceTableCreate();
classTableCreate();
studentTableCreate();
classmembersTableCreate();
attendanceTableCreate();
assignmentTableCreate();
messageTableCreate();
supportTableCreate();
if (!authCheckUserExists('admin')) { // this is their first install probably
    $randpass = 'admin' . mt_rand();
    authAddUser('admin', $randpass, 100, 100); // create default superuser account
    announceAddAnnouncement('Welcome', 'This is the default start page for IntraSchool. You should change this to something that describes how the system works within your school.');
?>
<p>Congratulations! You have successfully setup <em>IntraSchool</em>. You may now login with the username <em>admin</em> and password <em><?=$randpass?></em>. Please change the password to something you can remember.</p>
<?php
} else {
?>
<p>The installation script has reinitialized any deleted tables.</p>
<?php
}
page_footer();
?>
fflush() requires the handle of the file to be flushed. It is likely a typo for flush(); however, as it's apparently at the start of the file, that would do nothing at all. You should just delete the line.
It's only a warning though, so the rest of the script has probably been executed. If it's a once-only setup script then you probably do not need to run it again.
Here's the documentation - always a good place to start.
My understanding of your code is limited, so I'm not sure what you're trying to accomplish here (in particular, it looks like you're doing database operations, for which fflush should not be necessary). That said, here's a little background:
fflush flushes an open file to disk. You need to provide it with a file handle to flush.
When you're writing to a file on your disk, the operating system will often store up a bunch of your data and write it all to the disk at one time, rather than writing each byte as you send it. This is primarily for performance reasons. Sometimes, however, you need to get that data written at a particular point in your program. That's what fflush is for. But for fflush to work, you need to tell it what file you're talking about - that's the file handle mentioned in the documentation.
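For illustration, here is a minimal sketch of fflush() used correctly (the file path is just an example):
<?php
$handle = fopen('/tmp/example.log', 'a'); // fflush() needs an open file handle

if ($handle !== false) {
    fwrite($handle, "important line\n"); // may sit in a write buffer for a while
    fflush($handle);                     // push the buffered bytes out to the OS now
    fclose($handle);                     // closing the handle also flushes
}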
I have been trying to fix this weird PHP session issue for some time now.
Setup: running on IIS v6.0, PHP for Windows v5.2.6
Issue:
At totally random times, the next PHP line after session_start() times out.
I have a file, auth.php, that gets included on every page of an extranet site (to check valid logins).
auth.php
session_start();
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) { // <---- times out here
    // do something ...
}
...
When using the site, I get random "maximum execution time of 30 seconds exceeded" errors at line 2: if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
If I modify this script to
session_start();
echo 'testing'; // <---- times out here
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
    // do something ...
}
...
The random error now happens on line 2 as well (echo 'testing'), which is a simple echo statement. Strange.
It looks like session_start() is randomly causing issues, making the line of code right after it throw a timeout error (even a simple echo statement).
This is happening on all sorts of pages on the site (db-intensive, relatively static ...), which is making it difficult to troubleshoot. I have been tweaking the session variables and timeouts in php.ini without any luck.
Has anyone encountered something like this, or could anyone suggest possible places to look?
Thanks!
A quick search suggests that you should be using session_write_close() to close the session when you are done using it if you are on an NTFS file system. Starting a session locks the session file so no other script can access it while your code is running. For some reason, the lock sometimes doesn't release reliably on Windows/NTFS, so you should manually close the session when you are done with it.
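A minimal sketch of that pattern, reusing the session key from the question:
<?php
session_start();

// Read what you need from the session up front...
$isAuthed = isset($_SESSION['auth']) && $_SESSION['auth'] == 1;

// ...then release the session file lock so parallel requests aren't blocked.
session_write_close();

if ($isAuthed) {
    // rest of the page, which may take a while
}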
I have a hefty PHP script.
So much so that I have had to do
ini_set('memory_limit', '3000M');
set_time_limit (0);
It runs fine on one server, but on another I get: Out of memory (allocated 1653342208) (tried to allocate 71 bytes) in /home/writeabo/public_html/propturk/feedgenerator/simple_html_dom.php on line 848
Both are on the same package from the same host, but different servers.
Above problem solved; new problem below, for bounty.
Update: The script is so big because it crawls a site and parses data from 252 pages, including over 60,000 images, of which it makes two copies. I have since broken it down into parts.
I have another problem now though. When I am writing an image from the outside site to my server like this:
try {
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo, 'w');
    fwrite($h, $imgcont);
    fclose($h);
} catch (Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
All of a sudden it goes to a 500 Internal Server Error page and I have to run it again, at which point it works, because files are only copied if they don't already exist. Is there any way I can receive the 500 response code and send the request to the URL again to make it retry? This is all meant to be an automated process.
If this is memory related, I would personally use copy() rather than file_get_contents(). It supports the file wrappers the same way, and I don't see any advantage in loading the whole file into memory just to write it back to the filesystem.
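A minimal sketch of that approach, reusing $va and $writeTo from the question ($errors is a hypothetical collector for failed srcs):
// copy() streams from the remote URL straight to the local file, so the
// whole image never has to sit in PHP's memory at once (requires allow_url_fopen).
if (!file_exists($writeTo) && !copy($va, $writeTo)) {
    $errors[] = $va; // remember which image failed
}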
Otherwise, your error_log might give you more information as to why the 500 happens.
There are three parties involved here:
Remote - The server(s) that contain the images you're after
Server - The computer that is running your PHP script
Client - Your home computer if you are running the script from a web browser, or the same computer as the server if you are running it from Cron.
Is the 500 error you are seeing being generated by 'Remote' and seen by 'Server' (i.e. the images are temporarily unavailable), or is it being generated by 'Server' and seen by 'Client' (i.e. there is a problem with your script)?
If it is being generated by 'Remote', then see Ali's answer for how to retry.
If it is being generated by your script on 'Server', then you need to identify exactly what the error is - the php error logs should give you more information. I can think of two likely causes:
Reaching PHP's time limit. PHP will only spend a certain amount of time working before returning a 500 error. You can set this to a higher value, or regularly reset the timer with a call to set_time_limit() (sketched at the end of this answer), but that won't work if your server is configured in safe mode.
Reaching PHP's memory limit. You seem to have encountered this already, but it's worth making sure your script still isn't eating lots of memory. Consider outputting debug data (possibly only if you set $config['debug_mode'] = true or something). I'd suggest:
try {
    echo 'Getting ' . $va . '...';
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo, 'w');
    fwrite($h, $imgcont);
    fclose($h);
    echo 'saved. Memory usage: ' . (memory_get_usage() / (1024 * 1024)) . ' <br />';
    unset($imgcont);
} catch (Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
I've also added a line to remove the image from memory, in case PHP isn't doing this correctly itself (in theory that line shouldn't be necessary).
You can avoid both problems by making your script process fewer images at a time and calling it regularly - either using Cron on the server (the ideal solution, although not all shared webhosts allow this), or some software on your desktop computer. If you do this, make sure you consider what will happen if there are two copies of the script running at the same time - will they both fetch the same image at the same time?
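On the first cause above, a minimal sketch of resetting the timer per image ($srcs is a hypothetical name for the array of image URLs; this has no effect in safe mode):
foreach ($srcs as $va) {
    set_time_limit(30); // restart the execution clock for each image
    // ... fetch and save $va as in the snippet above
}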
So it sounds like you're running this process via a web browser. I'm guessing that you may be getting the 500 error from Apache timing out after a certain period, or the process dying, or something funky. I would suggest you do one of the following:
A) Move the image downloading to a background process. You can run the crawl script in the browser, which will write the URLs of the images to be downloaded to the db or something, and another script will fire up via cron and fetch all the images. You could also have this script work in batches of 100 or so at a time to keep memory consumption down.
B) Call the script directly from the command line (this is really the preferred method for something like this anyway, and you should still probably separate the image fetching to another script)
C) If the command line is not an option for some reason, have your browser-loaded script touch a file, and have a cron job that runs every minute and looks for that file. If it exists, the cron job fires up your script; you can have the output written to a file for you to check later, or have it send an email when it's completed. A sketch of this follows below.
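A minimal sketch of option C (all file names and paths are hypothetical):
// trigger.php -- loaded in the browser; it only drops a flag file.
touch('/tmp/run-image-fetch.flag');

// worker.php -- run from cron every minute: * * * * * php /path/to/worker.php
if (file_exists('/tmp/run-image-fetch.flag')) {
    unlink('/tmp/run-image-fetch.flag'); // consume the flag so we only run once
    // ... crawl and fetch the images, logging progress for later inspection
    file_put_contents('/tmp/fetch.log', date('c') . " run finished\n", FILE_APPEND);
}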
Is there any way I can receive the 500 response code and send the request to the URL again to make it retry? This is all meant to be an automated process.
Here's the simple version of how I would do it:
function getImage($va, $writeTo, $retries = 3)
{
    while ($retries > 0) {
        // file_get_contents() returns false on failure, so check explicitly.
        if (($imgcont = file_get_contents($va)) !== false) {
            file_put_contents($writeTo, $imgcont);
            return true;
        }
        $retries--;
    }
    return false;
}
This doesn't create the file unless we successfully get our image file, and it will retry three times by default. You will of course need to add any required exception handling, error checking, etc.
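Usage could then look something like this ($srcs is a hypothetical name for the array of image srcs, and the destination path is only an example):
foreach ($srcs as $va) {
    $writeTo = 'images/' . basename($va); // hypothetical destination path
    if (!file_exists($writeTo) && !getImage($va, $writeTo)) {
        $errors[] = $va; // record the failed src for reporting later
    }
}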
I would definitely stop using file_get_contents() and write the files in chunks, like this:
$read = fopen($url, 'rb');
$write = fopen($local, 'wb');
$chunk = 8096;
while (!feof($read)) {
fwrite($write, fread($read, $chunk));
}
fclose($read);
fclose($write);
This will be nicer to your server, and should hopefully solve your 500 problems. As for "catching" a 500 error, this is simply not possible. It is an irretrievable error thrown by your script and written to the client by the web server.
I'm with Swish; this is not really the kind of task that PHP is intended for. You'd be much better off using some other sort of server-side scripting.
Is there any way I can receive the 500 response code and send the request to the URL again to make it retry?
Have you considered using another library? Fetching files from an external server seems to me more like a job for curl or ftp than file_get_contents(), etc. If the error is external and you're using curl, you can detect the 500 return code and handle it appropriately without crashing. If not, then maybe you should split your program into two files: one fetches a single file/image, and the other uses curl to repeatedly call the first one. Unless the 500 error means that all PHP execution crashes, you would be able to detect the failure and handle it.
Something like this pseudocode:
file1.php:
foreach ($list_of_files as $filename) {
    do {
        $x = call_curl('file2.php', $filename);
    } while ($x == 500);
}
file2.php:
$filename = $_GET['filename'];
$results = use_curl_to_get_page($filename);
echo $results;
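Filling in the pseudocode, a sketch of what the hypothetical call_curl() helper could look like using PHP's curl extension (the localhost URL and parameter name are assumptions):
// Request file2.php for one file and return the HTTP status code,
// so the caller can retry on 500.
function call_curl($script, $filename)
{
    $ch = curl_init('http://localhost/' . $script . '?filename=' . urlencode($filename));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture output instead of echoing it
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status;
}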
Thanks for all your input. I had separated everything by the time I wrote this question, so the crawler fired the image grabber, etc.
I took on board the solution to split the number of images, and that also helped.
I also added a try/catch around the file read.
This was only being called from the browser during testing, but now that it is all up and running it is going to be a cron job.
Thanks Swish and Benubird for your particularly detailed and educational answers. Unfortunately I had no cooperation from the developers on the backend where the images are coming from (long and complicated story).
Anyway, all good now, so thanks. (Swish, how do you call a script from the command line? My knowledge of this field is severely lacking.)