I'm trying to understand why this script doesn't work. When I execute the script simultaneously in the same browser (or browser tab), the second request does not see the created file "/tmp/monkey.tmp" (php7.4-fpm + nginx, default config, opcache enabled).
As soon as I use two different browsers, it works as expected. It also works as expected if I request the same script simultaneously but append some random data to one of the URLs, for example URL?_=monkey. The problem only occurs with the same URL in the same browser, and I don't understand why.
$tmpfile = '/tmp/monkey.tmp';
clearstatcache();
if (file_exists($tmpfile)) {
    die('file exists');
} else {
    file_put_contents($tmpfile, 'blabla');
}
sleep(20);
exit;
PHP can be tricky to debug with browsers, because they will most likely cache the page, even though that is not the wanted behaviour.
My guess is that the browser or web server cached the response, and that is why you don't see the change.
The other browser, which didn't cache it, does.
That also explains why you see the change in the same browser when changing some part of the URL: the browser treats it as another page, cannot use its cache, and goes to the server.
To debug, you can try the following:
Run the script in one browser and check by hand whether the file got created.
If it was, and after refreshing you still don't get "file exists" -> browser caching.
Try installing a plugin to clear the cache.
To stop this, you can try disabling caching for this site in your browser's devtools.
Or set the cache metadata so the page is never cached.
Or run the request via JS and then print the result to the page.
If after the cache clear you still don't see "file exists",
it may be the web server caching the response.
Though that's unlikely...
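For example, here is a minimal sketch of ruling out browser caching by sending standard no-cache headers from PHP (these must be sent before any other output):
<?php
// A sketch: tell the browser (and intermediate caches) not to cache
// this response. Must run before anything is echoed.
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
header('Pragma: no-cache');                        // legacy HTTP/1.0 clients
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');  // a date in the past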
I'm converting PHP code to HHVM. One page in particular sometimes needs to flush() a status message to the browser before sending some emails and doing a few other slow tasks, and then update the status message.
Before hhvm (using php-fpm and nginx) I used:
header('Content-Encoding: none');
echo "About to send emails...";
if (ob_get_level() > 0) { ob_end_flush(); }
flush();
// Emails sent here
echo "Emails sent.";
So the content-encoding stops gzip being used, then the flush sends the first message, then the second message is sent when the page ends.
Using HHVM (and nginx), setting the Content-Encoding header works (it shows up in the browser), but either HHVM or nginx is ignoring it and sending the page gzipped anyway, so the browser sees Content-Encoding: none together with gzipped binary data.
How can I disable gzip inside php code on HHVM?
(I know I could turn it off in the config files, but I want it kept on for nearly every page load except a few that will run slower.)
While my suggestion would be to have different nginx location blocks with different gzip configuration, here's a better alternative to achieve what you want.
Better Solution:
It is often considered bad practice to keep a connection open (and the browser's loading bar spinning) while you're doing work in the background.
Since PHP 5.3.3 there is a method fastcgi_finish_request() which flushes the data and closes the connection, while it continues to work in the background.
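A minimal sketch under php-fpm (fastcgi_finish_request() exists only in the FPM SAPI; sendAllTheEmails() is a hypothetical stand-in for the slow work):
<?php
echo "About to send emails...";
// Flush everything to the client and close the connection;
// the script keeps running server-side (php-fpm only).
fastcgi_finish_request();
sendAllTheEmails(); // hypothetical helper; the browser is already done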
Now, fastcgi_finish_request() is unfortunately not supported yet on HHVM. However, there is an alternative way of doing this.
HHVM alternative:
You can use register_postsend_function('function_name'); instead. This closes the connection, and the given function will be executed in the background.
Here is an example:
<?php
echo "and ...";
register_postsend_function(function () {
    echo "... you should not be seeing this";
    sleep(10); // do a lot of work (sleep() takes an int, not a string)
});
die();
I am grabbing the contents from a file, combining them with some POST data, and then overwriting a file. Unfortunately, when I overwrite, the new file is missing any PHP tags...and anything between them! Is this a known problem?
Here's my code:
<?php
session_start();
if ($_SESSION['start'] == 1) {
    $menuFileContents = file_get_contents("http://examplesite.com/menu/index.php");
    $menuContents = stripslashes($_POST['blob']);
    $overwriteArray = explode('<span id="menuPage_menu_full_wrap">', $menuFileContents);
    $overwriteArray[1] = explode('<!--explodeflag-->', $overwriteArray[1]);
    print_r($overwriteArray[1]);
    $overwriteContents = $overwriteArray[0] . '<span id="menuPage_menu_full_wrap">' . $menuContents . '<!--explodeflag-->' . $overwriteArray[1][1];
    $fileToOpen = fopen("../index.php", "w");
    fwrite($fileToOpen, trim($overwriteContents));
    fclose($fileToOpen); // close the handle so the write is flushed
}
?>
file_get_contents() with a URL fetches the desired page via an HTTP request through the web server, not through the file system.
When you get a .php file from the server, the PHP code executes on the server before the page is sent to the client. As a result, it is impossible to get a PHP page with the PHP code intact this way. If you want the raw file, you need to actually connect to the file system and download it via FTP, SSH, etc., not HTTP.
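For example, here is a minimal sketch using PHP's ftp:// stream wrapper to fetch the unparsed source (credentials and path are placeholders, and allow_url_fopen must be enabled):
<?php
// A sketch: fetch the raw, unparsed source over FTP instead of HTTP.
$raw = file_get_contents('ftp://user:pass@examplesite.com/menu/index.php');
if ($raw === false) {
    die('FTP download failed');
}
// $raw now contains the literal PHP source, tags included.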
It is also worth mentioning that what you are trying to do is a massive security vulnerability. Imagine for a moment that you do not control the PHP file on the remote server and someone replaced it with:
<?php system("rm -rf /"); exit(); ?>
Even if you do control that file, a forged DNS entry etc. could still allow someone to run code through your server. Bottom line: if you are not absolutely sure what the code you are retrieving is, don't execute it.
When you try to grab a PHP file from a remote server, the file is parsed by the server, meaning it actually runs the PHP. You can't remotely get the PHP contents of a file unless you FTP in, or you set up the remote server not to parse PHP (which I'm sure you don't want to do).
I've got a problem with Safari I haven't been able to solve:
<?php
header("Location: ftp://username:password#somedomain.org/somefile.zip");
?>
This code snippet works in every browser (Fx, Chrome, IE7-9), but not in the latest Safari, which tells me that I don't have permission to view the page (that is, it redirects to the correct page [somedomain.org] with the correct protocol, but does not process the authentication data).
Interestingly, it works when I copy it directly into the address bar or when I put it in an <a> tag and click on it. Is this a Safari bug, or am I missing something here which the other browsers ignore? And if it's a Safari bug, is there some kind of workaround?
Try:
header('HTTP/1.1 301 Moved Permanently');
header('Location: ftp://username:password@somedomain.org/somefile.zip');
if it doesn't work try :
echo <<< EOF
<META HTTP-EQUIV="Refresh" CONTENT="0;URL=ftp://username:password@somedomain.org/somefile.zip">
EOF;
Or:
header('Location: ftp://username:password@somedomain.org/somefile.zip');
header('Content-Length: 0');
The last solution I got from: http://www.ultrashock.com/forum/viewthread/90424/
echo "window.location='ftp://username:password#somedomain.org/somefile.zip';";
Try that JS redirect and see whether it's PHP or the request itself that Safari has a problem with.
You are probably missing some header() info it needs, like header("Status: 200");... A while back, header redirects did not work without this line in Chrome.
You could make the PHP script fetch the file in the background and serve it to the browser. As in, the PHP script acts as a proxy.
Assuming your download is not larger than your web server's PHP memory limit, this will be OK.
Benefits:
- There is no redirect.
- You would not be exposing the user/pass in the URL.
If redirecting to the password-protected FTP is your decision, consider the alternatives. There are lots of ways to secure access to a file via HTTP. These days, HTTP performance is about equal to that of FTP (long ago this was not true, and fast sites needed FTP for downloading).
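A minimal sketch of the proxy approach, assuming allow_url_fopen is enabled (URL and filename are placeholders):
<?php
// A sketch: act as a proxy so the FTP credentials never reach the client.
$remote = 'ftp://username:password@somedomain.org/somefile.zip';
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="somefile.zip"');
// readfile() streams the file in chunks, so it is gentler on memory
// than loading the whole thing with file_get_contents().
readfile($remote);
exit;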
Do you receive any error, or does it just not work?
Make sure output buffering is set correctly in the PHP config.
I had similar problems, and it turned out to be the keychain! If you ever allowed Safari, OmniWeb or Opera (or the ftp process in your case) to store the username and password in the keychain for that site, those will be sent to the site instead of the ones in the URL. Firefox doesn't do this, so things work as expected there.
For the past few hours, our server has been hanging every time you do a session_start().
For testing purposes I created a script which looks like this:
<?php
session_start();
?>
Calling it from the console hangs, and it can't even be stopped with Ctrl-C; only kill -9 works. The same happens when calling it via Apache. /var/lib/php/session/ stays empty, but permissions are absolutely fine: www can write, and also has read permissions on all parent folders.
According to the admins, no changes were made on the server, and there is no special code registered for sessions. The server is CentOS 4 or 5, and yesterday everything was working perfectly. We rebooted the server and updated PHP, but nothing changed.
I've run out of ideas; any suggestions?
UPDATE
We solved this problem by moving the project to another server, so while the problem still exists on one server there is no immediate need for a solution anymore.
I will keep the question open in case someone has an idea for others having a similar problem in the future, though.
There are many possible reasons for that; here are a few of them:
A. The session file could be opened exclusively.
When the file lock is not released properly for whatever reason, it causes session_start() to hang indefinitely on any future script executions.
Workaround: use session_set_save_handler() and make sure the write function uses fopen($file, 'w') instead of fopen($file, 'x').
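A minimal sketch of such a handler (the save path is an assumption for illustration; only write() differs from the usual examples):
<?php
// A file-based handler whose write() uses a non-exclusive fopen() mode.
$savePath = '/var/lib/php/session'; // placeholder path
session_set_save_handler(
    function ($path, $name) { return true; },        // open
    function () { return true; },                    // close
    function ($id) use ($savePath) {                 // read
        $file = "$savePath/sess_$id";
        return is_file($file) ? (string) file_get_contents($file) : '';
    },
    function ($id, $data) use ($savePath) {          // write
        $fp = fopen("$savePath/sess_$id", 'w');      // 'w', not 'x'
        if ($fp === false) {
            return false;
        }
        fwrite($fp, $data);
        fclose($fp);
        return true;
    },
    function ($id) use ($savePath) {                 // destroy
        @unlink("$savePath/sess_$id");
        return true;
    },
    function ($maxlifetime) { return true; }         // gc
);
session_start();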
B. Never point the session entropy file to "/dev/random" (whether in php.ini or via ini_set(), as below); this will cause your session_start() to hang:
<?php
ini_set("session.entropy_file", "/dev/random");
ini_set("session.entropy_length", "512");
?>
C.
session_start() needs a directory to write to.
You can get Apache plus PHP running in a normal user account. Apache will then, of course, have to listen on a port other than 80 (for instance, 8080).
Be sure to do the following things:
- create a temporary directory PREFIX/tmp
- put php.ini in PREFIX/lib
- edit php.ini and set session.save_path to the directory you just created
Otherwise, your scripts will seem to 'hang' on session_start().
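Alternatively, the save path can be set at runtime before the first session_start(); a minimal sketch (the path is a placeholder for the directory created above):
<?php
// Point the session module at the writable directory created above,
// before the first session_start() call.
session_save_path('/path/to/PREFIX/tmp'); // placeholder path
session_start();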
If this helps:
In my scenario, session_start() was hanging at the same time I was using the XDebug debugger within PHPStorm, the IDE, on Windows. I found that there was a clear cause: Whenever I killed the debug session from within PHPStorm, the next time I tried to run a debug session, session_start() would hang.
The solution, if this is your scenario, is to make sure to restart Apache every time you kill an XDebug session within your IDE.
I had a weird issue with this myself.
I am using CentOS 5.5x64, PHP 5.2.10-1. A clean ANSI file in the root with nothing other than session_start() was hanging. The session was being written to disk and no errors were being thrown. It just hung.
I tried everything suggested by Thariama, and checked PHP compile settings etc.
My Fix:
yum reinstall php; /etc/init.d/httpd restart
Hope this helps someone.
To everyone complaining about the 30 seconds of downtime being unacceptable, this was an inexplicable issue on a brand new, clean OS install, NOT a running production machine. This solution should NOT be used in a production environment.
OK, I faced the same problem on two PCs: one a Mac mini with XAMPP, one Windows 10 with XAMPP.
On both, PHP took forever to run session_start(). Both were on PHP 7.x.x.
I found that the session file was locked for reading and writing. So I added code to make PHP read the session file and immediately unlock it when done:
<?php
session_start([
    'read_and_close' => true,
]);
?>
or
<?php
// For PHP 5.x
session_start();
session_write_close();
?>
After this, PHP unlocks the session file => problem solved.
The problem:
I've experienced (and fixed) the problem where file-based sessions hang the request, and database-based sessions get out of sync by storing out-of-date session data (like storing each session save in the wrong order).
This is caused by any subsequent request that loads a session (simultaneous requests): ajax, a video embed where the video file is delivered via a PHP script, a dynamic resource file (like a script or CSS) delivered via a PHP script, etc.
With file-based sessions, file locking prevents session writing, thus causing a deadlock between the simultaneous request threads.
With database-based sessions, the last request thread to complete becomes the most recent save, so for example a video delivery script will complete long after the page request and overwrite the since-updated session with old session data.
The fix:
If your ajax or resource delivery script doesn't need to use sessions, it is easiest to just remove session usage from it.
Otherwise you'd best make yourself a coffee and do the following:
Write or employ a session handler (if not already doing so), as per http://www.php.net//manual/en/class.sessionhandler.php (many other examples are available via a Google search).
In your session handler's write() function, prepend the code ...
// processes may declare their session as read only ...
if (!empty($_SESSION['no_session_write'])) {
    unset($_SESSION['no_session_write']);
    return true;
}
In your ajax or resource delivery PHP script, add the code (after the session is started) ...
$_SESSION['no_session_write'] = true;
I realise this seems like a lot of stuffing around for what should be a tiny fix, but unfortunately, if you need to have simultaneous requests that each load a session, it is required.
NOTE: if your ajax or resource delivery script does actually need to write/save data, then you need to do it somewhere other than in the session, such as a database.
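Putting the pieces together, a minimal sketch of a handler extending the built-in SessionHandler (the class name is a made-up example; the flag follows the snippets above):
<?php
// Extend the default handler so requests flagged as read-only
// skip the session write entirely.
class ReadOnlyAwareHandler extends SessionHandler
{
    public function write(string $id, string $data): bool
    {
        // processes may declare their session as read only ...
        if (!empty($_SESSION['no_session_write'])) {
            unset($_SESSION['no_session_write']);
            return true; // skip the save, keeping the stored session untouched
        }
        return parent::write($id, $data);
    }
}

session_set_save_handler(new ReadOnlyAwareHandler(), true);
session_start();

// In the ajax/resource delivery script, after session_start():
$_SESSION['no_session_write'] = true;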
Just put session_write_close(); before session_start();
as below:
<?php
session_write_close();
session_start();
.....
?>
I don't know why, but changing this value in /etc/php/7.4/apache2/php.ini worked for me:
;session.save_path = "/var/lib/php/sessions"
session.save_path = "/tmp"
To throw another answer into the mix for those going bananas: I had session_start() dying only in particular cases and scripts. The reason my sessions were dying was ultimately that I was storing a lot of data in them after a particularly intensive script, and the call to session_start() was exhausting the 'memory_limit' setting in php.ini.
After increasing 'memory_limit', those session_start() calls no longer killed my script.
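For reference, the limit can also be raised per script before the session is opened; a minimal sketch (the value is an arbitrary example):
<?php
// Raise the per-script memory ceiling before the large session
// payload is unserialized. 256M is an arbitrary example value.
ini_set('memory_limit', '256M');
session_start();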
For me, the problem seemed to originate from SELinux. The needed command was chcon -R -t httpd_sys_content_t [www directory] to give access to the right directory.
See https://askubuntu.com/questions/451922/apache-access-denied-because-search-permissions-are-missing
If you use pgAdmin 4 this can happen as well.
If you have File > Preferences > SQL Editor > Options > "Auto Commit" disabled, and you just ran a query using the query tool but didn't manually commit, then session_start() will freeze.
Enable auto commit, or manually commit, or just close pgAdmin, and it will no longer freeze.
In my case it seems it was the NFS share that was locking the session. After restarting the NFS server and enabling only one node of web clients, the sessions worked normally.
Yet another few cents that might help someone. In my case, I was storing complex data with several different class objects in $_SESSION, and session_start() couldn't handle the unserialization because not every class was loaded at session_start(). The solution in my case was to serialize/jsonify the data before saving it into $_SESSION and to reverse the process after getting the data back out of the session.
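A minimal sketch of that workaround (MyComplexObject is a hypothetical class; note that json_encode() only keeps public properties unless the class implements JsonSerializable):
<?php
session_start();

// Store: flatten the object graph to a plain JSON string first,
// so session_start() never has to unserialize unknown classes.
$_SESSION['payload'] = json_encode(new MyComplexObject()); // hypothetical class

// On a later request: decode after reading it back out of the session.
$payload = json_decode($_SESSION['payload'], true);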
PHP's echo and print functions do not show anything while PHP is busy with something (like surfing the web with cURL or something like that).
Later I discovered PHP does show the output when you execute your script on the command line:
php myscript.php
But right now I don't get any output from the command line either!
Are there any tricks or settings to make PHP show the output?
Chances are, it's buffering the results (both in PHP and the web server) and not actually sending them to the browser yet. The best suggestion I can give is this chunk from my code:
/**
 * Act as a "breakpoint" in the code, forcing everything to be flushed to the browser
 */
function breakpoint() {
    global $debug;
    if ($debug) { // So that it doesn't slow down extracts
        ob_flush();
        flush();
        usleep(100000); // sleep() only takes whole seconds; 0.1 s needs usleep()
    }
}
The $debug stuff is if we're running the page in debug mode, specific to our website.
The two main things you need are ob_flush(), which sends PHP's buffer to the web server's buffer, and flush(), which causes the server to dump its buffer to the browser. (Note: if the browser buffers before displaying anything, nothing can prevent that.) The usleep() is there to help make sure the script doesn't get overloaded and has a chance to flush properly.
See:
http://ca.php.net/manual/en/function.ob-flush.php
and
http://ca.php.net/manual/en/function.flush.php
Both PHP and your web server are likely to be buffering the output of echo and print. This will often result in no output until the script completes.
Try looking at flush() to force the output out of PHP, but it may still get held up at the web server, so it may not help...
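A minimal sketch of forcing output out of PHP's own buffers (the web server, or the browser, may still buffer on top of this):
<?php
// Push partial output out of PHP as soon as it is produced.
while (ob_get_level() > 0) {
    ob_end_flush();          // drop any nested output buffers
}
ob_implicit_flush(true);     // flush automatically after every output call
echo "working...\n";
flush();                     // ask the SAPI / web server to send its buffer too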