I'm having a really strange problem.
I wrote a file manager in PHP with the ability to download files, which works fine.
The whole script is built as one big file.
Now, while downloading a big file I'm not able to use the script at the same time for, say, browsing folder contents: the page does nothing but keep loading. As soon as the download is finished, everything works again.
Is there something that prevents PHP from parsing the same file concurrently? Other scripts work like a charm, no matter whether I'm downloading or not.
Help or links to documentation are highly appreciated. :)
Do you use sessions?
If yes, then that's probably the problem. The default session handler uses files, which have to be locked while session-enabled code executes. In practice this means that each user's requests run through your PHP files sequentially. To solve this you must use a custom session handler that uses a DB. Read this.
Edit: I want to point out that writing a custom session handler with no locking can be difficult and introduce various subtle bugs. Read more docs on this if you need to do it!
Edit 2: Sometimes using session_write_close() to close the session when no longer needed is enough (see the comments).
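For illustration only, here is a minimal sketch of a DB-backed handler, assuming PHP 5.4+ (SessionHandlerInterface) and a hypothetical MySQL table sessions(id, data, updated_at); per the edit above, note that it does no locking:

class DbSessionHandler implements SessionHandlerInterface {
    private $pdo;
    public function __construct(PDO $pdo) { $this->pdo = $pdo; }
    public function open($savePath, $name) { return true; }
    public function close() { return true; }
    public function read($id) {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? $row['data'] : '';   // must return a string
    }
    public function write($id, $data) {
        $stmt = $this->pdo->prepare(
            'REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, ?)');
        return $stmt->execute(array($id, $data, time()));
    }
    public function destroy($id) {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE id = ?');
        return $stmt->execute(array($id));
    }
    public function gc($maxlifetime) {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE updated_at < ?');
        return $stmt->execute(array(time() - $maxlifetime));
    }
}
session_set_save_handler(new DbSessionHandler($pdo), true); // $pdo: your PDO connection
session_start();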
Daremon is correct, but you shouldn't need to use a different session handler. If you call session_write_close() before you start sending the file, the lock on the session file will be released and your other scripts should be able to continue.
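A minimal sketch of that ordering (the file path and headers are placeholders):

// Sketch: release the session lock before streaming a large file.
session_start();
// ... permission checks that need $_SESSION go here ...
session_write_close();             // lock released; other pages load again
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');
header('Content-Length: ' . filesize($path)); // $path: file to serve (assumed)
readfile($path);
exit;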
Is this normal? I have tried two different scanners and they both produce the same errors, so I'm assuming it has to do with the way scanners work, as I don't get these errors when viewing the site normally (the site also functions fine, as expected).
Of course, I could just ignore it and clear the error log (when I've finished scanning my site), but I'm curious and would like to know why (and if there is a 'fix')?
FYI:
When the site is being scanned, I get the same error repeatedly (on pages where session_regenerate_id(true) is included).
The scanners I have tried are both free Firefox add-ons: 'SQL Inject Me' (by Security Compass) and Websecurify.
I'm running these tests on my site (via 'localhost').
The sessions are file based.
The issue is with your code and the way you handle sessions.
You are running a scanner that goes through your website, tries to manipulate various inputs, and then analyzes the responses to see if the attack succeeded.
When you have session_regenerate_id(true) on a page, that function call tries to delete the current session and start a new one.
The reason this happens with the scanner is that the scanner is probably taking paths through the application that a typical user wouldn't take. So if the scanner hits a URL that executes PHP code containing session_regenerate_id(true) and there isn't a current session, then you will receive that error.
Ideally you should check whether a current session exists before calling session_regenerate_id(true). If it doesn't, you should take the user to the login page and ask them to re-authenticate, as sketched below.
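A minimal sketch of such a guard (session_status() needs PHP 5.4+; 'user_id' and the login URL are hypothetical):

// Assumes session_start() already ran earlier in the request.
if (session_status() === PHP_SESSION_ACTIVE && isset($_SESSION['user_id'])) {
    session_regenerate_id(true);    // safe: there is a session to replace
} else {
    header('Location: /login.php'); // hypothetical login page
    exit;
}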
Please let me know if you have any questions.
Upgrade your PHP version to the current stable one: PHP 5.4.7.
Run the security checks again. If the error still happens (which I highly doubt), this needs further inspection. In that case (and even earlier) you should add your session configuration (it is larger than you might think) to the question.
Also, the code which triggers this is unknown. The function session_regenerate_id() is normally used in the context of session management.
Because the concept of sessions is not that easy to grasp, many developers (myself included) make mistakes in this area more often. Some useful debugging functions:
session_status
headers_list
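For instance, a quick sketch of using them (session_status() needs PHP 5.4+; output varies by setup):

var_dump(session_status() === PHP_SESSION_ACTIVE); // is a session running?
var_dump(headers_list()); // headers queued so far, e.g. the session Set-Cookie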
Try using session_write_close(); before session_regenerate_id(true).
A possible reason for the problem is that your code is making parallel requests that contend for the session lock. I had the same trouble and this helped me.
I have a web service serving data from a MySQL database. I would like to create a cache file to improve performance. The idea is that once in a while we read data from the DB and generate a text file. My question is:
What if a client-side user is accessing the file while we are generating it?
We are using LAMP. PHP has flock() to handle concurrency problems, but my understanding is that it only applies when two PHP processes access the file simultaneously. Our case is different.
I don't know whether this will cause issues at all. If so, how can I prevent it?
Thanks,
Don't use locking.
If your cache file is /tmp/cache.txt, then you should always regenerate the cache to /tmp/cache2.txt and then do a
mv /tmp/cache2.txt /tmp/cache.txt
or
rename('/tmp/cache2.txt', '/tmp/cache.txt');
The mv/rename operation is atomic as long as it happens inside the same filesystem; no locking is needed.
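A short PHP sketch of the same pattern (the temp naming and data source are assumptions):

// Regenerate the cache into a temp file on the SAME filesystem,
// then atomically swap it into place; readers see either the old
// or the new file, never a half-written one.
$tmp = '/tmp/cache.txt.' . getmypid();   // unique temp name (assumption)
file_put_contents($tmp, $freshData);     // $freshData: the rebuilt cache body
rename($tmp, '/tmp/cache.txt');          // atomic within one filesystem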
All sorts of optimisation options here:
1) Are you using the MySQL query cache? That can take a huge load off the database to start with.
2) You could pull the file through a web proxy like Squid (or Apache configured as a reverse caching proxy). I do this all the time and it's a really handy technique; generate the file by fetching it from a URL using wget, for example (that way you can have it in a cron job). The web proxy takes care of either delivering the same file that was there before, or regenerating it if need be.
3) You don't want to be rolling your own file locking solution in this scenario.
Depending on your scenario, you could also consider caching pages in something like memcache, which is fantastic for high-traffic scenarios, but possibly beyond the scope of this question.
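A hedged sketch of that idea using the PECL Memcached extension (server address, key name, TTL, and the rebuild function are all assumptions):

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);    // assumed local memcached instance
$page = $mc->get('cache.txt');
if ($page === false) {                 // miss: rebuild and store
    $page = buildCacheBody();          // hypothetical rebuild function
    $mc->set('cache.txt', $page, 300); // keep for 5 minutes
}
echo $page;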
You can use A -> B switching to avoid this issue.
E.g.: keep two copies of the cache file, A and B, and have the program read them via a symlink, C.
While the program is building the cache, it modifies the file that is not "current", i.e. if C links to A, update B. Once the update is complete, switch the symlink to B.
Next time, update A and switch the symlink to A once the update is complete.
This way clients never read a file while it is being updated.
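A sketch of the switch in PHP (paths are assumptions; since symlink() fails if the link already exists, the new link is created beside C and then renamed over it, which is atomic):

$next = '/var/cache/app/cache_B.txt';           // the non-current copy (assumed)
file_put_contents($next, $freshData);           // rebuild it ($freshData assumed)
symlink($next, '/var/cache/app/cache.txt.new'); // fresh link next to C
rename('/var/cache/app/cache.txt.new', '/var/cache/app/cache.txt'); // swap C atomically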
When a client accesses the file, it reads it as it is at that moment.
flock() is for when two PHP processes access the file simultaneously.
I would solve it like this:
While generating the new text file, save it to a temporary file (cache.tmp); that way the old file (cache.txt) is still accessed like before.
When generation is done, delete the old file and rename the new one.
To avoid problems during that short window, your code should check whether cache.txt exists and retry for a short period of time.
Trivial, but that should do the trick.
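A short sketch of the reader side with a retry (the retry count and delay are arbitrary assumptions):

$cache = 'cache.txt';
for ($i = 0; $i < 5; $i++) {
    $data = @file_get_contents($cache); // @: suppress the warning mid-swap
    if ($data !== false) {
        echo $data;
        break;
    }
    usleep(100000);                     // wait 100 ms, then retry
}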
I'm not quite sure how to phrase this question correctly, so I'll start with the scenario I encountered.
I have a bit of processing in my web app that takes longer than I'd like the user to wait before regaining control of the page, so I decided to handle it with an ajax request.
The problem is, even though I offloaded this work into an ajax request, it seems that Apache won't process any further requests until the original processor-heavy request completes.
I originally wanted to know how I could get around this issue, but have since decided that it might be a bad idea in general.
However, I'm still curious if anyone knows why Apache behaves this way, and what (if any) configuration directive controls it. My original thought was KeepAlive, but disabling that didn't seem to change the behavior.
I am running php through mod_php if that makes a difference.
I appreciate any help getting pointed in the right direction!
Are you using file-based sessions? PHP will lock session files for each request and maintain that lock until you do a session_write_close() or the script terminates/exits. The side effect of this is that all requests become serial, since they're all contending for the same single resource (the session file).
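A sketch of applying that to a long-running ajax endpoint (the session key and work function are placeholders):

// ajax-endpoint.php (sketch)
session_start();
$userId = $_SESSION['user_id'];    // hypothetical: copy what you need first
session_write_close();             // release the lock; other requests proceed
do_expensive_processing($userId);  // placeholder for the heavy work
header('Content-Type: application/json');
echo json_encode(array('status' => 'done'));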
I am sure it's the session file. I have the same problem: I run a long request, such as a phpMyAdmin SQL insert that takes multiple minutes to process. While it is processing, I try to open a new tab in the same browser and go to any page on my website, and it will not load until the original phpMyAdmin request is done.
If I open an incognito window in Chrome (the same browser), it works fine. The same goes for any other browser.
So it is probably the file-based session, which is the default for PHP.
Others have mentioned going to memcached. You could also save sessions in the database.
Before resorting to memcached, you could do all the session-based work at the beginning: copy the session variables you need into temporary variables, then close the session. If you need to set a session value later, reopen the session, make the change, and close it again quickly, as sketched below.
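A minimal sketch of that pattern (the session keys are assumptions; reopening only works cleanly if no output has been sent yet):

session_start();
$user = $_SESSION['user'];         // copy what you need (assumed key)
session_write_close();             // release the lock early
// ... long-running work using $user, no session lock held ...
session_start();                   // reopen briefly to write
$_SESSION['last_run'] = time();    // assumed key
session_write_close();             // release again immediately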
Can you point to evidence that it's Apache? Unless your Apache setup isn't optimal, most likely your page wait is something else; maybe you have set your ajax call to be synchronous (non-async)?
I'm trying to implement SwfUpload in my web page, and I'm using PHP to save files on the server. As it's the first time I've used this component, I chose to run the algorithm suggested by the SwfUpload team (http://swfupload.org/forum/generaldiscussion/214): I've put it in a file and told the control to use it as the code file.
It didn't work, hence I'm asking for help, but what really drives me crazy is that I don't know how to debug the thing! The request to the file is encapsulated in the flash object, and I can't get any feedback from it if something goes wrong.
Is anyone more experienced than me with this control?
Thanks
One thing you can do if all else fails is to make upload.php append your debug messages to a log file. Example: file_put_contents("swfupload.log", print_r($_REQUEST, 1), FILE_APPEND);
The only tricky thing to understand about SWFUpload is that the Flash component runs with a different cookie jar (for security reasons), so you need to manually tell it (via a flash param) the session id you currently have on the server. When it makes the HTTP request to upload.php it passes that session id in a $_GET param, and the PHP script resumes the session with that specific id: session_id($_GET['SESSION_ID']); session_start();. From that point on, upload.php behaves just like any other PHP code with your session data available. You get the $_FILES, move them to their respective folder, save them in the DB, and that's it.
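Putting both tips together, a sketch of the top of upload.php ('SESSION_ID' matches the flash param name used above, 'Filedata' is SWFUpload's default file field name, and the destination directory is an assumption):

// upload.php (sketch)
if (isset($_GET['SESSION_ID'])) {
    session_id($_GET['SESSION_ID']);   // adopt the browser's session id
}
session_start();
// log everything while debugging, per the tip above
file_put_contents('swfupload.log', print_r($_REQUEST, true), FILE_APPEND);
if (isset($_FILES['Filedata'])) {
    move_uploaded_file($_FILES['Filedata']['tmp_name'],
        '/var/uploads/' . basename($_FILES['Filedata']['name']));
}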
Well SWFUpload needs to upload the file to some script, so you can log to a file from within that script based on data received...
Oh, that's a pain to debug.
One way to get at the script output (which might be a fatal error that you can't log, or similar) is to use a proxy like Fiddler that shows you all HTTP traffic.
It's sometimes hard to get Flash to use the proxy. You may need to configure the proxy in IE even if you use another browser.
I have an application on Apache2. The application plays media files like video files, mp3 files and wav files through a PHP file, in order to prevent direct downloads by unregistered users. Now I'm having problems: while a media file is loading, I cannot go to another page of the application until the media file is fully loaded.
I don't know if it's an issue with Apache2 or PHP. Can anybody help me find a solution to this issue? It seems that the same client cannot load two PHP pages from the same site at once.
I'm looking forward to your answers. Thanks in advance.
If you are using sessions, I suggest you call session_write_close() before you output the file to the browser.
This is because while the session is open on one page, you cannot load another page until the session has been written and released. session_write_close() is called automatically when your script ends, but because sending the file takes a long time before the script ends, the session file stays locked and other pages cannot be viewed.
However, if you use a different browser and/or system it will be fine, because the session file lock is per session ID.
Look at: http://php.net/manual/en/function.session-write-close.php
However, do take note that after session_write_close() you cannot simply call session_start() again once output has begun, or there will be an E_WARNING. Also, any changes you make to $_SESSION after closing will not be saved.