I don't know why, but whenever the PHPSESSID cookie is created, all my scripts that use cURL get slow. Could someone tell me why?
Even without session_start(), it doesn't matter.
What version of PHP are you using? I know there were very big performance improvements from PHP 4 to PHP 5.
Sessions write the information to a file, so there might be write issues.
If you are using your own session handler, that can cause a slowdown too. Try using only PHP's basic session methods rather than writing your own.
Maybe you have activated the php.ini setting session.auto_start.
http://www.php.net/manual/en/session.configuration.php#ini.session.auto-start
As mentioned before: How is cURL and PHP related in your question? Please provide some code.
Without seeing more info / code, all I can suggest is that you call session_write_close() before making any cURL calls and see if that improves anything (see the sketch below).
However, the most probable scenario is that your cURL speed is not related to the PHPSESSID cookie.
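Building on the session_write_close() suggestion above, here is a minimal sketch of the idea; the URL is only a placeholder:

<?php
// Sketch only: release the session lock before the cURL call so other
// requests from the same browser aren't blocked waiting on the session
// file while the transfer runs.
session_start();
$_SESSION['last_fetch'] = time();

// Write and close the session now; the slow transfer below no longer
// holds the lock.
session_write_close();

$ch = curl_init('https://example.com/api/resource');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);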
I am a bit puzzled about using PHP sessions. Are there any specific requirements or conditions for using them? From all the other forums, I can only see that session_start() should be the first line of the PHP code, and nothing else beyond that.
On my test server (with PHP 7.3) it took me a while to get it to work. The only thing I found out is that when I load my site over http, the session doesn't persist across page loads, while over https it works fine.
While on this topic, I am also confused why this tutorial site, https://www.w3schools.com/php/php_sessions.asp, doesn't work even though it is loaded over https.
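For context, here is a minimal session round trip, plus one hedged guess at the http-vs-https behaviour: if the session cookie carries the Secure flag (session.cookie_secure, or a Secure cookie set elsewhere), browsers only send it back over https, so over http every request would get a fresh session. That is an assumption, not a confirmed diagnosis:

<?php
// Minimal session round trip. session_start() must run before any
// output is sent. The cookie_secure echo is only there to check the
// hedged guess described above.
session_start();

if (!isset($_SESSION['visits'])) {
    $_SESSION['visits'] = 0;
}
$_SESSION['visits']++;

echo 'Visits this session: ' . $_SESSION['visits'];
echo ' / session.cookie_secure = ' . var_export(ini_get('session.cookie_secure'), true);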
I want to share a problem I had, the fix I found and then ask a question about the reason behind the fix.
The problem
After upgrading to WampServer 2.2, one of my webpages consistently didn't work the first time it was loaded in the browser. This happened with Internet Explorer, Chrome, Firefox and Safari. When reloaded, the page worked in all browsers.
The fix
I decided to implement a better debugging solution and, while doing so, inadvertently fixed my problem. When I set output_buffering = On in php.ini, the page worked correctly.
My code
I'm not going to go into detail here. I'm more interested in the theory of how output_buffering could be causing problems. Also, I think my code would be more of an eyesore than a help.
I used AJAX and Joomla sessions (an external script) to retrieve information for the page.
I believe that when output_buffering was off, the Joomla session was not able to retrieve values. I'm not able to confirm this yet, though.
My question
In what ways can output_buffering = Off adversely affect code, and why?
Output buffering simply allows you to hold off on displaying data that would otherwise be immediately printed to the browser. These are used largely with templating engines in order to store unrendered templates so that they can be filled in with values. I'm guessing Joomla depends on output buffering to fill in the correct values for its templates, which would explain why you were seeing invalid output.
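As a rough illustration of the templating use case (the function and file names here are made up, not Joomla's actual API):

<?php
// Illustrative only: how a templating layer can lean on output
// buffering. The template is include()d, its echoed HTML is captured,
// and the rendered string is returned for later use instead of being
// sent straight to the browser.
function render_template(string $file, array $vars): string
{
    extract($vars);          // expose $vars to the template as variables
    ob_start();              // start capturing output
    include $file;           // the template echoes HTML using those variables
    return ob_get_clean();   // return the captured markup and clear the buffer
}

// Usage: $html = render_template('user_card.php', ['name' => 'Alice']);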
"I used ajax and joomla sessions (external script) to retrieve information for the page."
That is your problem. You're retrieving content that only arrives after a certain time delay.
Refer to this, it may help you understand how it works: https://stackoverflow.com/a/2832179/817419
As it turned out, one of the files being called by the webpage was encoded incorrectly. Once I re-encoded it as UTF-8 without BOM, my problem was largely fixed. My script would work without output_buffering turned on.
The other part of the problem was that some of the scripts that used Firebug complained about headers already being sent. This stopped the code in its tracks.
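For anyone hitting the same thing, a quick way to spot the culprit is to check the first three bytes of a suspect file: a UTF-8 BOM in front of the opening PHP tag is emitted as output, which is exactly what triggers "headers already sent". The filename below is just a placeholder:

<?php
// Check whether a file starts with a UTF-8 BOM (EF BB BF).
$bytes = file_get_contents('suspect_file.php', false, null, 0, 3);
if ($bytes === "\xEF\xBB\xBF") {
    echo 'File starts with a UTF-8 BOM; re-save it as UTF-8 without BOM.';
}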
I have a script where users can input an image URL (from another website) and then crop it using JS and have it saved on my server.
My question is... when getting the image from another server, is it safer to use cURL or allow_url_fopen (via file_get_contents())? Or is there a preferred/safer method available?
Security is a big concern for me, as I know this is a very dangerous procedure - the script only needs to work with image files, if that makes a difference.
Thanks
curl's error handling is much better than file_get_contents(). If you care about that, curl's probably the way to go. If a simple "oops, that didn't work" is enough for you, though, file_get_contents() is a perfectly acceptable shortcut.
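A small sketch of that difference (the URL is only an example):

<?php
// cURL tells you what went wrong; file_get_contents() mostly just
// returns false (plus a warning).
$url = 'https://example.com/image.jpg';

// cURL: the failure can be inspected in detail.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FAILONERROR, true);   // treat HTTP errors (>= 400) as failures
$data = curl_exec($ch);
if ($data === false) {
    echo 'Download failed: ' . curl_error($ch) . ' (errno ' . curl_errno($ch) . ')';
}
curl_close($ch);

// file_get_contents(): simpler, but you only learn that it failed.
$data = @file_get_contents($url);
if ($data === false) {
    echo 'Download failed.';
}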
First of all, if you want to get into a deep security discussion: downloading files is in fact a security concern if you don't know what you are doing.
You can overwrite vital files or even system files in some cases. Uploading scripts, etc. to the server with the intention of executing them via the web server is also an issue.
So it's not all sunshine and rainbows, as some people here suggest.
Back to your question: allow_url_fopen is a configuration directive; I assume you meant file_get_contents(). Either will do fine. As others pointed out, cURL is a bit more verbose, and it's also faster.
If you do end up using file_get_contents(), make sure you never pass an unfiltered variable as a parameter.
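A hedged sketch of what "never pass an unfiltered variable" can look like in practice; the parameter name, the scheme whitelist and the uploads path are illustrative, and this is not a complete security layer:

<?php
$url = $_GET['image_url'] ?? '';

// Reject anything that isn't an http(s) URL; this blocks file://,
// php://, data:// and plain local paths.
$scheme = parse_url($url, PHP_URL_SCHEME);
if (filter_var($url, FILTER_VALIDATE_URL) === false ||
    !in_array($scheme, ['http', 'https'], true)) {
    die('Invalid URL.');
}

$data = @file_get_contents($url);

// Confirm the payload really is an image before writing it to disk.
if ($data === false || @getimagesizefromstring($data) === false) {
    die('Not a valid image.');
}

file_put_contents(__DIR__ . '/uploads/' . uniqid('img_', true), $data);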
Downloading a file is not a security concern in itself, no matter whether it's your server or your own computer, or the application/code you are using to download it :) The concern is whether you execute the file :D
All you have to do is just make sure you are not going to EXECUTE / INCLUDE anything in the file. Since you are only going to crop the image, I think you are good to go :)
I suggest cURL though; allow_url_fopen may raise security problems in other places in your code.
cURL has more options and possibilities.
Both are equally safe (or unsafe if misused).
It is wiser to use cURL because you will build experience with a more powerful tool, which may serve you in future projects.
Also, if this very project needs new functionality later on, you will not have to rewrite everything with cURL if file_get_contents() turns out not to be enough.
The answer in this thread shows a nice cURL function: Replace file_get_content() with cURL?
cURL would generally be the safer way. It'd take an explicit design/coding decision on your part to allow the results from cURL to directly affect your program, whereas allowing URLs in the f*() functions would let
include('http://example.com/really_nasty_remote_takeover.php');
occur without error.
I use the file_get_contents() function to grab data from sites and store it in a database. It would be very inconvenient for me if one day the script stopped working.
I know that it can stop working if they change the structure of the site, but now I'm afraid that maybe there are mechanisms to disable this function, maybe from the server?
I tried to find documentation about this but couldn't, so maybe you can help me?
Thanks
"I know that it can stop working if they change the structure of the site, but now I'm afraid that maybe there are mechanisms to disable this function, maybe from the server?"
Yes, it can be disabled from php.ini with the allow_url_fopen option. You also have other options, such as the cURL extension.
Note also that you will need the openssl extension enabled in php.ini if you are going to use file_get_contents() to read from a secure (https) URL.
So in case file_get_contents() is or gets disabled, you can go for the cURL extension.
It is possible to disable certain functions using disable_functions. Furthermore, support for URLs in filesystem functions like file_get_contents() can be disabled with allow_url_fopen. So chances are that file_get_contents() might not work as expected one day.
There are at least two PHP configuration directives that can break your script:
If allow_url_fopen is disabled, file_get_contents() will not be able to fetch files that are not on the local disk, i.e. it will not be able to load remote pages via HTTP.
Note: I've seen that option disabled quite a few times.
And, of course, with disable_functions, any PHP function can be disabled.
Chances are pretty low that file_get_contents() itself will ever get disabled...
But remote-file loading... Well, it might be wise to add an alternative loading mechanism to your script that uses cURL in case allow_url_fopen is disabled.
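A rough sketch of such a fallback; fetch_url() is a made-up helper name, not an existing API:

<?php
function fetch_url(string $url): ?string
{
    // Prefer file_get_contents() when allow_url_fopen is enabled.
    if (filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN)) {
        $data = @file_get_contents($url);
        return $data === false ? null : $data;
    }

    // Otherwise fall back to cURL if the extension is loaded.
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $data = curl_exec($ch);
        curl_close($ch);
        return $data === false ? null : $data;
    }

    return null; // neither mechanism is available
}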
We've recently enabled APC on our servers, and occasionally when we publish new code or changes we discover that the source files that were changed start throwing errors that aren't reflected in the code, usually parse errors describing a token that doesn't exist. We have verified this by running php -l on the files the error logs say are affected. Usually a republish fixes the problem. We're using PHP 5.2.0 and APC 3.01.9. My question is, has anyone else experienced this problem, or does anyone recognize what our problem is? If so, how did you fix it or how could we fix it?
Edit: I should probably add in some details about our publishing process. The content is being pushed to the production servers via rsync from a staging server. We enabled apc.stat_ctime because it said this helps things run smoother with rsync. apc.write_lock is on by default and we haven't disabled it. Ditto for apc.file_update_protection.
Sounds like a part-published file is being read and cached as broken. apc.file_update_protection is designed to help stop this.
In php.ini: apc.file_update_protection (integer)
The apc.file_update_protection setting puts a delay on caching brand new files. The default is 2 seconds, which means that if the modification timestamp (mtime) on a file shows that it is less than 2 seconds old when it is accessed, it will not be cached. The unfortunate person who accessed this half-written file will still see weirdness, but at least it won't persist.
Following the question being edited: one reason I don't see these kinds of problems is that I push a whole new copy of the site (with svn export). Only after that is fully completed does it become visible to Apache/mod_php (see my answer to How to get started deploying PHP applications from a subversion repository?).
The other thing that may happen, of course, is that if you are updating in place, you may be updating files that depend on others that have not yet been uploaded. Rsync can only guarantee atomic updates for individual files, not for the entire collection being changed/uploaded. Another reason, I think, to upload the site en masse and only then put it into use.
It sounds like APC isn't performing the stat or is getting incorrect file stat info. You could check to make sure the APC configuration directive apc.stat is set correctly. Another thing you could do is force the cache to clear with apc_clear_cache() when you publish new code.
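One way to wire that up is a small deploy hook you hit after each publish; the endpoint and its token check below are hypothetical and would need proper protection (or run it from the command line instead):

<?php
// Hypothetical deploy hook, kept PHP 5.2-friendly to match the setup in
// the question: clear APC after pushing new code so stale compiled
// files aren't served.
if (!isset($_GET['token']) || $_GET['token'] !== 'replace-with-a-secret') {
    header('HTTP/1.0 403 Forbidden');
    exit;
}

if (function_exists('apc_clear_cache')) {
    apc_clear_cache();        // opcode cache
    apc_clear_cache('user');  // user/variable cache
    echo 'APC caches cleared';
}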
Never saw that before, even though I'm a heavy user of APC.
Maybe try triggering a script that empties the APC opcode cache every time you push new code to the server?
When you get a file with a parse error, back it up, then republish. Take the same file that now works and diff it against the file with the parse error.
ctime means creation time. You will want to manually flush your entire cache every time you do updates.
You can easily do this by putting the apc.php script somewhere on your server. This script gives you cache statistics and will allow you to drop the cache altogether.
The script comes with APC.
Hope this helps,
Evert
This is probably happening because there's a mismatch between your code, and the cached versions of the code.
For example, APC has a cached version of User.php, but you made changes to User.php or to the data that User uses. The cached version is still running even after your deploy, because it hasn't expired yet.
If you clear your APC cache entries when you deploy, this issue should disappear.