What's the difference when using GET or POST method? Which one is more secure? What are (dis)advantages of each of them?
It's not a matter of security. The HTTP protocol defines GET requests as safe and idempotent, meaning they should not change anything on the server, while POSTs may have side effects. In plain English, that means that GET is used for viewing something without changing it, while POST is used for changing something. For example, a search page should use GET, while a form that changes your password should use POST.
Also, note that PHP confuses the concepts a bit. A POST request gets input from the query string and through the request body. A GET request just gets input from the query string. So a POST request is a superset of a GET request; you can use $_GET in a POST request, and it may even make sense to have parameters with the same name in $_POST and $_GET that mean different things.
For example, let's say you have a form for editing an article. The article-id may be in the query string (and, so, available through $_GET['id']), but let's say that you want to change the article-id. The new id may then be present in the request body ($_POST['id']). OK, perhaps that's not the best example, but I hope it illustrates the difference between the two.
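To make that concrete, here is a minimal PHP sketch (the script name edit_article.php and the field names are made up for the illustration):

<?php
// edit_article.php?id=42 -- the current article id arrives via the query string
$currentId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// The edit form is submitted with method="post", so the new id arrives in the body
$newId = isset($_POST['id']) ? (int) $_POST['id'] : $currentId;

if ($_SERVER['REQUEST_METHOD'] === 'POST' && $newId !== $currentId) {
    // ... update the record currently identified by $currentId to use $newId ...
}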
When the user enters information in a form and clicks Submit , there are two ways the information can be sent from the browser to the server: in the URL, or within the body of the HTTP request.
The GET method, which was used in the example earlier, appends name/value pairs to the URL. Unfortunately, the length of a URL is limited, so this method only works if there are only a few parameters. The URL could be truncated if the form uses a large number of parameters, or if the parameters contain large amounts of data. Also, parameters passed in the URL are visible in the browser's address field, which is not the best place for a password to be displayed.
The alternative to the GET method is the POST method. This method packages the name/value pairs inside the body of the HTTP request, which makes for a cleaner URL and imposes no size limitations on the form's output. It is also more secure.
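As a small illustration of the "name/value pairs appended to the URL" part (the endpoint search.php and the parameters are hypothetical):

<?php
$params = ['q' => 'http methods', 'page' => 2];

// GET: the data becomes part of the URL itself
$getUrl = 'https://example.com/search.php?' . http_build_query($params);
// -> https://example.com/search.php?q=http+methods&page=2

// With POST the same pairs would travel in the request body instead,
// so the URL stays clean and the URL length limit does not apply.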
The best answer was the first one.
You are using:
GET when you want to retrieve data (GET DATA).
POST when you want to send data (POST DATA).
There are two common "security" implications to using GET. Since the data appears in the URL string, it's possible that someone looking over your shoulder at the address bar/URL may see something they should not be privy to, such as a session token that could potentially be used to hijack your session. Keep in mind everyone has a camera phone.
The other security implication of GET has to do with GET variables being logged to most web servers' access logs as part of the requested URL. Depending on the situation, the regulatory climate, and the general sensitivity of the data, this can potentially raise concerns.
Some clients/firewalls/IDS systems may frown upon GET requests containing an excessive amount of data and may therefore provide unreliable results.
POST supports advanced functionality such as support for multi-part binary input used for file uploads to web servers.
POST requires a Content-Length header, which may increase the complexity of an application-specific client implementation, as the size of the submitted data must be known in advance, preventing the client request from being formed in an exclusively single-pass, incremental mode. Perhaps a minor issue for those choosing to abuse HTTP by using it as an RPC (Remote Procedure Call) transport.
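For the multi-part file-upload support mentioned above, a minimal receiving script might look like this (the field name attachment and the uploads directory are assumptions for the sketch):

<?php
// Handles a form submitted with method="post" and enctype="multipart/form-data"
if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['attachment'])) {
    $file = $_FILES['attachment'];
    if ($file['error'] === UPLOAD_ERR_OK) {
        // basename() strips any path the client may have supplied
        $name = basename($file['name']);
        move_uploaded_file($file['tmp_name'], __DIR__ . '/uploads/' . $name);
    }
}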
Others have already done a good job in covering the semantic differences and the "when" part of this question.
I use GET when I'm retrieving information from a URL and POST when I'm sending information to a URL.
You should use POST if there is a lot of data, or sort-of sensitive information (really sensitive stuff needs a secure connection as well).
Use GET if you want people to be able to bookmark your page, because all the data is included with the bookmark.
Just be careful of people hitting REFRESH with the GET method, because the data will be sent again every time without warning the user (POST sometimes warns the user about resending data).
This W3C document explains the use of HTTP GET and POST.
I think it is an authoritative source.
The summary is (section 1.3 of the document):
Use GET if the interaction is more like a question (i.e., it is a safe operation such as a query, read operation, or lookup).
Use POST if:
The interaction is more like an order, or
The interaction changes the state of the resource in a way that the user would perceive (e.g., a subscription to a service), or
The user should be held accountable for the results of the interaction.
The GET and POST methods have nothing to do with the server technology you are using; they work the same in PHP, ASP.NET, or Ruby. GET and POST are part of the HTTP protocol.
As mark noted, POST is more secure. POST forms are also not cached by the browser.
POST is also used to transfer large quantities of data.
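Whatever the server technology, the method is just part of the HTTP request; in PHP, for instance, you can branch on it with a couple of lines (a minimal sketch):

<?php
if ($_SERVER['REQUEST_METHOD'] === 'GET') {
    // safe read: render the page or return the requested data
} elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // state-changing action: validate the input and update the record
}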
The reason for using POST when making changes to data:
A web accelerator like Google Web Accelerator will click all (GET) links on a page and cache them. This is very bad if the links make changes to things.
A browser caches GET requests, so even if the user clicks the link, it may not send a request to the server to execute the change.
To protect your site/application against CSRF you must use POST. To completely secure your app you must then also generate a unique identifier on the server and send that along in the request.
Also, don't put sensitive information in the query string (only option with GET) because it shows up in the address bar, bookmarks and server logs.
Hopefully this explains why people say POST is 'secure'. If you are transmitting sensitive data you must use SSL.
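A rough sketch of the "unique identifier" (CSRF token) idea in PHP, assuming PHP 7+ and a hidden form field named csrf_token (both are my choices, nothing mandated):

<?php
session_start();

// When rendering the form: generate a token once and remember it in the session
if (empty($_SESSION['csrf_token'])) {
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
}
// Embed it in the form as <input type="hidden" name="csrf_token" value="...">

// When handling the POST: reject requests whose token does not match
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $sent = $_POST['csrf_token'] ?? '';
    if (!hash_equals($_SESSION['csrf_token'], $sent)) {
        http_response_code(403);
        exit('Invalid CSRF token');
    }
}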
GET and POST are HTTP methods which can achieve similar goals.
GET is basically just for getting (retrieving) data. A GET request should not have a body, so aside from cookies, the only place to pass information is in the URL, and URLs are limited in length. GET is less secure than POST because the data sent is part of the URL.
Never use GET when sending passwords, credit card numbers or other sensitive information! The data is visible to everyone in the URL and can be cached.
GET is harmless when reloading the page or hitting the back button, it can be bookmarked, its parameters remain in the browser history, and only ASCII characters are allowed.
POST may involve anything, like storing or updating data, ordering a product, or sending e-mail. A POST request has a body.
The POST method is better suited for passing sensitive and confidential information to the server: it is not visible in the URL's query parameters, and the parameters are not saved in the browser history. There are no restrictions on data length. When reloading, the browser should alert the user that the data is about to be re-submitted. A POST request cannot be bookmarked.
All or perhaps most of the answers in this question and in other questions on SO relating to GET and POST are misguided. They are technically correct and they explain the standards correctly, but in practice it's completely different. Let me explain:
GET is considered to be idempotent, but it doesn't have to be. You can pass parameters in a GET to a server script that makes permanent changes to data. Conversely, POST is considered not idempotent, but you can POST to a script that makes no changes to the server. So this is a false dichotomy and irrelevant in practice.
Further, it is a mistake to say that GET cannot harm anything if reloaded - of course it can if the script it calls and the parameters it passes are making a permanent change (like deleting data for example). And so can POST!
Now, we know that POST is (by far) more secure because it doesn't expose the parameters being passed, and it is not cached. Plus you can pass more data with POST and it also gives you a clean, non-confusing URL. And it does everything that GET can do. So it is simply better. At least in production.
So in practice, when should you use GET vs. POST? I use GET during development so I can see and tweak the parameters I am passing. I use it to quickly try different values (to test conditions for example) or even different parameters. I can do that without having to build a form and having to modify it if I need a different set of parameters. I simply edit the URL in my browser as needed.
Once development is done, or at least stable, I switch everything to POST.
If you can think of any technical reason that this is incorrect, I would be very happy to learn.
The GET method is used to send less sensitive data, whereas the POST method is used to send sensitive data.
Using the POST method you can send a larger amount of data than with the GET method.
Data sent by the GET method is visible in the browser's address bar, whereas data sent by the POST method is not.
Use the GET method if you want to retrieve a resource from a URL. You can always see the last page again if you hit the back button of your browser, and it can be bookmarked, so it is not as secure as the POST method.
Use the POST method if you want to 'submit' something to the URL. For example, when you create a Google account you fill in all the detailed information and then hit the 'submit' button (the POST method is called here). Once you have submitted successfully and then hit the back button of your browser, you will get an error or a new blank form instead of the last page with the filled-in form.
I find this list pretty helpful
GET
GET requests can be cached
GET requests remain in the browser history
GET requests can be bookmarked
GET requests should (almost) never be used when dealing with sensitive data
GET requests have length restrictions
GET requests should be used only to retrieve data
POST
POST requests are not cached
POST requests do not remain in the browser history
POST requests cannot be bookmarked
POST requests have no restrictions on data length
The GET method:
It is suitable only for sending small amounts of data, since the URL length is limited
When using this method, the information can be seen in the browser's address bar
It is the default method used by forms
It is not very secure.
The POST method:
It can be used to send an unlimited amount of data.
With this method, the information cannot be seen in the browser's address bar
You must explicitly specify the POST method
It is more secure than the GET method
It provides more advanced features
This is a relatively simple question to which I haven't been able to find the answer (or my search skills are just rubbish). Say I send $_POST content through jQuery/AJAX; the content is handled by PHP and a result is returned, finishing the function.
How long does this $_POST content still "live"? Is it "destroyed" as soon as the AJAX / PHP function is done?
Or does it remain in the system for a while?
HTTP is a stateless protocol, so the web server does not need to keep any data for the duration of multiple requests. When your web server sends multiple requests to PHP (usually PHP-FPM), it treats every request separately.
So if you send an HTTP POST request (which has a Content-Type of application/x-www-form-urlencoded or multipart/form-data) to PHP, the data will be loaded into $_POST before your script is executed. You can access it while your script is running, but afterwards all variables are deleted from memory, including $_POST.
If you want to use data across multiple requests, you need to persist it somehow, for example using Sessions, files, databases, shared memory, APCu.
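For example, a minimal sketch of carrying POST data over to a later request via the session (the key draft_comment is made up):

<?php
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // $_POST only exists for this request; copy what you need into the session
    $_SESSION['draft_comment'] = $_POST['comment'] ?? '';
}

// On a later, completely separate request the value is still available
$draft = $_SESSION['draft_comment'] ?? '';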
I'm using AJAX to insert data in MySQL database. During the AJAX request, there is a PHP function that loops inside a JSON array in order to get data and to insert it inside the DB. Everything works fine.
But, I would like to know if there is a way to pass, during the AJAX request a PHP var to jQuery in order to append it in HTML or to retrieve the data with console.log. I can get these info on AJAX complete but is it possible to get info during AJAX request?
I think you can just echo the php var on the page?
E.g. echo "<label>".$phpVarToAppend."</label>";
Nope. HTTP is stateless. You make a request and you get a result.
You must use a different technique to check on the request's processing progress from the server.
Given that you store a record of the progress during processing somewhere (a DB, a cache, or whatever), the simplest trick is to use another AJAX call to a simple function that returns the last recorded progress.
This is a traditional polling mechanism.
A more advanced solution could be using a different connection upgraded to websockets. This will be a true realtime channel.
On top of these there's a world of possibilities. It only depends on what you need to manage with your POST request and how long the processing takes.
For big payloads it's usually better to return immediately and start processing in a different task (and thus use a polling mechanism to check the progress).
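A hedged sketch of the polling approach in PHP (the job_id parameter, the file-based progress store and the progress.php endpoint are all assumptions; a database or APCu would work the same way):

<?php
// worker.php -- the long-running POST handler records its own progress as it goes
$rows  = json_decode($_POST['rows'] ?? '[]', true);   // the JSON array sent by the AJAX call
$jobId = preg_replace('/[^a-z0-9]/i', '', $_POST['job_id'] ?? 'default');
foreach ($rows as $i => $row) {
    // ... insert $row into the database ...
    file_put_contents("/tmp/progress_$jobId", (string) (int) (($i + 1) / count($rows) * 100));
}

<?php
// progress.php -- polled by a second AJAX call; returns the last recorded percentage
$jobId = preg_replace('/[^a-z0-9]/i', '', $_GET['job_id'] ?? 'default');
echo (int) @file_get_contents("/tmp/progress_$jobId");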
Is it possible to achieve true multi-threading with Ajax? If so, how? Please give me some related information, websites or books.
It depends on what you mean by "multithreaded".
JavaScript code is distinctly single-threaded. No JavaScript code will interrupt any other JavaScript code currently executing on the same page. An AJAX (XHR) request will trigger the browser to do something and (typically) call a callback when it completes.
On the server, each AJAX request is a separate HTTP request. Each of these will execute on its own thread. Depending on the Web server config, they may not even execute on the same machine. But each PHP script instance will be entirely separate, even if calling the same script. There is no shared state per se.
Now browsers typically cap the number of simultaneous Ajax requests a page can make on a per host basis. This number is typically 2. I believe you can change it but since the majority of people will have the default value, you have to assume it will be 2. More requests than that will queue until an existing request completes. This can lead to having to do annoying things like creating multiple host names like req1.example.com, req2.example.com, etc.
The one exception is sessions but they aren't multithreaded. Starting a session will block all other scripts attempting to start the exact same session (based on the cookie). This is one reason why you need to minimize the amount of time a session is opened for. Arguably you could use a database or something like memcache to kludge inter-script communication but it's not really what PHP is about.
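One common mitigation (a minimal sketch, assuming the session only needs to be read): load what you need, then release the session lock before doing the slow work, so parallel AJAX requests from the same user are not serialized:

<?php
session_start();
$userId = $_SESSION['user_id'] ?? null;   // read whatever you need up front

// Release the session file lock; other requests with the same session cookie can now proceed
session_write_close();

// ... slow work (database queries, remote calls) happens without holding the lock ...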
PHP is best used for simple request processing. A request is received. It is processed and a response is returned. That response could be HTML, XML, text, JSON or whatever. The request could be an HTTP request from the browser or an AJAX request.
Each of these request-response cycles should, where possible, be treated as separate entities.
Another technique used is long-polling. An HTTP request is sent to the server and may not return for a long time. This is used for Web-based chat and other "server push" type scenarios. Sometimes partial responses will be flushed without ending the request.
The last option (on Unix/Linux at least) is that PHP can spawn processes but that doesn't seem to be what you're referring to.
So what is it exactly you're trying to do?
You can't actually multi-thread, but what a lot of larger websites do is flush the output for a page and then use Ajax to load additional components on the fly, so that the user sees content even while the browser is still requesting new information. It's a good technique to know but, like everything else, you need to be careful how you use it.
I have a page that I am performing an AJAX request on. The purpose of the page is to return the headers of an e-mail, which I have working fine. The problem is that this is called for each e-mail in a mailbox. Which means it will be called once per mail in the box. The reason this is a problem is because the execution time of the imap_open function is about a second, so each time it is called, this is executed. Is there a way to make an AJAX call which will return the information as it is available and keep executing to prevent multiple calls to a function with a slow execution time?
There are technologies out there that allow you to configure your server and Javascript to allow for essentially "reverse AJAX" (look on Google/Wikipedia for "comet" or "reverse AJAX"). However, it's not incredibly simple and for what you're doing, it's probably not worth all of the work that goes into setting that up.
It sounds like you have a very common problem: you're firing off a number of AJAX requests, and each needs to do a bit of work that realistically only needs to be done once, and then you'd be good.
I don't work in PHP, but if it's possible to persist the return value of imap_open, or whatever its side effects are, across requests, then you should try to do that and then just reuse that saved resource.
Some pseudocode:
// PHP-flavoured sketch of the idea: only open the connection if it isn't already there
if (!isset($persisted_resource)) {
    $persisted_resource = imap_open($mailbox, $username, $password);   // placeholders
}
// ... then reuse $persisted_resource for every header lookup ...
$header = imap_headerinfo($persisted_resource, $message_number);
where persisted_resource should be some variable stored in session scope, application scope or whatever PHP has available that's longer lived than a request.
Then you can either have each request check this variable so only one request will have to call imap_open or you could initialize it while you're loading the page. Hopefully that's helpful.
Batch your results. Instead of loading all emails at once or loading a single email at a time, you could batch the email headers and send them back. Tweak this number till you find a good fit between responsiveness and content.
The PHP script would receive a range request in this case such as
emailHeaders.php?start=25&end=50
Javascript will maintain state and request data in chunks until all data is loaded. Or you could do some fancy stuff such as create client-side policies on when to request data and what data to request.
The browser is another bottleneck as most browsers only allow 2 outgoing connections at any given time.
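A rough sketch of what an emailHeaders.php like that could look like (the mailbox string, credentials and parameter names are placeholders):

<?php
// emailHeaders.php?start=25&end=50 -- return one batch of message headers as JSON
$start = max(1, (int) ($_GET['start'] ?? 1));
$end   = max($start, (int) ($_GET['end'] ?? $start + 24));

$imap = imap_open('{imap.example.com:993/imap/ssl}INBOX', 'user', 'password');
$headers = imap_fetch_overview($imap, "$start:$end");   // one IMAP call for the whole range
imap_close($imap);

header('Content-Type: application/json');
echo json_encode($headers);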
It sounds as though you need to process as many e-mails as have been received with each call. At that point, you can return data for all of them together and parse it out on the client side. However, that process cannot go on forever, and the server cannot initiate the return of additional data after the http request has been responded to, so you will have to make subsequent calls to process more e-mails later.
The server-side PHP script can be configured to send the output as soon as its generated. You basically need to disable all functionality that can cause buffering, such as output_buffering, output_handler, HTTP compression, intermediate proxies...
The difficult part is that your JavaScript library needs to be able to handle partial input. That is to say: you need access to the downloaded data as soon as it's received. I believe it's technically possible, but some popular libraries like jQuery only let you read the data once the transfer is complete.
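For reference, the server side of that incremental output might look roughly like this in PHP (whether the chunks actually reach the client unbuffered still depends on the web server, compression and proxy settings mentioned above; $emails and fetch_headers() are placeholders):

<?php
// Disable PHP-level buffering so each chunk is sent as soon as it is produced
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

foreach ($emails as $email) {
    echo json_encode(fetch_headers($email)), "\n";   // one line of output per e-mail
    flush();                                         // push it to the client immediately
}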