POST Considered Inconvenient

The POST method is vastly overused in HTML forms, with negative consequences for the user experience. POST requests cannot be bookmarked, linked to, indexed, searched, or cached. For example, recently I wanted to link to a list of top independent albums from RIAA Radar. However, I couldn’t, because the search request was sent via POST. The URL was http://www.riaaradar.com/search.asp, but there was nothing in the URL to indicate whether I was searching for Black Box Recorder or Britney Spears.

All safe requests should be done with GET. GET is vastly friendlier to users and dramatically improves your search engine placement. (In fact, form results sent with POST might as well be invisible to search engines.) POST should only be used for requests that cause some action to be taken: a book to be ordered, a page to be printed, a contract to be signed, etc. Search results and many other things don’t fall into this category.
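The linkability comes from the fact that a GET form serializes its fields into the query string. A minimal sketch in Python (the parameter name here is hypothetical; the real riaaradar.com script may use a different one):

```python
from urllib.parse import urlencode

def search_url(base, **params):
    # A GET form serializes its fields into the query string,
    # so the entire request is captured in one linkable, bookmarkable URL.
    return base + "?" + urlencode(params)

# Hypothetical parameter name for illustration only.
url = search_url("http://www.riaaradar.com/search.asp", artist="Black Box Recorder")
print(url)  # http://www.riaaradar.com/search.asp?artist=Black+Box+Recorder
```

A POST form sends the same name/value pairs in the request body instead, which is exactly why nothing about the search survives in the URL.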

I later discovered that there are URLs for the searches I was trying at RIAA Radar that do include all the relevant information, so I can link to them. However, the fact remains that at least one form on their site specifies POST for search requests. Out of curiosity I saved a local copy and changed the method to GET. Wouldn’t you know it? The form worked. That is, their server-side script accepts the same requests via either GET or POST.

This dual-method approach is a common antipattern in web apps. A script should never blithely accept the same parameters via either GET or POST and return the same result. The script should pick one method and stick to it. If the script is safe, use GET. If the script is unsafe (i.e., has side effects), use POST. It is not appropriate to accept both: that either loses search engine placement and linkability (using POST where GET is called for) or opens security holes (using GET where POST is called for).

On rare occasions you will see a script that does different things when invoked via GET or POST. For instance, a GET to http://www.example.com/forums/birds might return an index page showing the current posts in the forum, while a POST to that page might add a new entry in the forum. However, a different script is used (or at the very least a different action is taken) and different parameters are sent depending on whether a GET or POST is sent to that URL. It would still be wrong if one could add the POST’s parameters to the URL’s query string, and thereby add a new entry in the forum.

Many sites still tell you, incorrectly, that whether to use GET or POST has something to do with how much data you’re sending. That’s false. It’s a statement based on bugs in early browsers that hasn’t been true any time this millennium. Choosing GET or POST has nothing to do with how much data you’re sending and everything to do with whether you want readers to be able to find you. GET allows bookmarks, links, and search engine placement. POST doesn’t. It’s as simple as that.
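That said, the comment thread below does surface one concrete, documented client-side ceiling: Internet Explorer caps URLs at 2,083 characters (Microsoft KB 208427). A hedged, defensive sketch of a fallback rule for form authors who must support it:

```python
from urllib.parse import urlencode

IE_MAX_URL = 2083  # documented Internet Explorer URL limit (Microsoft KB 208427)

def choose_method(base, params):
    """Prefer GET so results stay linkable; fall back to POST only when
    the encoded URL would exceed the most restrictive known browser limit."""
    url = base + "?" + urlencode(params)
    return "GET" if len(url) <= IE_MAX_URL else "POST"
```

For a typical search query the URL is nowhere near the limit, so GET remains the right default.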

20 Responses to “POST Considered Inconvenient”

  1. Mike Colbert Says:

    JSF anyone?

  2. Charlie Thomas Says:

    “Many sites still tell you, incorrectly, that whether to use GET or POST has something to do with how much data you’re sending. That’s false. It’s a statement based on bugs in early browsers that hasn’t been true any time this millennium.”

Yes, warnings about using GET when you’re sending a lot of data are all over the Web. But I’ve found it very difficult to find any authoritative information on just what limits — if any — there are for modern browsers using GET. Does anyone know what the limits really are for IE, Firefox, etc. these days? Or are you saying that GET requests can be essentially unlimited in modern browsers? Thanks in advance for any light you can shed on this.

  3. Noah Campbell Says:

Theoretically, you’re right. The length of the URL can be unlimited. In practice, though, I think it is not so convenient.

    You’ll also need to consider any proxies, firewalls, and caches in between the browser and the server. There are a lot of independent pieces involved in serving web pages, and a lot of them may not be as forgiving as a modern browser.

  4. Charlie Thomas Says:

Noah, thanks for the reminder that the maximum size for a GET is limited by more than the browser. I would be interested to know what others consider a “safe” number of bytes to send as a GET these days. One thousand? Five thousand? More? I’m talking about a situation where the Web application will be used at many different locations, each with different servers, configurations, etc. As a developer, at what size would you decide to use POST instead of GET? Thanks in advance.

  5. Elliotte Rusty Harold Says:

    If there is a limit, it’s so high as to be bandwidth limited. I have sent very large (megabytes) amounts of data through single GET requests, and the only limit I have encountered in years is in server side frameworks like PHP. When that’s a problem, you fix the server side framework and reinstall it. The clients and intermediaries just work.

  6. Ed Davies Says:

    If all else fails, read the standard. RFC 2616 section 3.2.1 says:

    The HTTP protocol does not place any a priori limit on the
    length of a URI. Servers MUST be able to handle the URI of any
    resource they serve, and SHOULD be able to handle URIs of
    unbounded length if they provide GET-based forms that could
    generate such URIs. A server SHOULD return 414 (Request-URI
    Too Long) status if a URI is longer than the server can handle
    (see section 10.4.15).

    but then it goes on to say:

    Note: Servers ought to be cautious about depending on URI
    lengths above 255 bytes, because some older client or proxy
    implementations might not properly support these lengths.

    which is probably the source of the idea that there could be a problem. I guess it’s a matter of statistics: how many real proxies, etc, out there actually have a limit.

  7. Charlie Thomas Says:

    Elliotte and Ed, thanks for the info. This will definitely cause me to revisit my decision to use POST — solely because of size concerns — in some situations where I’d otherwise use GET. If 10K or so can safely be sent in a GET request, then that’s enough for my likely worst-case situation.

  8. bugfox blog » Blog Archive » POST Considered Inconvenient Says:

    […] More from Elliotte Rusty Harold on a common abuse of HTTP. […]

  9. Christof Höke Says:

Regarding the question of how long a URL can actually be: I did some quick googling last week in another context, and there are limits in browsers. See e.g. http://support.microsoft.com/kb/208427, which says that IE, at least, has a limit of 2083 characters. See also http://www.boutell.com/newfaq/misc/urllength.html, where more data is provided.

So I guess you should not use GET if you have lots of data, but for the above-mentioned search query it is certainly sufficient.

  10. Gavri Fernandez Says:

About half the time when I find a website that uses POST and want to bookmark the result of a form submission, I notice that the application does not differentiate between GET and POST, so I can bookmark it anyway. (I use the Web Developer Firefox extension for this purpose. A couple of clicks; no need to save the page locally.)

I’m surprised that you claim there are no length limits on URLs in browsers. The most widely used browser, IE6, has a length limit of 2083 characters. (I hit this limit recently. No warning, no error message, nothing. The browser simply does not make the request.)

    http://support.microsoft.com/kb/208427

  11. Sergey Kornilov Says:

IE6 won’t submit a GET form if the combined length of the request is more than about 2000 bytes. It works fine in Firefox, though.

If you use JavaScript to submit such a form, it gives you a syntax error.
If you use a simple Submit button, it does nothing.

  12. Charlie Thomas Says:

Sergey, did you find documentation on the IE6 GET limit somewhere, or did you have to find this out through experimentation? When I tried to look into this a few years ago I couldn’t find much explicit documentation — just a lot of veiled warnings and “urban legends”. Maybe things haven’t improved much. Thanks.

  13. Sergey Kornilov Says:

We develop software named PHPRunner that generates PHP applications. After we changed the search page to use GET instead of POST, one of our users found that his search page didn’t work anymore. Experimenting with it, I found that the GET limit in IE is about 2000 bytes.
I haven’t tested this with IE7 yet.

    Found it:
    Maximum URL length is 2,083 characters in Internet Explorer
    http://support.microsoft.com/kb/208427

  14. Charlie Thomas Says:

    Sergey, thank you! It’s not good news but the documentation is just what I was looking for. Looks like I’ll have to stay with POST for a while after all.

  15. warpedvisions.org » Blog Archive » POST a PITA? Says:

[…] November 1st, 2006 in Links HTTP POST considered inconvenient. Not quite as snappy as “considered harmful”, but an interesting read. […]

  17. aardvark Says:

Two years ago, in this millennium, we developed an application that submitted user-generated data using GET. At the time, our large QA lab found that large data failed to make it to the servers. Indeed, there are browsers with GET limits, and those limits are greater, but still reachable, with POST. Yep, some urban legends are based in truth. I’m just a tad surprised that the author didn’t run a FEW tests before trying to bust the “myth”.

  18. koray Says:

As for people who say XHTML 1.0 or XHTML 1.1 is hard to learn: I started learning HTML for the first time at the beginning of February 2006, and now, 11 months later, I am able to write all my sites in valid XHTML 1.1. So if I can do it, surely others can pick it up too.

  19. ahm Says:

The failure of XHTML is that the whole concept is illogical. For instance, you define XML to be eXtensible, and that implies that tags are always closed. OK, this is understandable when tags are not defined a priori: you cannot know whether the second tag is inside the first or not, so you disambiguate by always using end tags.
