406 Error on jquery AJAX request - php

This is driving me mad. I'm getting a 406 error when I make an AJAX request on my hosting (Linux), but not on my local machine (Windows 7). Both are running Apache. All other requests with the same format, e.g.
http://cms.hogsmill.com/Lib/actionCMS.php?Action=saveContentText&SectionId=155&Text=...
where ... is the text I'm sending, work fine. The text is encodeURI'ed in JavaScript before sending. The text that breaks it is the following; there seems to be something wrong with the "by default all users..." paragraph.
Any ideas?
To create a new page, click on 'New Page'. To edit an existing page click the edit page icon. In both cases, the editor on the left will popup.
Type in the name of the page in the 'Label' field. The URL for the page will be automatically populated. Only edit the URL if it doesn't look sensible. You can only change the URL for a new page; if you need to change an existing URL, contact Hogsmill.
By default, all pages are accessible by everybody. To limit access just to logged in users, unclick 'All' and select a different user type.
Typically there are two types of users:
Site Admin - able to edit content, and see all logged in only pages.
User - cannot edit content, but can see all logged in only pages.

The best approach is probably to analyze the HTTP headers of your client request. It will contain some "Accept" headers that conflict with the server's response; as I understand it, that is the nature of this anomaly.
Details for 406 status code
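If the browser side doesn't make it obvious, a quick way to compare the two servers is to dump what actually reaches actionCMS.php. This is only a debugging sketch; the log path is a placeholder, and getallheaders() assumes the Apache setup mentioned in the question:

    <?php
    // Temporary debug logging in actionCMS.php: record the incoming request
    // line and headers so the working (local) and failing (hosted) requests
    // can be compared side by side.
    $log = date('c') . "\n"
         . $_SERVER['REQUEST_METHOD'] . ' ' . $_SERVER['REQUEST_URI'] . "\n";
    foreach (getallheaders() as $name => $value) {
        $log .= $name . ': ' . $value . "\n";
    }
    file_put_contents('/tmp/request-debug.log', $log . "\n", FILE_APPEND);

Comparing the Accept and Content-Type lines between the two logs should show whether the client really is asking for something the hosted server refuses to produce.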

This recently happened to me using Chrome. The headers viewed through Chrome's developer tools seemed to indicate everything was fine, and then on the next test it worked without an error, so I have no idea what "fixed" it.


Posting via Facebook API but image does not appear

I am using the Facebook PHP SDK to post content to our company page. Nine times out of ten it works, but on the odd occasion, like today, the post is submitted and its content appears on the page, but without the image provided by the og:image meta tag.
In today's case, a URL was submitted but no image is being displayed. According to Facebook's Sharing Debugger, there is a warning stating that the "provided og:image URL encountered an unexpected error", despite the server access logs showing that the Facebook crawler requested that image URL around the time the URL was submitted and received a 200 response along with 1 MB+ of data.
The Object Debugger tells a different story. It claims the web server is not running or Facebook's crawlers are being blocked, which according to the firewall logs is not the case. There are also warnings stating that the og:url and fb:app_id tags are missing, even though they appear in the raw output.
Scraping the URL again does not pull the image through; I have to manually use the "refresh share attachment" feature to make the image appear.
Is there any way I can re-scrape a URL and refresh the share attachment using PHP without having to do it manually?
Scraping the URL again does not pull the image through; I have to manually use the "refresh share attachment" feature to make the image appear.
Of course it doesn’t, otherwise I would change the kitten picture of that article of mine you posted three days ago, to “buy cheap viagra here” today, and you would automatically have that reflected on your timeline …
Is there any way I can re-scrape a URL and refresh the share attachment using PHP without having to do it manually?
You can (re-)scrape URLs via API – https://developers.facebook.com/docs/sharing/opengraph/using-objects#update
But I am not sure whether updating a post with the exact same values would actually count as an update, and refresh the attachment. You can give it a try (https://developers.facebook.com/docs/graph-api/reference/v3.1/post#updating), but if it doesn’t work, then your only option is to make sure everything is working properly before you make the post. (The JSON data returned from that endpoint should contain the image URL if everything worked properly, I suppose.)
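For the re-scrape itself, the Graph API documented above accepts a POST of the shared URL with scrape=true. A minimal sketch using the Facebook PHP SDK; it assumes $fb is an already-initialised Facebook\Facebook client, $accessToken holds a valid token, and the article URL is a placeholder:

    <?php
    // Ask Facebook's crawler to re-scrape the object URL so the cached
    // og:image (and other Open Graph data) is refreshed before posting.
    try {
        $response = $fb->post('/', [
            'id'     => 'https://www.example.com/article-to-share', // placeholder URL
            'scrape' => 'true',
        ], $accessToken);
        $object = $response->getDecodedBody();
        // $object['image'] should now list the freshly scraped og:image, if any.
    } catch (Facebook\Exceptions\FacebookResponseException $e) {
        error_log('Re-scrape failed: ' . $e->getMessage());
    }

Whether doing this before (or instead of) updating the post actually refreshes the attachment is exactly the open question above, so treat it as something to test rather than a guaranteed fix.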

Why after posting from Curl does the page only display correctly in Incognito mode?

This is a really interesting problem related to cookies, I believe.
My index.php has a form, and it posts to post.php. post.php then manipulates the data and uses curl to post it to webinarjam.net, a third-party service not controlled by me.
webinarjam.net then displays a short success message that basically just contains a unique URL. The unique URL is to a thank-you page (hosted by webinarjam.net).
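For reference, the relay in post.php amounts to roughly the sketch below; the field names and endpoint are placeholders, since the real ones are specific to webinarjam.net:

    <?php
    // post.php: take the submitted form data, reshape it, and relay it to the
    // third-party service with curl. No cookie jar is configured, so the curl
    // request itself carries no session state.
    $payload = [
        'first_name' => $_POST['first_name'] ?? '', // placeholder field names
        'email'      => $_POST['email'] ?? '',
    ];

    $ch = curl_init('https://webinarjam.net/webinar/register'); // placeholder endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($payload),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch); // short success message containing the unique thank-you URL
    curl_close($ch);

Note that the curl step sends no cookies of its own; the cookies that matter are the ones the browser sends later, when the user opens the thank-you URL.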
This all works beautifully. But here is the problem:
Clicking the URL only works in Incognito mode (?!) (i.e. clearing browser cookies first). Otherwise, webinarjam.net simply displays "Internal Server Error".
I have no idea why the presence of cookies would PREVENT a page from displaying.
How could I change my post.php such that the unique thank-you page URL will display correctly even without using Incognito mode of the browser?
Figuring this out would enable me to simply automatically redirect the user to that thank-you page URL upon form submission. (Currently, redirecting would just bring her to a page that says "Internal Server Error".)
P.S. In case this helps, I've also used the Advanced Rest Client extension within Chrome to try to post the same query data to webinarjam.net; the resulting unique thank-you page URL can then be displayed without using Incognito mode. So... what I need help figuring out is: what difference exists between the way my post.php and the Advanced Rest Client extension are posting to webinarjam.net?
I finally noticed that it wasn't just Incognito mode that worked; it was any browser OTHER than my main one.
And I eventually figured out that it was because my main browser was logged into the EverWebinar site.
I guess EverWebinar barfs (fails) whenever a logged-in admin of a webinar submits a form to sign up for that webinar (even using a different email address).
It seems like upon form submission, the system notices the session/cookies of the logged-in admin, and it says "no matter what, someone who owns this webinar shouldn't be signing up for it."

jQuery form doesn't show submission message on web server but does on localhost

I am stuck on a strange problem. My form works fine on my local machine, but when I upload it to a web server it does not. Locally, when I add some values and click Submit, it shows a Thank You message. But once it is uploaded, when values are added and Submit is pressed, the values are entered into the database but no Thank You message is shown; the page just remains as it is. Please suggest a solution. What should I do to make it work online? Should I send the single HTML file that contains the form? Have a look at the form here: http://dl.dropbox.com/u/33855631/lon_dec/form.htm
I also tried uploading it to different servers like Bluehost etc., but no luck.
You're indeed making a cross-domain request (to http://www.londondeclaration.com/wp-admin/admin-ajax.php), which your browser doesn't allow. Either host the front-end and back-end on the same domain, or (if that's not possible) host a proxy to the external source on your own domain.
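A same-domain proxy can be as small as the sketch below. The admin-ajax.php URL is the one from the question; the file name and the decision to forward $_POST unfiltered are assumptions:

    <?php
    // proxy.php, hosted on the same domain as the form: forwards the POSTed
    // fields to the WordPress AJAX endpoint and echoes the response back, so
    // the browser only ever talks to its own origin.
    $ch = curl_init('http://www.londondeclaration.com/wp-admin/admin-ajax.php');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($_POST),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    echo curl_exec($ch);
    curl_close($ch);

The jQuery on the page would then post to proxy.php on its own domain instead of the remote URL.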
As @PPvG already mentioned, this looks like cross-domain scripting. In general, it is possible to perform cross-domain scripting, but you must set the corresponding HTTP headers as specified here. Here's what happens in detail:
the user accesses a web page on DomainA that includes some JavaScript (i.e. jQuery)
the user clicks the submit button and jQuery fires a request to your server on DomainB
the result is returned to the user's browser, but per policy the client forbids the scripts from DomainA from examining the response retrieved from DomainB. It's important to understand that security is enforced on the client.
How to solve the problem: Your application on DomainB must set the correct HTTP response header, so that the browser allows your jQuery script from DomainA to work with the response from DomainB:
Access-Control-Allow-Origin:DomainA
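In PHP, that means emitting the header before any output; the origin value here is a placeholder:

    <?php
    // Sent by the responding script on DomainB so that pages served from
    // DomainA are allowed to read this response.
    header('Access-Control-Allow-Origin: http://domain-a.example');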
This may still not work in all situations. E.g. Internet Explorer enforces fairly rigid rules when it comes to HTTPS, and if I remember correctly cookie management is a problem as well.
EDIT: In Google Chrome you can easily see that this is the problem:

GET and POST data for Mozilla

I am doing my work in PHP.
I have 3 pages,
A is plain HTML and contains a search field.
B is .php and returns results of the search.
C is also PHP and allows the user to update some details for the displayed results.
When I refresh my B page, or go back from C to B, I get this message:
"To display this page, Firefox must
send information that will repeat any
action (such as a search or order
confirmation) that was performed
earlier."
I saw "When i'm using "POST" method then I get this message, if I'm used GET then
I don't.
Any buddy Explain me ,why???
The GET method should be used to obtain information from a web page.
The POST method should be used to send information to a web page.
The reason it asks you to confirm whether or not to send the information again is that it's not always the user's intention to repost a form when they press back. For example, at an online store you would not want to repost a purchase form, otherwise you could be billed for the product twice. This is theoretical of course, since anyone who builds an online store should ensure that an accidental purchase can't happen.
Also, if you use GET, then all information is appended to the URL of the PHP page. This is a potential security issue, especially if the form contents are private. For such forms, you should be using POST.
A wild guess:
POST data is not written in the URL, so it needs to be resent, while with GET, when you click to return to B, the arguments are still in the URL, so nothing needs to be resent.
Mozilla added this message to warn you against sending the information twice.
For example, with a registration form, you don't want to register twice.
Firefox developers added that warning for the POST method. It will warn you about POST in the case of back/forward as well.
This is an added safeguard for users, because most shopping carts/banking portals use the POST method for checkout/transaction confirmation (actually, I have not seen or developed any web app that uses GET for this purpose).
So, Firefox (and most other common browsers) warns you in this scenario (when you are sending a POST request indirectly, i.e. using the back/forward/refresh button). This prevents the user from checking out multiple times.
Another reason to add this warning is that checkout is sometimes time-consuming. When some time has passed after the original submission, impatient users may think the browser/server has stopped working and press the refresh button. This warning gives them a good hint.
I think the point is that GET requests should be used to get information without changing anything on the server so if you reload the same information there's no issue. POST requests should be used to change data on the server so when you reload the page that may have undesirable effects.
Firefox should normally allow you to navigate back to your B page from your C page. However if your B page is not in the cache, possibly because it sends a Cache-control: no-store header, then you will get the POSTDATA warning.
On the other hand explicitly reloading page B will always generate a POSTDATA warning.
When you submit data with the POST method, the data is sent in the body of the request to the page you submit to. When you refresh the page or go back, your browser repeats that POST request, and Firefox warns you about it.
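A common way to avoid the warning altogether is the Post/Redirect/Get pattern: handle the POST, then redirect the browser to a page fetched with GET, so that refresh and back only repeat the harmless GET. A minimal sketch in PHP; the file and parameter names are placeholders:

    <?php
    // search.php (page B): handle the POSTed search, then redirect so the
    // browser's history entry is a GET request rather than the POST itself.
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $term = $_POST['q'] ?? '';
        // ... run the search and store the results (e.g. in the session) ...
        header('Location: results.php?q=' . urlencode($term), true, 303); // 303 See Other
        exit;
    }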

Using Live HTTP Headers

On a website, I enter some parameters in a form, click on search and then get a page with a message "retrieving your results". After the search is complete, I get another page with my results displayed.
I am trying to recreate this programmatically, and I used Live HTTP Headers to get a peek at what is going on behind the scenes, i.e. the URL, form variables, etc. However, I'm only getting information about what goes on up to the page that shows "retrieving your results". Live HTTP Headers is not giving me information up to the page which contains the final results.
What can I do to get this final bit of information (i.e. the URL, form variables, etc.)?
I use Charles HTTP Proxy for all my HTTP troubleshooting needs. It has a ton of options and works with any browser.
"Web Developer" does this:
https://addons.mozilla.org/en-US/firefox/addon/60
@Mark Harrison
I have Web Developer installed. Initially, I used it to turn off meta-redirects and referrers to get a clearer picture of the HTTP interaction. But when I do this, the website does not work (i.e. it is not able to complete the process of retrieving my search results), so I turned it back on.
I'm wondering if anyone has had to capture HTTP information for a site that has a processing page in between the user input page and the results page.
That sounds weird; I'm pretty sure that Live HTTP Headers should show this. Can you double-check that you aren't missing something? Otherwise, try Firebug. It has a tab for "network", which shows all requests made.
I'm using Fiddler2, which is a free (as in beer), highly configurable proxy; works with all browsers, allows header inspection/editing/automodification on request/response.
Disclaimer: I'm in no way affiliated with Fiddler, just a (very happy) user.
For such problems, I always fire up Ethereal or a similar network-sniffing tool to see exactly what is going on.
The document creates a browser component called XMLHttpRequest. On the submit event, the object's send() method is called; while waiting for the server's response, an HTML element is replaced with a "waiting" message, and on a successful response a callback is called with the new HTML, which is then inserted into the selected element. (That's called Ajax.)
If you want to follow that process, you can use the Firefox Live HTTP Headers extension, or Wireshark, to view the full HTTP headers and actions (GET/POST).
