PHP proxy server working half of the time - php

I know next to nothing about PHP. I'm trying to give away a set of financial calculators that rely on a web service which is accessed via either a .NET proxy or a PHP proxy. I've installed the PHP proxy on 3 different servers (Windows and Linux) and the setup always works for me. Yet webmasters write to me saying they can't get it to run.
I was hoping someone with debugging experience could give these a try:
http://www.pine-grove.com/online-calculators/pgs-html-calculators.htm
Here's more background to save you some time.
There's an install PDF included, but basically: unzip into a folder ("calculators" is suggested), locate js/calculator.js, and at about line 11 edit this line to point to the proxy that is installed:
var strWebService = 'http://{www.your-server.com}/Calculators/proxies/calculators.php';
That's all that should be required. The HTTPRequest object's responseText field contains this error:
soap:Receiver Server was unable to process request. ---> '\' is an unexpected token. The expected token is '"' or '''. Line 1, position 15.
This seems to be working for most people, but for a handful it doesn't.
Thanks in advance, and I hope someone can shed some light on this problem.

A few things:
curl_setopt($ch, CURLOPT_HEADER, 1); is probably not what you want. This causes the response headers to be included as text at the head of $result. By echoing them back, they do not magically become response headers of the request to calculators.php; they are part of the body of the response.
A four second timeout is probably too small. If it works for you but not for the webmaster, then I'm guessing that the cURL request performed by calculators.php timed out for the webmaster and a warning stating this fact was sent back.
These look wrong; PHP concatenates strings with . rather than +, and the second entry is missing a header name (presumably it should be a SOAPAction header):
$header[] = "Content-Type: text/xml; charset=utf-8";
$header[] = 'http://com.pine-grove/' + $wsMethod;
Get rid of the PHP close tag ?> at the end. It's generally not needed and you risk sending back extra whitespace, as in this case, where calculators.php is inadvertently appending CR+LF to the bodies of all proxied responses.
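Putting those points together, here's a rough sketch of what the forwarding part of calculators.php might look like; the service URL, the method query parameter, and the SOAPAction namespace below are placeholders, not the values the real proxy uses.
<?php
// Rough sketch only -- adapt the URL, parameter name and namespace to the real proxy.
$wsMethod    = isset($_GET['method']) ? $_GET['method'] : 'CalculatePayment'; // hypothetical
$soapRequest = file_get_contents('php://input'); // raw SOAP envelope posted by calculator.js

$header   = array();
$header[] = 'Content-Type: text/xml; charset=utf-8';
$header[] = 'SOAPAction: "http://com.pine-grove/' . $wsMethod . '"'; // '.' not '+' to concatenate

$ch = curl_init('http://www.example.com/CalculatorService.asmx'); // placeholder service URL
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $soapRequest);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);   // keep response headers out of the body
curl_setopt($ch, CURLOPT_TIMEOUT, 30); // give slow hosts more than 4 seconds
$result = curl_exec($ch);
curl_close($ch);

header('Content-Type: text/xml; charset=utf-8');
echo $result;
// no closing ?> tag, so no stray CR+LF gets appended to the response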

Related

REST (PHP, CURL) PUT/POST XML Issue: 400 Bad Request, "Invalid URL" Response with Walmart OAuth API (Postman to test)

There's a lot to unpack here. First of all, I've edited the title because, while my REST request will eventually be implemented in PHP code, right now I've stripped this down to Postman to test JUST the REST, keeping it as basic as possible. I can officially say the problem is with my request.
Basically, I'm making a POST request (and also testing with a PUT request) to Walmart's API using the "new" OAuth authentication. Sounds grand. GET works BEAUTIFULLY in Postman and in my actual PHP code. POST and PUT immediately return the exact same error no matter what I do or how I do it: 400 Bad Request, Invalid URL. In the case of my PUT test, which I was doing because it's a simpler and faster test with far less XML to comb through, here's the exact HTML response:
<HTML>
<HEAD>
<TITLE>Invalid URL</TITLE>
</HEAD>
<BODY>
<H1>Invalid URL</H1>
The requested URL "http://%5bNo%20Host%5d/v3/inventory?", is invalid.
<p>
Reference #9.c9384317.1556319123.8c89b8dc
</BODY>
</HTML>
I have left testing in PHP through my server and moved into Postman to try to locate the exact issue I'm having, and GET requests work beautifully. I am generating a new Token every 15 minutes or so. I have done... SO many minor changes, but the way the Feed examples and requests work, for all that I can tell I'm doing everything right. I honestly think I'm losing my marbles at this point.
What is most frustrating to me is that GET works. My TOKEN is working. My OAuth is working just fine. A lot of the headers that GET uses for the Walmart API are the exact same between PUT/POST/GET. The difference here is ONLY that the link has query parameters AND XML being shoved into the body. Edit: What I mean is that my headers do not change between the GET and the POST; the only thing that changes in what I am supplying is that XML is being sent in the body, and that query params are required. This is the only thing that changes between a successful GET and an unsuccessful 400 bad request PUT/POST. This leads me to believe something is wrong with how I'm processing the query params or my XML, but considering in the below example I've copy/pasted the XML... I'm not sure. It is an existing item in our catalog, I know for a fact.
Something I have noticed, though I'm not quite knowledgeable enough to know whether it's an issue with Postman or not, is that Walmart's API requests that the content-type be multipart/form-data. I've noticed it uses the term "example" when stating this; however, it usually says "this or this" if it'll accept something else. If I switch the content-type in Postman to multipart/form-data, however, the Body automatically becomes raw: text instead of raw: XML (application/xml) or text/xml. If I try to swap the raw type back to those, it flips my content-type automatically to application/xml, so that's a little... hinky.
I am not going through a proxy. I've turned off Global Proxy Configuration and Use System Proxy. Request timeout is set to 0. There's nothing under Client Certificates. I mean, GET works, and my token is successfully generated via outside PHP code (not in Postman; I couldn't get that to work, said heck it).
HEADERS
PUT URL: https://marketplace.walmartapis.com/v3/inventory?sku=0xyz0
AUTHORIZATION
Bearer Token: Bearer Basic --insert token here--
WM_SVC.NAME: Walmart Marketplace
WM_QOS.CORRELATION_ID: randomString123
WM_SEC.ACCESS_TOKEN: --insert token here--
Accept: application/xml
Host: https://marketplace.walmartapis.com
Content-type: multipart/form-data
BODY
raw: XML(application/xml)
<?xml version="1.0" encoding="UTF-8"?>
<inventory xmlns="http://walmart.com/">
<sku>0xyz0</sku>
<quantity>
<unit>EACH</unit>
<amount>7</amount>
</quantity>
<fulfillmentLagTime>1</fulfillmentLagTime>
</inventory>
Exact response
400 Bad Request
<HTML>
<HEAD>
<TITLE>Invalid URL</TITLE>
</HEAD>
<BODY>
<H1>Invalid URL</H1>
The requested URL "http://%5bNo%20Host%5d/v3/inventory?", is invalid.
<p>
Reference #9.c9384317.1556320429.8ca752c4
</BODY>
</HTML>
Please send help, I think I've been staring at this so long I'm going to leave this physical world behind. Walmart relatively recently updated their authentication to OAuth and they've made vague passes at saying their old authentication will be deprecated and phased out, so I obviously want to try to get this to work. I tried to copy-paste everything as best as possible. That XML is copy-pasted almost letter for letter from their example, with my own product switched in.
Also, the reference number down there always changes every time I run this, so it's not something I can actually look up. I've only supplied the Postman side of things because frankly if I can get that to work, my PHP will be fine, I've already knocked out some minor issues with the successful GET request.
If it's a semi-colon issue, I'll scream.
API Documentation: https://developer.walmart.com/#/apicenter/marketPlace/latest#updateInventoryForAnItem
Well, I've figured it out.
You'll notice I'm required to supply a "Host" with my headers. That host replaces the URL I'm trying to connect to via POST/PUT/GET, so if my Host is https://marketplace.walmartapis.com, then my request URL becomes https://https://marketplace.walmartapis.com.
Once I took the https:// out of the Host, the entire thing granted me a 200 response. The times I got a correct GET response, I had actually copy-pasted the correct Host without the https:// by pure chance, so I completely missed this between my two separate test cases.
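For reference, here's a sketch of the working request translated into PHP cURL; the token, correlation ID and XML file are placeholders, the Content-Type shown is an assumption for a raw XML body, and the only change from the failing setup is the Host header without a scheme.
$inventoryXml = file_get_contents('inventory-update.xml'); // the XML payload shown above, saved locally (placeholder)
$headers = array(
    'WM_SVC.NAME: Walmart Marketplace',
    'WM_QOS.CORRELATION_ID: randomString123',
    'WM_SEC.ACCESS_TOKEN: --insert token here--',
    'Accept: application/xml',
    'Content-Type: application/xml',     // assumed for a raw XML body
    'Host: marketplace.walmartapis.com', // scheme removed: this was the fix
);
$ch = curl_init('https://marketplace.walmartapis.com/v3/inventory?sku=0xyz0');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POSTFIELDS, $inventoryXml);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);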

How to do a HTTP (POST) Request in PHP without requiring any configuration

I am creating a PHP package that I want anyone to be able to use.
I've not done any PHP dev in a few years and I'm unfamiliar with pear and pecl.
The first part of my question is related to Pecl and Pear:
It seems to me that Pear and pecl are updating my computer, rather than doing anything to my code base, which leads me to the assumption that anything I do with them will also need to be duplicated by anyone wanting to use my package. Is that correct?
The 2nd part of my question is specific, I just want to do a simple HTTP (POST) request, and ideally I'd like to do it without any config required by those who use my package.
These are options I'm considering :
HTTPRequest seems like the perfect option, but it says "Fatal error: Uncaught Error: Class 'HttpRequest' not found" when I try and use it out of the box, and when I follow these instructions for installing it I get, "autoheader: error: AC_CONFIG_HEADERS not found in configure.in
ERROR: `phpize' failed" -- I don't want to debug something crazy like that in order to do a simple HTTP request, nor do I want someone using my package to have to struggle through something like that.
I've used HTTP_Request2 via a pear install and it works for me, but there is nothing added to my codebase at all, so presumably this will break for someone trying to use my package unless they follow the same install steps?
I know that I can use CURL but the syntax for that seems way over the top for such a simple action (I want my code to be really easy to read)
I guess I can use file_get_contents() .. is that the best option?
and perhaps I'll phrase the 2nd part of my question as :
Is there an approach that is considered best practice for (1) doing a HTTP request in PHP, and (2) for creating a package that is able to be easily used by anyone?
This really depends on what you need your request for. While it can be daunting when first learning it, I prefer to use cURL requests most of the time unless all I need to do is query a page with no headers; in my opinion it becomes pretty readable once you get used to the syntax and the various options. When all I need to do is query a page with no headers, I will usually use file_get_contents, as this is a lot nicer looking and simpler, and I think most PHP developers would agree on that. I recommend cURL because, when you do need to set headers, the requests stay well organized, and it's far more common than messing with file_get_contents for anything non-trivial.
EDIT
When learning how to do cURL in PHP, the list of options on the documentation page is your friend! http://php.net/manual/en/function.curl-setopt.php
Here's an example of a simple POST request using PHP cURL that will return the response text:
$data = array("arg1" => "val1", "arg2" => true); // POST data included in your query
$ch = curl_init("http://example.com"); // Set url to query
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST"); // Send via POST
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data)); // Set POST data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // Return response text
curl_setopt($ch, CURLOPT_HEADER, "Content-Type: application/x-www-form-urlencoded"); // send POST data as form data
$response = curl_exec($ch);
curl_close($ch);
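And for the no-extension route mentioned above, the same POST can be done with file_get_contents and a stream context (same placeholder URL and data, no cURL required):
$data = array("arg1" => "val1", "arg2" => true);
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query($data),
    ),
));
$response = file_get_contents("http://example.com", false, $context); // response text, or false on failure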

PHP cURL doesn't work with - in subdomains

I have a PHP script with which I'm trying to get the contents of a page. The code I'm using is below:
$url = "http://test.tumblr.com";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$txt = curl_exec($ch);
curl_close($ch);
echo "$txt";
It works fine for me as it is now. The problem I'm having is, if I change the string URL to
$url = "http://-test.tumblr.com"; or $url = "http://test-.tumblr.com";
It will not work. I understand that -test.example.com and test-.example.com are not valid hostnames, but with Tumblr they do exist. Is there a workaround for this?
I even tried creating a header redirect in another PHP file, so cURL would first be requesting a valid hostname, but it works the same way.
Thank you
Domain Names with hyphens
As you can see in a previous question about the allowed characters in a subdomain, - is not a valid character to start or end a subdomain with. So this is actually correct behavior.
The same problem was reported on the curl mailing list some time ago, but since curl follows the standard, there is actually nothing to change on their side.
Most likely Tumblr knows about this and therefore offers some alternative address leading to the same site.
Possible workaround
However, you could try using nslookup to manually look up the IP and then send your request directly to that IP (while manually setting the Host header to the correct value). I didn't try this out, but it seems as if nslookup is capable of resolving malformed domain names that start or end with a hyphen.
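Sketched out (and equally untested), that would look something like this; the IP is a placeholder for whatever nslookup returns:
$ip   = '203.0.113.10';    // placeholder -- whatever nslookup returns for -test.tumblr.com
$host = '-test.tumblr.com';
$ch = curl_init("http://$ip/");
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Host: $host")); // cURL never has to parse the malformed name
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$txt = curl_exec($ch);
curl_close($ch);
echo $txt;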
curl
Additionally, you should know that PHP's curl functions are a direct interface to libcurl, the same library that powers the curl command line tool. So if you encounter special behavior, it is most likely due to libcurl itself and not the PHP binding.

Slow HTTP POST request in php

I'm trying to POST some data (a JSON string) from a PHP script to a Java server (both written by myself) and get the response back.
I tried the following code:
$url="http://localhost:8000/hashmap";
$opts = array('http' => array('method' => 'POST', 'content' => $JSONDATA,'header'=>"Content-Type: application/x-www-form-urlencoded"));
$st = stream_context_create($opts);
echo file_get_contents($url, false,$st);
Now, this code actually works (I get back the right answer), but file_get_contents hangs for 20 seconds every time it executes (I printed the time before and after the call). The operations performed by the server take only a small amount of time, and I'm sure it's not normal to wait this long for the response.
Am I missing something?
Maybe a badly misconfigured server that doesn't send the right Content-Length while using HTTP/1.1.
Either fix the server or request the data as HTTP/1.0
Try adding Connection: close and Content-Length: strlen($JSONDATA) headers to the $opts.
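With those headers added (and optionally dropping to HTTP/1.0), the context from the question would look roughly like this:
$opts = array('http' => array(
    'method'           => 'POST',
    'content'          => $JSONDATA,
    'protocol_version' => 1.0, // or keep 1.1 once the server is fixed
    'header'           => "Content-Type: application/x-www-form-urlencoded\r\n" .
                          "Connection: close\r\n" .
                          "Content-Length: " . strlen($JSONDATA) . "\r\n",
));
$st = stream_context_create($opts);
echo file_get_contents($url, false, $st);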
Also, if you want to avoid using extensions, have a look at this class I wrote some time ago to perform HTTP requests using PHP core only. It works on PHP4 (which is why I wrote it) and PHP5, and the only extension it ever requires is OpenSSL, and you only need that if you want to do an HTTPS request. Documented(ish) in comments at the top.
Supports all sorts of stuff - GET, POST, PUT and more, including file uploads, cookies, automatic redirect handling. I have used it quite a lot on a platform I work with regularly that is stuck with PHP/4.3.10 and it works beautifully... Even if I do say so myself...

curl sending GET instead of POST

Actually, it's gotten so messy that I'm not even sure curl is the culprit. So, here's the php:
$creds = array(
'pw' => "xxxx",
'login' => "user"
);
$login_url = "https://www.example.net/login-form"; //action value in real form.
$loginpage = curl_init();
curl_setopt($loginpage, CURLOPT_HEADER, 1);
curl_setopt($loginpage, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($loginpage, CURLOPT_URL, $login_url);
curl_setopt($loginpage, CURLOPT_POST, 1);
curl_setopt($loginpage, CURLOPT_POSTFIELDS, $creds);
$response = curl_exec($loginpage);
echo $response;
I get the headers (which match the headers of a normal, successful request), followed by the login page (I'm guessing curl captured this due to a redirect) which has an error to the effect of "Bad contact type".
I thought the problem was that the request had the host set to the requesting server, not the remote server, but then I noticed (in Firebug), that the request is sent as GET, not POST.
If I copy the login site's form, strip it down to just the form elements with values, and put the full URL for the action, it works just great. So I would think this isn't a security issue where the login request has to originate on the same server, etc. (I even get rid of the empty hidden values and all of the JS which set some of the other cookies).
Then again, I get confused pretty quickly.
Any ideas why it's showing up as GET, or why it's not working, for that matter?
When troubleshooting the entire class of PHP-cURL-related problems, you simply have to turn on CURLOPT_VERBOSE and give CURLOPT_STDERR a file handle.
tail -f your file, compare the headers and response to the ones you see in Firebug, and the problem should become clear.
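For example (the log path is arbitrary):
$stderr = fopen('/tmp/curl_debug.log', 'w');  // any writable path
curl_setopt($loginpage, CURLOPT_VERBOSE, true);
curl_setopt($loginpage, CURLOPT_STDERR, $stderr);
$response = curl_exec($loginpage);            // run the request as before
fclose($stderr);                              // flush the log so tail -f sees everything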
The request is made from the server, and will not show up in Firebug. (You probably confused it with another request by your browser). Use wireshark to find out what really happens. You are not setting CURLOPT_FOLLOWLOCATION; redirects should not be followed.
Summarizing: Guess less, post more. Link to a pcap dump, and we will be able to tell exactly what you're doing wrong; or post the exact output of the php script, and we might.
The shown code does a multipart formpost (since you pass a hash array to the POSTFIELDS option), which probably is not what the target server expects.
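If the login form expects an ordinary urlencoded POST (as most do), pass a string instead of the array, for example:
// string body -> application/x-www-form-urlencoded instead of multipart/form-data
curl_setopt($loginpage, CURLOPT_POSTFIELDS, http_build_query($creds));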
Try throwing in a print_r(curl_getinfo($loginpage)) at the end and see what header and transfer data come back.
Also, if you're trying to fake that you're logging in from their site, you're going to want to make sure you're sending the correct referrer with your POST, so that they "think" you were on the website when you sent it.
