IIS with PHP: How to get the data from a SOAP envelope?

An external application is calling my PHP script, passing URL parameters and a SOAP envelope. I can access the URL parameters in the $_GET array. However, I cannot access the SOAP envelope. I was expecting it in the $_POST array, but that is empty.
The IIS log clearly shows that the SOAP envelope is being received by IIS, but somehow I cannot access it in PHP - or I am using the wrong methods/variables.
When I check the "failed requests log" in IIS, I see the SOAP data as "GENERAL_REQUEST_ENTITY".
However, I am at a loss when trying to access said data from within my PHP script. Is this kind of content stored in a different variable (other than $_GET, $_POST or $_REQUEST)? Or is IIS not forwarding it to my script, and if so, why not?
Edit: I am not tied to PHP. If PHP is not capable of doing this, I am open to any language that gets the job done; however, my knowledge of ASP.NET is more limited than my knowledge of PHP. Is there a better language for capturing the missing data from the request?
This problem is not going away and I need to find a solution. So far, no matter how much I search via Google, I keep running into dead ends.
Edit #2: Here is a screenshot of the log file for clarification. It clearly shows the XML and so does Fiddler, which I have also just now tested:
How can I capture this XML with PHP (or ASP.NET for that matter)?
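Edit #3: One thing I still need to verify: apparently PHP only populates $_POST for form-encoded content types, so a SOAP envelope posted as text/xml would instead have to be read from the raw request body via the php://input stream. A minimal sketch of that approach (the log path is just an example):

    <?php
    // Read the raw request body - a SOAP envelope posted with
    // Content-Type: text/xml lands here rather than in $_POST.
    $envelope = file_get_contents('php://input');

    // For debugging, append whatever was received to a log file.
    file_put_contents(__DIR__ . '/soap-debug.log', $envelope, FILE_APPEND);

    // Once captured, parse it as XML (element names depend on the sender).
    $xml = simplexml_load_string($envelope);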
Best regards
Max

Related

Connecting to SimPro API via Web

This is very similar to the question posed at Use php and simPro API to list customers.
The response to that question suggests downloading examples from GitHub. I have downloaded the SimPro examples from GitHub and have them functioning on the command line.
I want to be able to use a web page as an intermediary between my FileMaker Pro database and the SimPro API. I can pass data to a web page written in PHP, which can convert the data to JSON, form a call to SimPro, receive a response, and display the success or failure as a web page.
Presently I have my JSON data hardcoded so that I can test the process. When run from a web browser I don't get a result, though the code works perfectly well from the command line.
I'm not sure what I need to do to make the examples web compatible. Can someone push me in the right direction?
Would be more than happy to help - could you post a snippet of your code, minus the auth info of course, so that I can try to assist?
Do you have your code printing anything else to the browser first, so that you can make sure the script is actually executing from the web server side? And have you checked your web server logs for any errors?
In cases I've seen, this is usually due to PHP configuration (there are separate configs for CLI and CGI in most setups), which can mean that while libraries are loaded in one environment they may not be available in the other. The web server also usually requires a reload if you have just installed libraries for use within your script.
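A quick way to compare the two environments is a tiny diagnostic script, run once from the command line and once through the web server (a minimal sketch; check whichever extension your SimPro code actually needs - cURL here is just an example):

    <?php
    // diagnostic.php - run via CLI (php diagnostic.php) and via the browser,
    // then compare the output; differing ini files explain a lot.
    error_reporting(E_ALL);
    ini_set('display_errors', '1');

    echo 'Loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;
    echo 'cURL loaded:    ' . (extension_loaded('curl') ? 'yes' : 'no') . PHP_EOL;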
Hope that helps.

PHP fopen not working on one particular domain

I'm trying to download a file from a remote URL with the fopen function.
The problem is that the function returns false for the one website I need; for other domains the function works fine.
How can that be? Is there some option in PHP I should set? Or can the website protect the file from certain kinds of access (the file is available from a browser)?
There are a number of checks the server side can do to prevent misuse of its service. One example is a check of the HTTP Referer header, which indicates that your request comes from a browser navigating from a link to the object.
You can simulate all of that if you want to, but first you have to find out exactly what the difference is between your request and one the browser successfully makes. Two things to do for that:
1. Find out the exact error message you receive back. The easiest way is to use PHP's cURL extension instead of fopen() for your request, since it lets you dump everything you get back. There might be valuable information, like a reason, in the reply (see the sketch after this list).
2. Monitor both requests with a network sniffer, for example tcpdump or Wireshark. Comparing the two requests tells you the exact difference, which is the information you need to precisely rebuild the browser's request in your script.
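A minimal sketch of the cURL approach, with the Referer and User-Agent headers set to mimic a browser (the URL and header values are placeholders; match them to what the sniffer shows the browser sending):

    <?php
    $ch = curl_init('https://example.com/file.zip');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects like a browser would
        CURLOPT_HEADER         => true,  // include response headers for inspection
        CURLOPT_REFERER        => 'https://example.com/download-page',
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (compatible; MyFetcher/1.0)',
    ]);
    $response = curl_exec($ch);
    if ($response === false) {
        echo 'cURL error: ' . curl_error($ch) . PHP_EOL;
    } else {
        echo 'HTTP status: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . PHP_EOL;
        echo $response;                  // headers + body, for comparison
    }
    curl_close($ch);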
On some shared hosting or VPS setups, fopen() does not work for remote URLs or is disabled in PHP (often via the allow_url_fopen setting). Try using cURL to get the content. If that does not work, a last-resort solution (only if you need to send some information via GET, not receive data) is to use an <img> tag and put the request in its src attribute. That works ONLY for sending information; if you need to receive something, you need either AJAX or cURL.

PHP redirect file post stream

I'm writing an API in PHP to wrap a website's functionality, returning everything as JSON/XML. I've been using cURL, and so far it's working great.
The website has a standard file-upload POST that accepts file(s) up to 1 GB.
So the problem is how to redirect the file-upload stream to the corresponding website.
I could download the file and upload it afterwards, but I'm limited by my server to just 20 MB, and it seems a poor solution anyway.
Is it even possible to control the stream and redirect it directly to the website?
I've preserved my original answer at the bottom for posterity but, as it turns out, there is a way to do this.
What you need is a combination of the HTTP PUT method (which unfortunately isn't available in native browser forms), PHP's php://input wrapper, and a streaming PHP socket. This gets around several limitations - PHP disallows php://input for multipart POST data, but it does nothing of the kind for PUT file data - clever!
If you're going to attempt this with Apache, you're going to need mod_actions installed and activated. You're also going to need to specify a PUT script with the Script directive in your virtual host or .htaccess.
http://blog.haraldkraft.de/2009/08/invalid-command-script-in-apache-configuration/
This allows PUT methods for only one URL endpoint. That is the file that will open the socket and forward its data elsewhere. In my example case below, this is just index.php.
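For reference, the relevant piece of Apache configuration is a single directive (the handler path is an example; mod_actions must be enabled for the Script directive to be recognised):

    # Inside the virtual host: route the body of any PUT request to index.php
    Script PUT /index.php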
I've prepared a boilerplate example of this using the Python requests module as the client sending the PUT request with an image file. If you run remote_server.py, it will open a service that just listens on a port and awaits the forwarded message from PHP. put.py sends the actual PUT request to PHP. You're going to need to set the hosts in put.py and index.php to the ones you define in your virtual host, depending on your setup.
Running put.py will open the included image file and send it to your PHP virtual host, which will in turn open a socket and stream the received data to the Python pseudo-service, printing it to stdout in the terminal. A streaming PHP forwarder!
There's nothing stopping you from using any remote service that listens on a TCP port in the same way, in another language entirely. The client could be rewritten the same way, so long as it can send a PUT request.
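In outline, the PHP side might look something like the following (a condensed sketch of the idea, not the repository's exact code; the host, port, and chunk size are placeholders):

    <?php
    // index.php - PUT endpoint that streams the incoming request body to a
    // remote TCP service without buffering the whole file in memory.
    if ($_SERVER['REQUEST_METHOD'] !== 'PUT') {
        http_response_code(405);
        exit('Only PUT is supported');
    }

    $in  = fopen('php://input', 'rb');  // raw request body, readable for PUT
    $out = fsockopen('remote.example.com', 9999, $errno, $errstr, 30);
    if ($out === false) {
        http_response_code(502);
        exit("Upstream connection failed: $errstr");
    }

    while (!feof($in)) {
        $chunk = fread($in, 8192);      // forward in 8 KB chunks
        if ($chunk === false || $chunk === '') {
            break;
        }
        fwrite($out, $chunk);
    }

    fclose($in);
    fclose($out);
    echo 'Stream forwarded';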
The complete example is here:
https://github.com/DeaconDesperado/php-streamer
I actually had a lot of fun with this problem. Please let me know how it works and we can patch it together.
Begin original answer
There is no native way in PHP to pass a file along asynchronously as it comes in with the request body, without saving its state to disk in some manner. This means you are hard-bound by the memory limit on your server (20 MB). The manner in which the $_FILES superglobal is initialized after the request is received depends on this, as PHP will attempt to migrate that multipart data to a tmp directory.
Something similar can be achieved with sockets, as this will circumvent the HTTP protocol at least, but if the file is passed in the HTTP request, PHP is still going to attempt to save it statefully in memory before it does anything with it. You'd have the tail end of the process set up with no practical way of getting that far.
PHP's stream library comes close, but it still relies on reading the file out of memory on the server side - the file has to already be there.
What you are describing is a little outside the HTTP protocol, especially since the request body is so large. HTTP is a request/response mechanism, and one depends upon the other... it's very difficult to accomplish an in-place, streaming upload at an intermediary point, since this would imply some protocol that uploads while the bits are streamed in.
One might argue this is more a limitation of HTTP than of PHP, and since PHP is designed expressly with the HTTP protocol in mind, you are moving outside its comfort zone.
Deployments like this are regularly attempted with high success using other scripting languages (Twisted in Python, for example; a lot of people are getting on board with Node.js for its concurrent design pattern; there are alternatives in Ruby and Java that I know much less about).

PHP client communicate with C++ Server over TCP

I'm trying to get a PHP page to communicate with a C++ server. It must be able to receive and send at all times (and at the same time). When the PHP page receives a string, it shall make some changes in a MySQL database, but at the same time send other strings to the server.
You can think of it as a chat, except that when the PHP side receives a string, it shall be formatted.
The C++ server is up and running; only the PHP client side is a mystery to me.
Is there any example code that anyone can help me with?
Thanks!
Well, first of all you need to understand and remember that PHP in itself is stateless. You can have PHP receive a request, parse it, and do something with the gained information, but after that the request closes.
One solution you could try is having your C++ app HTTP POST to a PHP script. This script can parse and analyze the request and take action upon it:
- Store something in the database
- Create a response
- Output the response
The C++ app can then do something with the response.
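A minimal sketch of that flow on the PHP side (the DSN, credentials, table name, and the strtoupper() stand-in for your real formatting are all assumptions for illustration):

    <?php
    // receive.php - the C++ app POSTs a raw string here; PHP stores it in
    // MySQL and writes a formatted response back in the HTTP reply.
    $message = file_get_contents('php://input');

    $pdo  = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO messages (body) VALUES (?)');
    $stmt->execute([$message]);

    // Create and output the response for the C++ app to read.
    header('Content-Type: text/plain');
    echo strtoupper(trim($message));    // stand-in for your real formatting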
This will work, but if your actual use case is a chat room or something of the like, there are better solutions than PHP.

Best Method/App for batch posting xml data to a php script

I've got a website that receives POSTed XML data from a third party.
I'm looking for a method so I can batch post a number of XML files to this script for development/debugging purposes.
I have built a PHP script that loops through an array of files and uses cURL to post each file separately. However, due to the number of files I wish to post, I feel PHP isn't the best method, as the script times out.
Ideally, I'm looking for a terminal process / OS X application that will pick up all files in a given directory and post the contents of each one to a defined URL, one by one.
Any suggestions/ideas gratefully received.
Jim.
I'm a little confused. Why not just set the CURLOPT_TIMEOUT option appropriately in your posting application? Otherwise your PHP solution seems to be functioning OK?
If you don't want a PHP solution, have you looked at posting via HTTP in parallel, using a scripting language with threading support (e.g. Ruby or similar)? The only headache is that you're now going to load your server more for the benefit of your script running faster, and you need to determine what sort of load your server-side process can handle.
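For what it's worth, the loop can also stay in PHP if run from the terminal, where CLI PHP has no execution time limit by default - a minimal sketch (the endpoint URL is a placeholder):

    <?php
    // post_batch.php - run from the terminal: php post_batch.php /path/to/xml
    // Posts every .xml file in the given directory to a fixed URL, one by one.
    $dir = isset($argv[1]) ? $argv[1] : '.';
    $url = 'https://example.com/receive.php';

    foreach (glob($dir . '/*.xml') as $file) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => file_get_contents($file),
            CURLOPT_HTTPHEADER     => ['Content-Type: text/xml'],
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 120,  // per-request cap, as suggested above
        ]);
        curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        echo basename($file) . ' -> HTTP ' . $code . PHP_EOL;
    }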
