I'm trying to get around an access control origin error for a web service by building a local proxy, but I'm not sure how to do it. The web service looks like the attached file, and is accessed directly by using the following URL:
https://url.com/SparkService.asmx?op=InsertConsumer
How would I write something locally that carried out this URL's functionality?
I built a PHP file that will pull the web service URLs contents, but it doesn't seem to carry out the functionality of that web service:
<?php
$op = htmlspecialchars($_GET["op"]);
$proxyURL = 'https://url.com/SparkService.asmx?op=' . $op;
die( file_get_contents($proxyURL) );
?>
The image shows you have to use a POST, which you can't do with bare-bones file_get_contents - it defaults to a GET request. You'll have to use cURL, or set up a stream context to configure and perform the POST.
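For example, a cURL-based version of the proxy might look something like the sketch below. This is only a sketch: it assumes the client POSTs the request body (e.g. a SOAP envelope) to the proxy script and that the endpoint expects text/xml; adjust the Content-Type to whatever the service actually requires.
<?php
// Minimal proxy sketch: forward the client's raw POST body to the .asmx endpoint.
// The Content-Type header and the assumption that the client sends a SOAP
// envelope in the request body are guesses based on the question.
$op = preg_replace('/[^A-Za-z0-9_]/', '', isset($_GET['op']) ? $_GET['op'] : '');
$proxyURL = 'https://url.com/SparkService.asmx?op=' . $op;

$body = file_get_contents('php://input');   // raw POST body from the client

$ch = curl_init($proxyURL);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml; charset=utf-8'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
curl_close($ch);

header('Content-Type: text/xml; charset=utf-8');
echo $response;
?>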
I am not sure about your intention in doing this.
You can use a WSO2 ESB proxy[1] to send messages to the real service through a local endpoint.
Or else you can create your own service by using WSO2 AS[2] and deploying a sample POJO as a web service.
[1] http://wso2.org/project/esb/java/4.0.0/docs/samples/proxy_samples.html
[2] http://wso2.org/project/app-server/
Related
I'm trying to get my blog's RSS feed and manipulate it in PHP. According to the documentation, the XML feed for any WordPress blog can be downloaded from this address:
http://www.example.com/feed/atom/
I've written some simple code that works fine on a test server, but won't work on my hosted server:
$feedUrl = 'http://www.example.com/blog/feed/atom/';
$rawFeed = file_get_contents($feedUrl);
$feedXML = new SimpleXmlElement($rawFeed);
The reason for this is because my hosting provider prevents scripts making HTTP (port 80) connections back to the same server that they're running on.
How can I get access to the feed without needing to do a HTTP request to the same server?
I have tried accessing the path directly (i.e. /home/example.com/blog/feed/atom), but nothing is found because a proper HTTP request is needed to generate the XML feed. I've also tried a cURL request, but I got the same result.
It's a tricky problem! Thanks for any help!
Note: My solution needs to run on a non-WP page.
Some hosting providers might let you set up CRON jobs through their admin console, without having access to the command line. In a situation like that, you may be able to use a WP-CLI command to retrieve the output of the feeds, and save it to a file using something like "> filename.txt" at the end of the command.
See here: http://wp-cli.org/
And possibly here: http://wp-cli.org/commands/eval-file/
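If that works, the page that needs the feed can read the cached copy from disk instead of making an HTTP request back to the same server. A rough sketch, where the cache path and the cron command are hypothetical:
<?php
// Assumed cron job (hypothetical): a WP-CLI command whose output is redirected
// to a file, e.g.  wp ... > /home/example.com/cache/feed.xml
// The non-WP page then parses the cached file directly, no HTTP request needed.
$rawFeed = file_get_contents('/home/example.com/cache/feed.xml');
$feedXML = new SimpleXMLElement($rawFeed);

foreach ($feedXML->entry as $entry) {
    echo $entry->title . "\n";   // Atom entries share the feed's default namespace
}
?>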
I've seen blocks of code like this
use Win32::OLE;

# Start a local Notes COM session and open the mail database.
my $Notes = Win32::OLE->new('Notes.NotesSession')
    or die "Cannot start Lotus Notes Session object.\n";
my $database = $Notes->GetDatabase("", 'mail\VIMM.nsf');
but my script is running on a virtual Webfusion Apache service, so how do I establish a connection to a database on my Domino server? I have control of its ACL, and since it's a website I can pass in a username and password. The script and the 'POST' data are sent by a third-party gateway with the results of the transaction (success/fail plus name-value pairs etc.), which I need to capture. I can't do it directly on the Domino server because, although Domino supports Perl scripts, they will only work if Perl is also installed on the server, which isn't an option.
Lotus Domino is also a web application server, so you can communicate with the server using HTTP (GET and POST) from your Perl script on the Apache server.
This might require changes to the Domino application in question in order for it to serve the content you expect.
Also, you can provide a WebService on your Domino Server.
For OLE/COM to work, Perl and the script have to be located on a server where Notes and/or Domino are installed. Otherwise, the OLE/COM classes are not installed and not available.
As Per and Klaus mentioned, if you can't put Notes/Domino on the machine with Perl on it, you have to switch to some sort of web-based communication.
If you are not limited to COM/OLE, you could use the IBM Lotus Domino Data Service, which is new in Domino Designer 8.5.3 Upgrade Pack 1:
The IBM® Lotus® Domino® Data Service is a REST API that accesses
databases on Domino servers. It is part of Domino Access Services.
The Domino Data Service receives requests and sends responses using
HTTP and HTTPS protocols with body content in JSON format.
The Domino Data Service allows you to obtain information on databases,
views, folders, and documents. You can update, add, and delete
documents.
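If the Data Service is enabled on your Domino server, the script on the Apache box can talk to it over plain HTTPS with JSON. A rough sketch with cURL in PHP, where the host, database path, view name and credentials are placeholders and the URL layout should be checked against the Domino Access Services documentation:
<?php
// Read view entries through the Domino Data Service (REST/JSON).
// Host, database path, view name and credentials below are hypothetical.
$url = 'https://domino.example.com/mail/VIMM.nsf/api/data/collections/name/All';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, 'username:password');
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept: application/json'));

$json = curl_exec($ch);
curl_close($ch);

$entries = json_decode($json, true);   // array of view entries
?>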
The underlying problem is that a URL such as http://www.mysite.com/thankyou?orderno=123 won't work on a Lotus Domino website, because the ? is a special character (e.g. ?OpenAgent, ?OpenDatabase) to the Domino web engine. You also can't use http://www.mysite.com/(thankyou)?openagent?orderno=456 (I tried); in both cases all you get is a 404 page-not-found error and a Domino log error, "don't understand the url". The question originally asked for help with Perl to solve the problem, but I couldn't get Perl scripts to run on my Webfusion community server. Fortunately I quickly had success with this simple PHP script:
<?php
// Rebuild the incoming query string in a form Domino can handle:
// everything is appended after ?openagent as &key=value pairs.
$params = "";
$url    = "http://www.mywebsite.co.uk/";
$path   = "wpx/website.nsf/httpagent?openagent";

if ($_GET) {
    $kv = array();
    foreach ($_GET as $key => $value) {
        // Encode the values so they survive the redirect URL intact.
        $kv[] = rawurlencode($key) . "=" . rawurlencode($value);
    }
    $params = join("&", $kv);
}

// Redirect the browser to the Domino agent with the translated query string.
print "<script>window.location.href=\"" . $url . $path . "&" . $params . "\"</script>";
?>
The script is placed on my Webfusion server under a subdomain, which effectively translates the URL into a format that Domino can handle. The format ?openagent&orderno=456 is easily handled by either a Java or LotusScript agent; the parameter is extracted from the CGI Request_Content field.
The redirect means I don't, for now, need to manipulate data in the Domino database directly. It also means that, with the exception of the URL translation script, all the website code is in the Domino database.
I tried using curl to post to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to have the request traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried posting the data to $_SERVER['DOCUMENT_ROOT'] . '/dir/to/file.php'. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (an API response).
It's simply post data and an XML response. I could write an html form tag and input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
it is post data. it's for an API that is encrypted so i'm not sure exactly how it works
Without further details, nobody can tell you what you should do.
But if it is indeed a POST-receiving script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$data = curl($url)->post(1)->postdata(array("billing"=>1234345))
->returntransfer(1)->exec();
// (in reality you would use the more cumbersome curl_setopt() calls; see the sketch below)
So you get an XML or JSON or whatever response.
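Spelled out with the actual cURL functions, the pseudo-code above corresponds to roughly this (the path to api.php and the billing field are just the values from the example):
<?php
// The same POST using the real cURL API instead of the pseudo-fluent calls.
$url = "https://" . $_SERVER['SERVER_NAME'] . "/path/to/api.php";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array("billing" => 1234345));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$data = curl_exec($ch);   // the raw XML/JSON response body
curl_close($ch);
?>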
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Using cURL should only be done if you absolutely need the HTTP layer to be involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean the 'target' script has to be able to handle a file upload plus whatever other data you need to send, and then that script would end up having to do file operations anyway. In effect, you've gone on a round-the-world flight just to move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had two files in the same folder on a Linux box, in a folder that is not served by my webserver, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, as file:// won't work, so:
npm install http-server -g
Then run the HTTP server in the folder where the file is:
$ http-server
See: Using node.js as a simple web server.
Then test the curl request from the command-line to localhost like:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.
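The PHP version of that request is then just a normal cURL call against localhost (the port and file name are taken from the example above):
<?php
// Fetch the file through the local HTTP server rather than the file system.
$ch = curl_init('http://127.0.0.1:8081/file.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

echo $response;
?>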
I am running a socket server using PHP. The socket server runs fine because I can connect to it using PHP.
Now, I have a flash application that is trying to connect to it:
this.socket.addEventListener(Event.CONNECT, onSocketConnect);
this.socket.addEventListener(Event.CLOSE, onSocketClose);
this.socket.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
this.socket.addEventListener(SecurityErrorEvent.SECURITY_ERROR, onSecError);
try {
this.socket.connect("myip", 9999);
} catch (ioError:IOError) {
this.debugLbl.text += "ioError1 "+ioError.message;
} catch (secError:SecurityError) {
this.debugLbl.text += "secError1 "+secError.message;
}
When I run the application locally, it works! However, when I upload it to my server I get a sandbox security error (#2048). The Flash app is actually hosted on the same server as the socket server, and there is a cross-domain policy file in place.
Is it possible you need to use a PHP proxy? I had to do that; I documented it here. Although you did mention that the app's on the same server and there's a crossdomain.xml in place, so I'm probably off the mark there (by the way, Flash 10 needs a different crossdomain.xml than previous versions, as far as I know).
Are you actually loading the cross-domain policy file? As far as I know, Flash Player only tries to automatically load the following file: http://www.example.com/crossdomain.xml. If your file is in another place, you should load it explicitly:
Security.loadPolicyFile("http://www.example.com/subfolder/crossdomain.xml");
Also, even if the app is on the same server, Flash Player believes "http://www.example.com" to be different from "http://example.com", so you should make sure you cover this possibility in the cross domain policy file:
<allow-access-from domain="*.example.com"/>
You need to serve the crossdomain policy file over the socket itself, because when you work with sockets, Flash Player does not use the policy file at the web root of the app.
Here is a sample: http://www.blog.lessrain.com/as3-java-socket-connections-to-ports-below-1024/
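In a PHP socket server this usually means answering Flash Player's <policy-file-request/> message with the policy XML (NUL-terminated) before any other traffic. A rough sketch, assuming $client is the accepted client socket from your server loop and 9999 is the port from the question:
<?php
// Inside the server's accept/read loop: if the first thing the client sends is
// Flash Player's policy request, reply with the policy and close the connection.
$policy = '<?xml version="1.0"?>'
        . '<cross-domain-policy>'
        . '<allow-access-from domain="*" to-ports="9999"/>'
        . '</cross-domain-policy>' . "\0";

$request = socket_read($client, 1024);
if (strpos($request, '<policy-file-request/>') !== false) {
    socket_write($client, $policy, strlen($policy));
    socket_close($client);
}
?>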
I'm making a GWT project that uses PHP to connect to a DB2 database. When I compile the project and deploy it to the server (copy the contents of the WAR directory over), it works fine. Obviously, in hosted mode I run into the SOP issue, since GWT is on port 8888 while the PHP script is running on port 80.
I'm trying to get the -noserver option to work, but I must be missing something. I went back and created the basic sample app from the command line (webApplicationCreator -out /home/mike/gwt/sample1).
I edited the build.xml to include the -noserver and -port 80 arguments for devmode. I want my app to be hosted at localhost/sample1, so I edited the -startupUrl to the whole URL I want to use: http://localhost/sample1/sample1.html
I compiled (ant), copied over the sample1.html, sample1.css from war to the webserver sample1 directory, and the (md5).gwt.rpc, clear.cache.gif, sample1.nocache.js and hosted.html files from the war/sample1 to sample1/sample1 directory as described in the GWT documentation (no history.html file was created).
I then run ant devmode from the project directory (/home/mike/gwt/sample1)
I can get to the sample1.html page, but when I click the button to send the name to the server it returns with
Remote Procedure Call - Failure
Server replies:
An error occurred while attempting to contact the server. Please check your network connection and try again.
I turned on Firebug and it's returning a 404 for http://localhost/sample1/sample1/greet. This is where I'm stuck: this file obviously doesn't exist on my webserver, but why? Isn't this something that is supposed to be compiled by GWT?
Can anyone give me a hand? Thanks!
So, basically you've copied over the client-side of a client/server application. When your GWT client application attempts to make a Remote Procedure Call (RPC) to the server to a greeting service that is part of the initial sample, it can't find that service.
If you wanted to copy that service over, you'd need to have a Java application server, copy over the GreetingService, the web.xml that references it and possibly a few other things (I'd have to check in more detail). That doesn't sound like what you actually want, so either you'll want to build a GWT-RPC service in PHP that responds to that URL, or remove the reference in the GWT code to RPC call to the greeting service.
With a PHP back-end, you're probably not going to use GWT-RPC; I'm guessing you're more likely to use JSON or XML, and if that's the case, then I'd go with removing the RPC call altogether for now.
Does this all make sense? Feel free to ask for further clarification.
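If you go the JSON route, the PHP side can be very small. Something like the sketch below, where the script name (greet.php) and the response fields are made up for illustration; the GWT client would call it with a RequestBuilder instead of GWT-RPC:
<?php
// greet.php -- hypothetical JSON replacement for the GWT-RPC greeting service.
$name = isset($_POST['name']) ? $_POST['name'] : 'stranger';

header('Content-Type: application/json');
echo json_encode(array(
    'greeting' => 'Hello, ' . htmlspecialchars($name) . '!',
));
?>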
To solve the SOP issue, I used the HttpProxyServlet to proxy the HTTP requests to my webserver through the development server.
Download httpProxyPackage.jar, copy it into WEB-INF/lib/, and configure it like so in WEB-INF/web.xml (this is for the StockWatcher tutorial, assuming your web root is the folder that contains the StockWatcher directory):
<servlet>
<servlet-name>jsonStockData</servlet-name>
<servlet-class>com.jsos.httpproxy.HttpProxyServlet</servlet-class>
<init-param>
<param-name>host</param-name>
<param-value>http://localhost/StockWatcher/war/stockPrices.php</param-value>
</init-param>
</servlet>
<servlet-mapping>
<servlet-name>jsonStockData</servlet-name>
<!--
http://127.0.0.1:8888/stockPrices.php in dev mode
http://gwt/StockWatcher/war/stockPrices.php in prod mode
-->
<url-pattern>/stockPrices.php</url-pattern>
</servlet-mapping>
Then redefine your JSON URL as:
GWT.getHostPageBaseURL() + "stockPrices.php?q=";
instead of:
GWT.getModuleBaseURL() + "stockPrices.php?q=";
It’s maybe not the best way, but if it can get someone else started… There was another way using php-cgi, but I didn’t have it installed.