I'm trying to make something kind of fuzzy today :)
I'd like to create a PHP script / .htaccess combination that handles every request made to the server (I'll redirect every request to my PHP script with .htaccess).
The script should then pass those requests on to another server and return the content. My main goal is to make it act like a hosts file: I'd send the request along with all its parameters, making the other server think the request is coming from that very host.
I'm doing this because I can't modify the hosts file on my development server or on my local workstation. So I'd redirect every request meant for my development server to my local workstation (on which I've installed Apache 2).
The main questions are: how would you do this? Do you think the PHP/.htaccess combo is okay? And how would I identify myself as the original host to the other server?
I guess it's clear enough :) Any ideas?
Thanks!
As @Marc B says, you are looking to build a proxy. Due to its nature as an interpreted language, PHP is not the optimal solution for this - a specialized proxy program / server would be less resource intensive.
That said, if PHP is your only option, there is phpMyProxy, which might work for you. I haven't used it myself, but the feature list and the demo look pretty impressive.
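If you do go the PHP route, the usual shape is an .htaccess rule that funnels every request into one script (something like RewriteEngine On followed by RewriteRule ^ proxy.php [QSA,L]) plus a small forwarding script. A minimal sketch, assuming the workstation answers at 192.168.0.10 (a placeholder); it only handles GET requests and discards the response headers, so treat it as a starting point rather than a finished proxy:

```php
<?php
// proxy.php -- minimal forwarding sketch; 192.168.0.10 is a placeholder
// for the local workstation running Apache.
$target = 'http://192.168.0.10';

$ch = curl_init($target . $_SERVER['REQUEST_URI']);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    // Present the originally requested host name, so the other server
    // behaves as if it had been addressed directly (the hosts-file effect).
    CURLOPT_HTTPHEADER     => ['Host: ' . $_SERVER['HTTP_HOST']],
    CURLOPT_TIMEOUT        => 10,
]);

$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

http_response_code($status ?: 502); // 502 if the upstream never answered
echo $body;
```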
I wrote an HTML page using XAMPP on my Windows computer, and everything works great locally. However, I want to host the content on a website, and I have no idea how to do it. As it stands, the PHP code is displayed as text instead of actually running.
Does github.io support running PHP at all? I read that it doesn't. If so, where could I host my code so that my PHP and JavaScript run when I point my browser at the page's URL? Also, XAMPP had linked it to a MySQL database, and I'm unsure how to set that up on a server as well.
These seem like simple questions for the beginning web developer, but I scoured Google and couldn't find an answer. Thank you.
To deploy PHP on a server you have to check:
Is PHP available at all? (see the sketch after this list)
Take care of the PHP opening tags
Check the PHP version (code written for a newer version will run with errors on an older parser)
Configure your wwwroot, httpdocs, or htdocs (the webpage's public folder)
Take care of .htaccess if something is crashing
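For the first and third items, a throwaway test file shows at once whether the server parses PHP and which version it runs:

```php
<?php
// test.php -- upload it to the public folder and open it in a browser.
// If you see this source code as plain text, PHP is not enabled.
echo 'PHP is running, version ' . PHP_VERSION;
phpinfo(); // full configuration dump; delete this file when you're done
```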
And not every server is Apache, and not every server is Node, and so on. Just ask the hosting provider's support, because in some cases you have to "turn on" PHP with some fancy button.
Have a nice day!
No, GitHub Pages doesn't have PHP; it's only meant to host static pages. It does support Jekyll, a static site generator (in that it generates static pages once per push, which are then hosted as-is), but that's about all.
PHP/MySQL is only one of many possible sets of web application technologies, so you can't expect it to be everywhere web hosting is. It has to be either explicitly listed on the hosting service's website, or available for installation in case you get yourself a full-fledged server machine (maybe virtual) to run your website.
Browser-based JavaScript will still be run by the client, since the server is only responsible for delivering it, not running it, so it can be hosted on GitHub Pages. Third-party services that don't depend on your own server's code execution are usable too: things like commenting systems, search (you can even build a client-side one!), and analytics.
Is there a simple way to detect all outgoing calls from PHP?
In an open-source project I use a lot of third-party scripts. Using New Relic I was able to trace long execution times back to some of these scripts making calls to their own servers.
I don't mind this, but I want to know what data these scripts are sending, and above all I don't want a slow site when a third-party server is down or unreachable.
Is there an easy way to log all curl, file_get_contents, etc. requests?
Thanks!
You are looking for a packet sniffer. Typically you'd use tcpdump and/or Wireshark.
http://wiki.ubuntuusers.de/tcpdump
https://www.wireshark.org/
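If you would rather instrument the PHP side, you can funnel the calls you control through a thin wrapper that logs every outgoing request and enforces a timeout, so a dead third-party server can't stall the whole page. A minimal sketch (the function name is made up, and it only covers code you can edit; for scripts you can't touch, the sniffer approach above is the way to go):

```php
<?php
// Hypothetical wrapper: route third-party calls through this instead of
// calling curl directly, so every outgoing request is logged and timed.
function logged_http_get($url, $timeoutSeconds = 3)
{
    $start = microtime(true);

    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => $timeoutSeconds, // fail fast if the host is down
        CURLOPT_TIMEOUT        => $timeoutSeconds, // cap the total request time
    ]);
    $body  = curl_exec($ch);
    $error = curl_error($ch);
    curl_close($ch);

    $elapsedMs = round((microtime(true) - $start) * 1000);
    error_log(sprintf('[outgoing] %s took %dms %s',
        $url, $elapsedMs, $error !== '' ? 'FAILED: ' . $error : 'ok'));

    return $body === false ? null : $body;
}
```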
There are many solutions, but my preferred one is:
Build your own proxy, for example a dedicated Apache server (it can run on the same IP but a different port) that handles this type of operation. After that, change all of your old URLs to pass through the proxy.
Imagine you have this in your code: curl_init('www.google.com'); you would change it to: curl_init('http://localhost:8090/CALLS_OUTSIDE_PHP_CONTROLLER.php?url_to_redirect=www.google.com');
The PHP controller running on port 8090 can do many things, such as blacklisting/whitelisting certain URLs, doing regular URL checks in the background... lots of useful stuff.
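A stripped-down sketch of what that controller might look like (the whitelist entries and the timeout are illustrative; a real version would also forward query strings, POST bodies, and headers):

```php
<?php
// CALLS_OUTSIDE_PHP_CONTROLLER.php -- minimal sketch of the proxy
// controller described above; whitelist and timeout are example values.
$whitelist = ['www.google.com', 'api.example.com'];

$target = isset($_GET['url_to_redirect']) ? $_GET['url_to_redirect'] : '';
$bare   = preg_replace('#^https?://#', '', $target); // tolerate a scheme
$host   = parse_url('http://' . $bare, PHP_URL_HOST);

if (!in_array($host, $whitelist, true)) {
    http_response_code(403);
    exit('Blocked: ' . htmlspecialchars($target));
}

$ch = curl_init('http://' . $bare);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT        => 5, // don't let a slow remote stall callers
]);
echo curl_exec($ch);
curl_close($ch);
```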
We have developed a cURL function in our application. This function mainly maps data over from one site into the form fields of our application.
The function had been working fine and in active use for more than two months. Yesterday it broke: the data from that website can no longer be mapped over. We are trying to find out what the problem is; while troubleshooting, we found a response timeout issue.
To make sure there was nothing wrong with our code and that our server was performing properly, we duplicated this instance to another server and tried the function there. It worked perfectly.
Is anyone out there facing a similar problem?
What could possibly cause this issue?
When we use cURL, will the site owner know that we are fetching their data to map into our server application? If so, is there a way we can overcome this?
Could the owner have blocked our server's IP address? Would that explain why the function works well on the other server but not on the original one?
Appreciate your help on this.
Thank you,
Your problem description is far too generic to determine a specific cause. Most likely, however, there is a specific block in place.
For example, a firewall rule on the other end, or on your end, would cause all traffic to be dropped, thus causing the timeout. There could also be a regular network outage between both servers, but that's unlikely.
Yes, they will see it in their Apache (or IIS) logs regularly. No, you cannot hide from the server logs - they record all successful requests. You either get the data, or you stay stealthy. Not both.
Yes, the webserver logs will contain the IP making all the requests. Adding a DROP rule to the firewall is then a trivial task.
I have applied such a firewall rule to bandwidth and/or data leechers many times over the past few years, although I usually prefer the more resilient deny from 1.2.3.4 approach in the Apache vhost/.htaccess. Usually, if you use someone else's facilities, it's nice to ask for proper permission - it lessens the chance you get blocked this way.
I faced a similar problem some time ago:
my server's IP had been blocked by the website owner.
It can be seen in the server logs; Google Analytics, however, won't see it, as cURL doesn't execute JavaScript.
Try to ping the destination server from the one executing the cURL call.
Some suggestions (combined in the sketch after this list):
Send a browser-like User-Agent header to mask your request.
If you insist on using this server, route the requests through a proxy.
Put some sleep() between the requests.
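Put together, the three suggestions look roughly like this with cURL (the User-Agent string and the proxy address are placeholders):

```php
<?php
// Sketch combining the three suggestions above; values are placeholders.
$ch = curl_init('http://example.com/data');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    // 1. Masquerade as a regular browser instead of the default cURL agent.
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    // 2. Route the request through a proxy so your own IP isn't the one logged.
    CURLOPT_PROXY          => 'proxy.example.com:3128',
]);
$response = curl_exec($ch);
curl_close($ch);

sleep(2); // 3. Pause between consecutive requests to avoid looking like a leecher.
```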
I'm making a simple web-based control panel, and I figured the easiest way to accomplish it would be to have PHP on the two machines (one being the web-facing machine, the other being behind a VPN). Basically, when I press a button on the site at machine 1's external IP, it needs to send a request to machine 2's internal IP (e.g. 192.168.100.1) and run the PHP file there (test.php plus some $_GET data) without actually redirecting the end user to 192.168.100.1, because that would obviously time out since the user has no access to it.
If all you want is to make certain internal PHP pages accessible on the external server, you should consider setting up a reverse proxy instead of manually proxying requests with PHP.
See the Apache documentation for an example: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
Of course this won't work if you do your authentication on the external server and/or need to execute additional PHP code on the external server before/after the internal PHP code. In that case refer to Mihai's or Louis's answer.
You can use cURL to forward HTTP requests from machine 1 to machine 2, receive the responses machine 2 gives you, and (if needed) process those responses before showing them to the user.
You could also use (XML-/JSON-)RPC or SOAP, which would be a bit more elegant and extensible (more standardized than raw cURL calls), but it comes with a steeper learning curve and more setup time/work.
You should also be able to use file_get_contents (which normally supports the http:// wrapper) or http_get, a function designed for simple HTTP GET requests.
It might not be the ideal way, but it should be fairly easy to do.
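As a rough sketch of the file_get_contents variant (192.168.100.1 and test.php come from the question; the action parameter is a made-up example of the $_GET data being passed along):

```php
<?php
// forward.php on machine 1 -- relays a button press to machine 2
// without redirecting the user's browser there.
$action = isset($_GET['action']) ? $_GET['action'] : 'status'; // hypothetical parameter

// Build the internal URL; machine 2 is only reachable from machine 1.
$internalUrl = 'http://192.168.100.1/test.php?action=' . urlencode($action);

// A short timeout so a VPN hiccup doesn't hang the control panel.
$context = stream_context_create(['http' => ['timeout' => 5]]);
$result  = file_get_contents($internalUrl, false, $context);

echo $result === false ? 'Machine 2 did not respond' : $result;
```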
I already have a web server that I pay for, and I want to expose some services on it using Thrift and PHP.
My question is: can I run a Thrift server using normal PHP hosted on the default port (the same way web pages are hosted), instead of having a separate PHP application running on some funky obscure port? That way I wouldn't have to change the server configuration (which is something I'm not able to do even if I wanted to).
Thanks
EDIT: maybe I should clarify a bit more. Once I've defined my service using a .thrift file, is it possible to:
Run the thrift code generator
Take the generated code and put it on my webserver
Create an index.php which says (in pseudocode) "create a new instance of the service, and handle incoming requests"?
Okay, well I have figured out the answer on my own!
If you use a TPhpStream transport on the server side, you can serve requests that come in as regular HTTP requests.
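For reference, the resulting index.php looks roughly like this - MyService and the autoloader path are assumptions for illustration, but the transport/protocol wiring follows the standard Thrift PHP pattern:

```php
<?php
// index.php -- serves Thrift requests arriving as plain HTTP POSTs.
// "MyService" and the autoloader path are assumptions for illustration.
require_once __DIR__ . '/vendor/autoload.php'; // Thrift library + generated code

use Thrift\Transport\TPhpStream;
use Thrift\Transport\TBufferedTransport;
use Thrift\Protocol\TBinaryProtocol;

// Your implementation of the interface generated from the .thrift file.
$handler   = new MyServiceHandler();
$processor = new MyServiceProcessor($handler);

// TPhpStream reads the request body from php://input and writes the
// response to php://output, so an ordinary web request carries the RPC.
$transport = new TBufferedTransport(
    new TPhpStream(TPhpStream::MODE_R | TPhpStream::MODE_W)
);
$protocol = new TBinaryProtocol($transport, true, true);

header('Content-Type: application/x-thrift');
$transport->open();
$processor->process($protocol, $protocol);
$transport->close();
```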
Many thanks to Rob Wilkerson https://github.com/robwilkerson/Thrift-Client-Server-Example--PHP-.
I also blogged about how to implement a simple example with PHP and Python at http://willwarren.com/2012/01/24/creating-a-public-api-with-apache-thrift/