Cookies not sent back to domain in iframe - PHP

I have an application running on 192.168.1.100. This application serves a page that embeds an iframe, which is basically another application running on a totally different IP, e.g. 10.0.0.1.
The issue is that whenever the user logs in inside the iframe, a cookie is set, but it is not sent back to 10.0.0.1 on subsequent requests.
Is what I am trying to achieve even possible?
Any help in this regard is appreciated.
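For what it's worth, modern browsers default cookies to SameSite=Lax, so a cookie set inside a cross-site iframe generally won't be sent back unless it is marked SameSite=None and Secure (and some browsers block third-party cookies outright). A minimal sketch, assuming the framed app on 10.0.0.1 runs PHP 7.3+ and is served over HTTPS (which SameSite=None requires):

    <?php
    // In the app served inside the iframe (10.0.0.1): mark the session
    // cookie so browsers will send it in a third-party (iframe) context.
    session_set_cookie_params([
        'lifetime' => 0,
        'path'     => '/',
        'secure'   => true,    // required when 'samesite' is 'None'
        'httponly' => true,
        'samesite' => 'None',  // allow the cookie in cross-site iframes
    ]);
    session_start();

Without HTTPS the browser simply discards a SameSite=None cookie, so a plain-HTTP setup on raw IP addresses cannot take this route.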

Related

Load and authenticate into a website within an iframe using HTTP, from a website using HTTPS

I need to simulate, from within an iframe on our site (which uses HTTPS and is loaded only once, upon authentication on our site), the authentication into another site, which only uses HTTP.
How can I do that?
We first tried loading into the iframe a page of our site from which the login form for the remote authentication is automatically submitted with JavaScript. This cannot be achieved because the HTTP request from the form is blocked by the browser for security reasons (mixed content). I should clarify that if our site also uses HTTP, the authentication works without problems.
I'm not sure whether file_get_contents() will do the trick, because what we need to display is not a simple static page. We need to keep any data from the remote login (cookies, etc.) in the browser, so that we can access other parts of the remote site (once signed in) from other places on our site. As far as I know, file_get_contents() doesn't give you access to the response headers.
Another alternative I've considered is cURL, using CURLOPT_RETURNTRANSFER = true and CURLOPT_HEADER = true, and trying to manually set any cookies I find in the headers. I'm not sure whether keeping the session alive implies more work, though.
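A rough sketch of that cURL idea, assuming a hypothetical login URL and form field names (a real form will differ):

    <?php
    // Sketch of the cURL approach above: POST the remote login form,
    // keep the raw response headers, and pull out any Set-Cookie lines.
    // The URL and the 'user'/'pass' field names are placeholders.
    $ch = curl_init('http://remote.example.com/login');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true,   // keep headers in the response
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'user' => 'alice',
            'pass' => 'secret',
        ]),
    ]);
    $response   = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    $headers = substr($response, 0, $headerSize);

    // Collect the cookies the remote site tried to set.
    preg_match_all('/^Set-Cookie:\s*([^;\r\n]+)/mi', $headers, $m);
    $cookies = $m[1];   // e.g. ["PHPSESSID=abc123", ...]

Note the catch, though: those cookies end up on your server, not in the visitor's browser, so the browser itself is still not authenticated against the remote site.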

PHP open a running app in multiple tabs in same browser

I have a PHP application in which I scrape a website and collect all of the links present on the site. While the scraper is running in one browser tab, opening the app in another tab of the same browser just keeps loading until the processing in the first tab (the running scraper) is complete.
I have tried using Ajax in this case, i.e. I send the request to find the links through an Ajax POST, but it has no effect.
Any kind of help and guidance will be appreciated.
That is probably caused by the session lock. If your multiple connections (tabs) require the same session, you can't avoid it.
If they could be independent, then you would have to pass a session ID in the URL to identify which tab is communicating with the server.
Note that the web server may also have restrictions configured on the number of simultaneous connections from the same IP.
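If the other tabs only need to read session data, one standard PHP mitigation is to release the session lock early with session_write_close(). A minimal sketch, with a hypothetical runScraper() helper:

    <?php
    // Read what you need from the session, then release the lock so
    // requests from other tabs are no longer blocked.
    session_start();
    $userId = $_SESSION['user_id'] ?? null;

    session_write_close();   // other requests can now acquire the session

    // Long-running work below; note that after this point any writes
    // to $_SESSION are no longer persisted for this request.
    runScraper($userId);     // hypothetical long-running scrape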

A script that logs in to multiple websites at once (cURL and PHP tried)

I run a computer lab for grade schoolers (3-14 y.o.) and would like to create a desktop/dashboard page consisting of a number of iframes, each pointing at a different external website
(for which we have created individual accounts for each child); and when a kid logs in (to the dashboard) a script will log her in to those websites, so she does not have to.
I have 1 server and 20 workstations; I'll refer to them as 'myserver' and 'mybrowser'(s) respectively. All of these are behind the same router (dynamic IP).
A kid gets on a 'mybrowser' workstation, fires up Firefox, and opens desktop.php (hosted on 'myserver'), which presents a login screen (for 'myserver').
'mybrowser' ---http---> 'myserver'
Once logged in, 'myserver' retrieves the usernames and passwords stored in its database and runs a cURL script to send them to an 'external web server'.
'mybrowser' ---http---> 'myserver' ---curl---> 'external web server'
SUCCESSFUL, or so I thought.
It turns out that cURL, being run from 'myserver', logs in 'myserver' instead of 'mybrowser'.
The session inside the iframe, even after a refresh, is still NOT logged in. Now I know.
Then I thought of capturing the cookies from 'myserver' and setting them in 'mybrowser', so that 'mybrowser' can browse (within the iframe)
as a logged-in user. After all, we (all the 'mybrowsers') are behind the same router as 'myserver', and thus share the same IP address.
So in other words, I only need 'myserver' to log a user in to several external websites all at once, and once that is done, pass control back to the individual users' browsers.
I hope the answer will not resort to using cURL to display and control the external websites for the whole session; aside from being a drag, that would lead to some other sticky issues.
I gather that this is not permitted due to security issues, but what if all the 'mybrowsers' and 'myserver' are behind the same router? Assuming there's a way to copy the login cookies from 'myserver' to the 'mybrowsers', would the 'external web server' know that the requests came from different machines?
Can this be done?
Thanks.
The problem you are facing relates to the security principles of cookies. You cannot set cookies for other domains, which means that myserver cannot set a cookie for facebook.com, for example.
You could set up your server to run an HTTP proxy, so that all queries run through your server with some kind of URL translation (e.g. facebook.com => facebook.myserver). That in turn allows you to set cookies for the clients (since you're running on facebook.myserver); you then translate the cookies you receive from the clients and feed them to the third-party websites.
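To make the idea concrete, here is a deliberately naive sketch of such a rewriting proxy (the script name, the path parameter, and the target host are placeholders; a real proxy also has to handle POSTs, relative URLs, redirects, and much more):

    <?php
    // naive-proxy.php -- illustrative only. Fetches a page from the
    // remote site and re-issues its cookies under our own domain.
    $remoteBase = 'http://www.example-remote-site.com';  // placeholder
    $path = $_GET['path'] ?? '/';

    $ch = curl_init($remoteBase . $path);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true,
    ]);
    $response   = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    $headers = substr($response, 0, $headerSize);
    $body    = substr($response, $headerSize);

    // Re-issue any Set-Cookie headers under *our* domain so the
    // browser stores them (the whole point of the URL translation).
    preg_match_all('/^Set-Cookie:\s*([^;\r\n]+)/mi', $headers, $m);
    foreach ($m[1] as $cookie) {
        header('Set-Cookie: ' . $cookie, false);
    }

    // Rewrite absolute links so follow-up clicks also hit the proxy.
    echo str_replace($remoteBase, '/naive-proxy.php?path=', $body);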
An example of a non-transparent proxy that you could begin with: http://www.phpmyproxy.com/
Transparent proxies (in which URLs remain "correct" / untranslated) might be worth considering too. Squid is a pretty popular one. Can't say how easy this would be, though.
After all that you'll still need to build a local script for myserver that takes care of the login process, but at least a proxy should make it all possible.
If you have any say in the login process itself, it might be easier to set up all the services to use OpenID or a similar login service; Stack Overflow and its sister sites are a prime example of how easy logging in to multiple sites can be.

How can I get a script to run whenever an http request is made?

I have 2 web servers. One hosts my main website (www.site.com), and the other hosts a subdomain (sub.site.com). The way I have it set up, any time you go to the subdomain, the request routes through the main server first, using mod_proxy. On the main web server I have a PHP script that logs information any time a web page is accessed. What I would like to happen is that any time someone accesses the subdomain, the logging script runs on the main server before the request is passed on to the subdomain server. Is this possible?
How about a cURL request from the subdomain to the main site, passing along all of the request headers?
Could you use a front controller style that redirects every request as needed? Or maybe make a pointcut and attach logging code transparently by using Aspect Oriented Programming?
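If the main server handles those requests through PHP at all, one general way to run a script on every request is php.ini's auto_prepend_file directive; note that it will not fire for requests mod_proxy forwards without ever touching PHP. A minimal sketch, with a hypothetical log path:

    <?php
    // log-request.php -- wired in via php.ini:
    //   auto_prepend_file = /var/www/log-request.php
    // Runs before every PHP request on this server.
    file_put_contents(
        '/var/log/site-requests.log',          // placeholder path
        sprintf(
            "%s %s %s\n",
            date('c'),
            $_SERVER['REMOTE_ADDR'] ?? '-',
            $_SERVER['REQUEST_URI'] ?? '-'
        ),
        FILE_APPEND | LOCK_EX
    );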

PHP - log in to a remote server, through my own server, with HTTPS, cookies and a proxy, and download the HTML

So what I am trying to do is this:
log in to the other server with a PHP script on my own server (either with my username and password, or with my cookies)
then have access to the page I want to display/download
I want to write a PHP script, located on my own server, that automatically logs in to another server, which uses HTTPS and a web form for login.
After the login I have access to the page that I am trying to download.
I don't know whether it would be possible to log in and download the HTML using only the cookies I already have in my browser from a previous login, or whether I need to perform the login in my PHP script through some HTTPS login method.
Can I do any of this with cURL or fsockopen, or what would be the best way to realize this?
Thanks in advance!
You just have to try. In most cases you should be fine if you export your cookies and use them in your cURL request.
However, the website may have tied the cookies to the remote address, or given them a timeout.
In that case you probably have to log in from the server. With PHP/cURL you can do all of that.
The only things that may be a problem are JavaScript and captcha codes.
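A sketch of that server-side login with cURL, using a cookie jar so the session cookie from the login is replayed on the follow-up download (the URLs and form field names are placeholders):

    <?php
    // Step 1: submit the HTTPS login form, persisting cookies in a jar.
    $jar = tempnam(sys_get_temp_dir(), 'cookies');

    $ch = curl_init('https://remote.example.com/login');  // placeholder
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'username' => 'me',        // placeholder field names
            'password' => 'secret',
        ]),
        CURLOPT_COOKIEJAR      => $jar,  // write cookies here on close
        CURLOPT_COOKIEFILE     => $jar,  // and send them on requests
        CURLOPT_FOLLOWLOCATION => true,  // follow the post-login redirect
    ]);
    curl_exec($ch);
    curl_close($ch);

    // Step 2: fetch the protected page; the saved session cookie is sent.
    $ch = curl_init('https://remote.example.com/protected-page');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_COOKIEFILE     => $jar,
    ]);
    $html = curl_exec($ch);
    curl_close($ch);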
In addition, you should definitely check out Zend_Http_Client; it has functionality that makes "browsing" easy, for example saving cookies and automatically passing them on with the next request, and deleting them if the server tells you to, etc.
Use the PEAR HTTP_Request class.
