Our website https://www.diamir.de is built with Laravel. When we try to load the site in an iframe on another domain, the session is regenerated with every request. The session cookie is sent to the browser, but it seems Laravel is not able to read it on the next request and therefore regenerates the session.
How can we enable sessions when the site is loaded in an iframe on another domain? Is this even possible with the security settings in modern browsers?
Thanks.
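For context, the attributes on Laravel's session cookie are controlled in config/session.php. The partial sketch below shows the two settings that matter for cross-site iframes, assuming a Laravel version recent enough to expose the same_site option; even with these set, browsers that block third-party cookies outright will still refuse the cookie, so this is not a guaranteed fix.

    // config/session.php (partial sketch, not a guaranteed fix)
    // For the session cookie to be sent while the site runs inside a
    // cross-origin iframe it must be marked SameSite=None and Secure.
    return [
        // ... other session options unchanged ...
        'secure'    => true,    // SameSite=None is only honoured over HTTPS
        'same_site' => 'none',  // the default ('lax' in recent versions) blocks cross-site requests
    ];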
iFrames are largely locked-down windows used to view other content inside your website. They were a useful way of hacking things into sites, but have been replaced in a lot of scenarios.
For your particular scenario, showing the site fully on another domain, there are other, better ways to reference the same site from multiple domains.
Personally, I'd redirect in most scenarios, certainly in any where an iFrame would work. This results in a user going to any secondary domain and being redirected to the primary domain, usually with a 301 permanent redirect.
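As a rough illustration of that approach in PHP (the host name is a placeholder), the front controller on a secondary domain only needs something like:

    <?php
    // Redirect any request arriving on a secondary domain to the primary
    // domain, preserving the requested path. Host name is a placeholder.
    $primaryHost = 'www.primary-domain.example';

    if ($_SERVER['HTTP_HOST'] !== $primaryHost) {
        header('Location: https://' . $primaryHost . $_SERVER['REQUEST_URI'], true, 301);
        exit;
    }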
However, the one occasion where I need the site sitting on multiple domains is when the site is themed on other domains. In a lot of these cases, it's preferable to fork the site and make changes as needed, but in those that need the same code base and just a theme over the top, I'd simply have the server (Apache, NGINX, or any other server) listen to multiple domains all serving the same site.
Related
I am trying to load an iframe across domains, in order to reuse a form submission that collates data into a single point.
I have managed to get most of the functionality and styling working cross-domain, enabling CSP policies etc., but I am having issues with Safari when trying to set first-party cookies.
I have used some of the older techniques for setting first-party cookies with redirects, but for whatever reason, the cookies aren't set when I redirect to the embedded domain in order to set them. I think the main issue with the redirect is that you need a user interaction, as per the WebKit specifications.
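For reference, a bare-bones PHP sketch of that redirect technique is shown below; the endpoint, parameter, and cookie names are invented, it assumes PHP 7.3+ for the options-array form of setcookie(), and, as noted above, WebKit may still refuse the cookie without a prior user interaction.

    <?php
    // set-first-party-cookie.php, served from the domain that is later embedded
    // in the iframe. Endpoint, parameter, and cookie names are hypothetical.
    //
    // The embedding page navigates the top-level window here with
    // ?return=<URL of the embedding page>. While this page is the top-level
    // document, its cookie counts as first-party.

    $return = $_GET['return'] ?? 'https://embedding-site.example/';
    // (In production, validate $return against a whitelist to avoid an open redirect.)

    setcookie('embed_session', bin2hex(random_bytes(16)), [
        'expires'  => time() + 3600,
        'path'     => '/',
        'secure'   => true,
        'httponly' => true,
        'samesite' => 'None',  // needed once the page is loaded back inside the iframe
    ]);

    // Bounce straight back to the page that embeds the iframe.
    header('Location: ' . $return, true, 302);
    exit;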
I am trying to make this experience as seamless as possible for the user.
Any feedback is appreciated; I'm using PHP, NGINX and some JS, HTML & CSS.
I have access to the remote server that will be referenced in the iframe, but I will also be liaising with the developer for the domain where the iframe will be embedded.
I'm trying to optimize a WordPress site. When a browser requests static images from the main domain, it sends cookies with the request, but the server ignores them; these cookies are unnecessary network traffic. To work around this, I am trying to make sure that static components are requested with cookie-free requests by serving them from a separate subdomain. This also helps with parallel requests, thereby loading my site faster.
I created the subdomain accordingly and changed the uploads_image_path in "wp-admin/options.php". Every image now seems to be redirecting to my subdomain, except all my WooCommerce images.
Example:
example.com/wp-content/uploads/2018/07/image_name.jpg
should change to
uploads.example.com/2018/07/image_name.jpg
If I can't redirect my WooCommerce images, it renders the whole strategy pointless, because WooCommerce contains most of my images. Am I missing something?
Solved it: changing the path on "example.com/wp-admin/options.php" does not write permanent changes. I had to go directly to the options table in the database and change the values for "upload_url_path" and "upload_path" to my subdomain.
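For anyone who would rather not edit the database by hand, the same two rows in the options table can be written through the WordPress options API; the subdomain URL and server path below are placeholders.

    <?php
    // One-off snippet (run for example with `wp eval-file set-upload-paths.php`).
    // Writes the same options-table rows that were edited directly above.
    // The URL and filesystem path are placeholders.
    update_option('upload_url_path', 'https://uploads.example.com');
    update_option('upload_path', '/var/www/example.com/uploads');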
I run a computer lab for grade schoolers (3-14 y.o.) and would like to create a desktop/dashboard page consisting of a number of iframes, each pointing at a different external website (for which we have created individual accounts for each child). When a kid logs in to the dashboard, a script will log her in to those websites, so she does not have to.
I have one server and 20 workstations; I'll refer to them as 'myserver' and 'mybrowser'(s) respectively. All of these are behind the same router (dynamic IP).
A kid gets on a 'mybrowser' workstation, fires up Firefox, runs desktop.php (hosted on 'myserver'), and gets a login screen (for 'myserver').
'mybrowser' ---http---> 'myserver'
Once logged in, 'myserver' will retrieve the username and password stored in its database and run a cURL script to send those to an 'external web server'.
'mybrowser' ---http---> 'myserver' ---curl---> 'external web server'
SUCCESSFUL. Well, so I thought.
It turns out that cURL, being run from 'myserver', logs in 'myserver' instead of 'mybrowser'.
The session inside the iframe, after refresh, is still NOT logged in. Now I know.
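To make the failure mode concrete, a typical server-side login with cURL looks roughly like the sketch below (the URL and form field names are invented). The session cookie the external site returns ends up in a cookie jar on 'myserver'; the kid's browser never receives it.

    <?php
    // login-via-curl.php on 'myserver' (hypothetical URL and field names).
    // $childUsername / $childPassword come from 'myserver's database.
    $ch = curl_init('https://external-site.example/login');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'username' => $childUsername,
            'password' => $childPassword,
        ]),
        CURLOPT_RETURNTRANSFER => true,
        // The external site's session cookie is written to this file on the
        // server's disk; the workstation's Firefox never sees it.
        CURLOPT_COOKIEJAR      => '/tmp/external-site-cookies.txt',
    ]);
    curl_exec($ch);
    curl_close($ch);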
Then I thought of capturing the cookies from 'myserver' and setting them on 'mybrowser', so that 'mybrowser' can browse (within the iframe) as a logged-in user. After all, we (all the 'mybrowsers') are behind the same router as 'myserver', and thus share the same IP address.
So, in other words, I only need 'myserver' to log a user in to several external websites all at once, and once that is done, hand control back to the individual users' browsers.
I hope the answer will not resort to using cURL to display and control the external websites for the whole session; aside from being a drag, that would lead to some other sticky issues.
I am getting the sense that this is not permitted due to security issues, but what if all the 'mybrowsers' and 'myserver' are behind the same router? Assuming there's a way to copy the login cookies from 'myserver' to the 'mybrowsers', would the 'external web server' know that the requests came from different machines?
Can this be done?
Thanks.
The problem you are facing relates to the security principles of cookies. You cannot set cookies for other domains, which means that myserver cannot set a cookie for facebook.com, for example.
You could set up your server to run an HTTP proxy so that all queries go through your server, and do some kind of URL translation (e.g. facebook.com => facebook.myserver). This in turn allows you to set cookies for the clients (since you're running on facebook.myserver); you then translate the cookies you receive from the clients and feed them to the third-party websites.
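A very rough sketch of that translating proxy in PHP is shown below; the facebook.myserver mapping is only illustrative, and a real proxy (such as the ones linked next) also has to rewrite links inside the HTML, handle POSTs, caching, and so on.

    <?php
    // Front controller for facebook.myserver: hypothetical sketch of the
    // cookie-translating proxy idea, omitting link rewriting, POSTs, caching, etc.

    $upstream = 'https://www.facebook.com' . $_SERVER['REQUEST_URI'];

    $ch = curl_init($upstream);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true,   // keep response headers so Set-Cookie can be rewritten
        CURLOPT_FOLLOWLOCATION => false,
        // Feed the cookies the client gave us back to the third-party site.
        CURLOPT_HTTPHEADER     => ['Cookie: ' . ($_SERVER['HTTP_COOKIE'] ?? '')],
    ]);
    $response   = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    $rawHeaders = substr($response, 0, $headerSize);
    $body       = substr($response, $headerSize);

    // Re-issue the upstream Set-Cookie headers under our own host name so the
    // client's browser will accept and store them.
    foreach (explode("\r\n", $rawHeaders) as $line) {
        if (stripos($line, 'Set-Cookie:') === 0) {
            header(preg_replace('/;\s*domain=[^;]*/i', '; domain=facebook.myserver', $line), false);
        }
    }

    echo $body;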
An example of a non-transparent proxy that you could begin with: http://www.phpmyproxy.com/
Transparent proxies (in which URLs remain "correct" / untranslated) might be worth considering too. Squid is a pretty popular one. Can't say how easy this would be, though.
After all that you'll still need to build a local script for myserver that takes care of the login process, but at least a proxy should make it all possible.
If you have any say in the login process itself, it might be easier to set up all the services to use OpenID or similar login services, StackOverflow and its sister sites being a prime example on how easy login on multiple sites can be achieved.
I have a file that is being linked to from other websites.
The file: http://site.com/file.img
Website A links to it: <img src="http://site.com/file.img">
Website B links to it: <img src="http://site.com/file.img">
I need to reliably identify which of these websites has accessed the file, but I know that $_SERVER['HTTP_REFERER'] can be spoofed. What other ways do I have to reliably confirm the requesting site? By IP, having them register an IP? I'm not sure. Set up an API key? What options are there?
If a website is only linking to a file, the "website" itself will never actually access your image. Instead, the client who's viewing the site will make a request for the image.
As such, you're depending on information sent by the client, which is completely out of your control and not reliable at all. If you have the opportunity to set some sort of unique cookie on the client, you may be able to use this in some fashion for extended identification, but even that won't be reliable.
There is no 100% reliable solution.
Getting the referrer is the best you can do without getting into complicated territory.
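For completeness, the simple (and spoofable) referrer check looks like this in PHP; the allowed host names are placeholders.

    <?php
    // serve-image.php: referrer check only, which a client can spoof or omit.
    // Allowed host names are placeholders.
    $allowedHosts = ['websitea.example', 'websiteb.example'];

    $refererHost = strtolower((string) parse_url($_SERVER['HTTP_REFERER'] ?? '', PHP_URL_HOST));

    if (!in_array($refererHost, $allowedHosts, true)) {
        http_response_code(403);
        exit;
    }

    header('Content-Type: image/jpeg');   // adjust to the actual file type
    readfile(__DIR__ . '/file.img');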
If you don't mind complicated, then read on: set up your Web server to serve file.img only to Website A and Website B, then require that Website A and Website B set up a proxy configuration on their end that will retrieve file.img on behalf of their visitors.
Example:
A visitor to Website A loads a page that contains an image tag like <img src="http://websiteA.com/file.img"/> (note reference to Website A rather than your site). Client requests file.img from WebsiteA.com accordingly. Website A is configured to proxy requests for the path /file.img to your server, http://site.com/file.img. Your site verifies that it is in fact Website A that is requesting the image and then serves it to Website A's proxy. Website A then serves it to the visitor.
Basically, that makes it a pain for Websites A and B, gives you a performance hit, and also requires further configuration on your part. But I imagine that would satisfy your requirement.
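On your side, the verification step could be as simple as a shared secret that each partner proxy attaches to its requests; the header name and secrets below are invented, and checking the proxies' source IP addresses would work just as well.

    <?php
    // file-img.php on site.com: answer only the partner proxies, not end users.
    // Header name and secrets are invented for illustration.
    $keys = [
        'Website A' => 'long-random-secret-for-a',
        'Website B' => 'long-random-secret-for-b',
    ];

    $presented = $_SERVER['HTTP_X_PARTNER_KEY'] ?? '';   // sent as "X-Partner-Key: ..."
    $partner   = array_search($presented, $keys, true);

    if ($presented === '' || $partner === false) {
        http_response_code(403);
        exit;
    }

    // $partner now reliably identifies which site's proxy fetched the file.
    error_log("file.img fetched by {$partner}");

    header('Content-Type: image/jpeg');   // adjust to the actual file type
    readfile(__DIR__ . '/file.img');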
Have a look at how OpenID relying parties are implemented; OpenID allows one site to authenticate against another. The protocol specification will give a hint at the effort and overhead required to reliably implement such a scheme.
http://googlecode.blogspot.com/2010/11/googles-sample-openid-relying-party.html
I am working on a browser-based application that will have many users. The catch is that every user should have their own customized login page, but the actual application is the same for everyone and needs to be in a central location.
The login page is static. That is, if we have a user that requires a separate login, we will make a separate landing page for them, let's say at user1.application.com, that will have a blue background. User two will be handed a URL to user2.application.com, which will have a green background. The application does not have to dynamically change the look of the login page; that will be static and managed at a higher level.
What is the most secure way of doing this?
Would it make more sense to have a copy of the application for each user, and keep the database centralized?
The projected number of users is not very high, probably around 20-80.
Thank you,
I can give you instructions on how to do this using Microsoft IIS and ASP.NET. Other servers and programming languages still apply, but the specifics will be a little different.
You'll need to have access to your DNS settings. Create a DNS entry for *.application.com. We do this as a CNAME record that points to our www record, which is registered as the A record associated with the IP address.
Option 1:
In IIS 6, create a web site and modify the Host Headers (web site Properties, Web Site tab, "Advanced..." button in the "Web site identification" section). Add an empty host header. This will cause that IIS web site to answer all requests for all domains associated with the IP address it is listening on.
Then create a default page and in the code behind, you'll have logic that looks at the Request.UserHostName of each request. It should return "user1.application.com" or "user2.application.com" or perhaps "www.application.com". You'll then have to parse that string and do all the dirty work to load the appropriate page.
Hmm, well, that's how you would do it dynamically, with one web site. Re-reading your question, you talk about "static" login pages. For that you have two options. You can create the static login pages and have your dynamic page read those files and send them down as the response, or option two would be...
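In case the stack isn't ASP.NET, here is the same host-based dispatch of static login pages sketched in PHP; the host pattern and file layout are made up.

    <?php
    // default.php: pick a per-customer static login page based on the host name.
    // Host pattern and file layout are made up for illustration.
    $host = strtolower($_SERVER['HTTP_HOST']);   // e.g. "user1.application.com"
    $user = explode('.', $host)[0];              // e.g. "user1"

    $loginPage = __DIR__ . '/logins/' . basename($user) . '.html';

    if (!is_file($loginPage)) {
        $loginPage = __DIR__ . '/logins/default.html';   // generic fallback page
    }

    readfile($loginPage);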
Option 2:
In IIS, create a new web site for every user. Modify the host headers as described above such that each web site only has one host header, equal to the user's login. Do not have a web site with an empty host header. You'll have to create a web site and add the host header manually for every new user.
Neither option may sound very elegant, but Option 1 does work rather well. We are using it in a similar fashion to host multiple "skins" of our application.