I want to open an external page in a new tab and apply my own CSS to it, changing the general layout of that external page. The CSS is hosted on another server.
Can this be done?
I'm using PHP, jQuery and an Apache server.
I thought of using a PHP proxy on my server to request the external page and add the CSS, but if there's a better solution, it will probably be more efficient.
Thanks!
Downloading the page to your server, applying the CSS and then displaying the page to the user (a proxy) is the only way to do this. If the site is insecure and has an injection exploit, you could do it the blackhat way, but I doubt it's that insecure, or that anyone here will give you a nasty way to do it. Besides, as soon as it was spotted, it would no doubt be restored to its original CSS and the exploit patched.
Think of the security risk if this were possible. CSS is seriously powerful these days; imagine if you could change the CSS on google.com - Fun times :D
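For what it's worth, here is a minimal sketch of that proxy approach in PHP. The target URL and stylesheet location are placeholders, and a real version would also need to rewrite the page's relative links and asset paths:

```php
<?php
// proxy.php - fetch an external page and inject our own stylesheet.
// Sketch only: no caching, no cookie forwarding, no rewriting of the
// page's relative URLs, so images/links on the fetched page may break.

$url = 'https://example.com/';                  // placeholder target page
$css = 'https://static.example.org/custom.css'; // placeholder stylesheet

$html = file_get_contents($url);
if ($html === false) {
    http_response_code(502);
    exit('Could not fetch the external page.');
}

// Inject a <link> tag just before </head> so our rules load last
// and can override the page's own styles.
$tag  = '<link rel="stylesheet" href="' . htmlspecialchars($css) . '">';
$html = str_ireplace('</head>', $tag . '</head>', $html);

header('Content-Type: text/html; charset=utf-8');
echo $html;
```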
No, it can't be done any other way than with a proxy.
The ability to inject CSS into an embedded iframe (or, more generally, to do anything to a child iframe) would open a huge security vulnerability.
Imagine you display an iframe of a webmail service I use, and heavily restyle it so that my "compose reply" page looks like a "confirm password" page from the mail service. When I naively type in my password, it gets sent in an email to the recipient!
You could achieve your desired goal by using a browser plugin or extension. Visiting a web site should require zero trust, but installing an extension generally does imply some trust from the user, so extensions are given greater freedom than plain web pages.
Note that a server proxy fetch will not send the user's session/auth cookies, so if you are trying to fetch a site that requires a login (e.g., Facebook), you'll only be able to fetch public resources.
Depending on what you want to achieve, there may be a better approach. If you are planning on distributing it to many users, consider a browser plugin or, perhaps easier to create, a UserScript for extensions like Greasemonkey / Tampermonkey. Another way would be to create a bookmarklet for the users to click in their browser bar. Have a look at this: http://benalman.com/projects/run-jquery-code-bookmarklet/ - it's very easy to create.
There are several threads on SO regarding this, but I just need to know how to READ a cookie from siteb.com on sitea.com, which opens siteb.com in an iframe, IF this is really the recommended way to go.
In this post, the author says:
Cookies can be read in an iframe if they were set outside of the iframe
But I have no idea how to achieve this. Let me explain a bit more about what I'm trying to design so maybe you can point me in the right direction.
siteb.com is my website, where users log in and sign up; each time they do, a cookie is set, as in many normal authentication systems.
sitea.com is a generic site where I can insert HTML and JavaScript code. From sitea.com I need to read the login cookie of siteb.com, if it exists. I think an iframe on sitea.com loading siteb.com will do the trick, but again, I have no idea how to access those cookies inside the iframe. Is there an easy way to do this?
Another approach I was considering is to use cross-domain iframe communication techniques, but they are inelegant, overly complex, and some of them fail in certain browsers. The most robust ones use jQuery, but I don't want to insert jQuery on sitea.com.
Here's what you need: http://easyxdm.net/ - load this library on both sitea.com and in the siteb.com iframe. It makes cross-domain parent-iframe communication "just work" in every browser, using the fastest method available in each browser. (Also, the author, https://stackoverflow.com/users/128035/sean-kinsey, does a fantastic job of helping anyone who has trouble with the library - just check the mailing list archives.)
Then add a tiny bit of JavaScript to your siteb.com iframe to read the cookies and pass them to easyXDM, and add a bit of JavaScript to sitea.com to set up easyXDM (including creating the iframe, I think) and receive the cookie value from it. There are lots of examples on the website to help you get started.
OK, here's my problem: content is disappearing from my site. It's not the most secure site out there; it has a number of issues. Right now, every time I upload a page that can delete content from my site using simple links wired to a GET request, I find the corresponding content deleted en masse.
For example, I have a feature on my site to upload images. Once a user uploads an image, the admin (the owner) can use another page to delete all (owned) images from the site. The delete functionality is implemented in such a way that when the user clicks the link under a thumbnail of an uploaded image, the browser sends a GET request that deletes the image's information from the site's database and deletes the image from the server's file system.
The other day I uploaded that functionality, and the next morning I found all my images deleted. The pages are protected with user authentication when you view them in a browser. To my surprise, however, I could wget that page without any problem.
So I was wondering: could some evil web bot be deleting my content using those links? Is that possible? What do you advise for further securing my website?
It is absolutely possible. Even non-evil web bots could be doing it. The Google bot doesn't know the link it follows has any specific functionality.
The easiest way to possibly address this is to set up a proper robots.txt file to tell the bots not to go to specific pages. Start here: http://www.robotstxt.org/
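For example, a robots.txt at the site root along these lines tells crawlers to stay away from the delete pages (the paths are placeholders for your own):

```
User-agent: *
Disallow: /admin/
Disallow: /images/delete.php
```

Keep in mind this is only a convention: well-behaved bots like Googlebot honour it, but it is advice, not access control.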
RFC 2616 (HTTP protocol), section 9.1.1: Safe Methods:
The convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval. These methods ought to be considered "safe". This allows user agents to represent other methods, such as POST, PUT and DELETE, in a special way, so that the user is made aware of the fact that a possibly unsafe action is being requested.
Basically, if your application allows deletion via GET requests, it's doing it wrong. Bots will follow public links; they have no obligation to expect deletions when doing so, and neither do browsers. If the links are protected, it could still be browser prefetching or acceleration of some kind.
Edit: It might also be Bing. Nowadays Internet Explorer sends data to Microsoft about everywhere you go to gather data for its shitty search engine.
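To make the safer pattern concrete, here is a rough PHP sketch (the session keys, field names and lookup logic are illustrative, not taken from your site): deletion only happens on a POST from a logged-in admin carrying a CSRF token, so a bot following links can never trigger it.

```php
<?php
// delete_image.php - sketch: deletion via POST instead of GET.
session_start();

// Bots and prefetchers issue GETs; refuse anything but POST.
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    http_response_code(405);
    exit('Deletion requires POST.');
}

// Require a logged-in admin (illustrative session flag).
if (empty($_SESSION['is_admin'])) {
    http_response_code(403);
    exit('Not authorised.');
}

// Require the CSRF token issued when the delete form was rendered.
if (!isset($_POST['csrf_token'], $_SESSION['csrf_token'])
    || !hash_equals($_SESSION['csrf_token'], $_POST['csrf_token'])) {
    http_response_code(403);
    exit('Bad CSRF token.');
}

$id = (int) ($_POST['image_id'] ?? 0);
// ... look up the image row, unlink() the file, DELETE the DB record ...
```

Each thumbnail then gets a tiny <form method="post"> with hidden image_id and csrf_token fields instead of a bare link.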
Typically, a search bot will scan a page for any links and peek down those links to see what pages are behind them. So yeah, if a bot has access to that page, and the page contains links that delete items, and the bot opens those links to see what's behind them, the code simply gets triggered.
There are a couple of ways to block bots from scanning pages. Look into robots.txt implementations. Also, you might want to look into the mechanism / safety of your admin authentication system... ;-)
You can use the robots.txt file to block access for some web bots.
And for those that don't look at the robots.txt file, you can also use JavaScript (for example, to generate the links), since there shouldn't be many web bots interpreting it.
I have a secure site with private information that uses HTTPS. We have a partnership with another site that provides functionality for our users. We want the header and footer to be the same, but the body functionality to come from their site. I thought I'd create a template file that they can request from our server, which would allow me to keep creative control whenever our site changes.
However, the header has account information, so it needs to access the session information for the current user. So number one, is this possible? If a user clicks from my site to theirs and they request the template from our servers, how can it be sure to connect to the correct session? And number two, is this safe? How can I be sure this connection is secure?
Edit: It appears this option is not worth pursuing. I'm going to work on some other ways for the other server to access the information. Thanks.
What you are trying to do that way is a complete mess.
You should avoid outputting a page built from different websites all put together.
That would become:
Hard to maintain;
Prone to security holes.
If you want the file on your site to be a template, it should be only that. Have the other site add the information to the header after fetching it.
I think for what you are doing, the only option you have is to route every request through your server acting as a proxy, to maintain session vars without opening too many security holes.
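A rough sketch of that idea in PHP, just to make the shape concrete (the partner URL and the include file names are placeholders): the session, the header and HTTPS all stay on your own domain, and the partner only supplies the body markup.

```php
<?php
// page.php - sketch: wrap partner-provided body content in our own
// header/footer so session data never leaves our domain.
session_start();

// Fetch only the body fragment from the partner (placeholder URL).
$body = file_get_contents('https://partner.example.com/widget');
if ($body === false) {
    $body = '<p>Partner content is unavailable right now.</p>';
}

include 'header.php'; // reads $_SESSION for the account info
echo $body;           // partner-supplied body
include 'footer.php';
```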
I'm looking for a way of tracking all outbound clicks from a web page without modifying any of the existing page code. The solution must work with frames, iframes, content from different domains, AJAX etc.
I previously posted about a JavaScript / jQuery solution, but unfortunately the same-origin policy means JavaScript won't work.
This is not possible. It cannot be done server-side because HTTP is a stateless protocol and you don't have control over iframed content from different domains. It would have to be done client-side, and you already figured out that even that is not possible.
Bottom line: if you want to know this sort of thing, you have to have control over the content, including the ability to put page code on the iframed content, and that page code would have to be modified.
Do they use a PHP page to analyze the link and return all of the images as JSON?
Is there a way to do this with just JavaScript, so you don't have to go to the server to analyze the page?
I don't know how they do it. I'd implement a small service for that purpose: given a URL, return some relevant image (or generate a screenshot). This service could also cache results for better performance. But either way, the page needs to be fetched in order to grab the <img src=... values or to take the screenshot.
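A minimal sketch of such a service in PHP (the endpoint name, and the absence of caching and URL validation, are my own simplifications), using DOMDocument to pull the <img src> values out of the fetched page and return them as JSON:

```php
<?php
// images.php?url=... - sketch: fetch a page, return its <img src>
// values as JSON. A real version should validate the URL (to avoid
// SSRF) and cache results, as suggested above.
header('Content-Type: application/json');

$url  = $_GET['url'] ?? '';
$html = $url !== '' ? @file_get_contents($url) : false;
if ($html === false) {
    http_response_code(400);
    exit(json_encode(['error' => 'Could not fetch page']));
}

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from messy real-world markup

$images = [];
foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    if ($src !== '') {
        $images[] = $src; // may be relative; resolve against $url if needed
    }
}

echo json_encode(['images' => $images]);
```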
Facebook calls back to the server. If you use Firebug (or, as I did, the Web Inspector in Safari), you can inspect the Ajax calls. Facebook calls back to a script at /ajax/composer/attachment.php - in there is some JavaScript which contains HTML that gets inserted into the page. Here is what it looks like if I point the Facebook attach-link dialogue at the BBC News homepage in Safari Web Inspector:
(Screenshot: Facebook JavaScript response when you attach a link, in Safari Web Inspector: http://tommorris.org/files/Facebook-20100529-181745.jpg)
I put up the full JavaScript response on Gist (it was all one line and minified originally, so I just flung it through TextMate to wrap it).
I'm not sure if you could do it on the client side, because of browser protections against cross-site scripting, and even if you could, you probably ought not to, because of this potential security problem: imagine if someone puts in a URL that points to a page which only they have access to. You don't necessarily want to put what's on someone else's customised or private page up on your Facebook/Digg-type site. Imagine if it was something like Flickr and there were private pics - or worse, a porno site. No, better to proxy it back to your server and then grab the images. Plus, it'll probably be faster: no need to tax your end user's potentially slow connection downloading a page when your server will probably be able to do it quicker...