We have a unified portal that links multiple services through a jQuery tab-based interface, using iframes to display content from the different services. The portal runs on a secure server over HTTPS. While most of our external services are HTTPS, two of them aren't. Obviously we are aware of the issues with mixed content, and we didn't like the idea of having non-HTTPS sites within the portal, but we didn't have a choice. Everything was fine until a few days ago, when Google updated Chrome to version 30, which now silently blocks mixed content. This has created a great number of problems for us. We contacted the external services and asked whether they could upgrade to HTTPS, and one of them has come back saying they have no plans to do so for the next two years.
Obviously this is a problem. We tried to work around it by having this service open in a new browser window, but that is a rather inelegant workaround which I would like to get rid of if at all possible. Is there any way I can use AJAX or PHP to prefetch the page in question and then display it within the portal iframe without Chrome blocking it?
I would be very grateful for any advice at all. I do understand how bad an idea it is to mix secure and non-secure content, but I have no choice in the matter as my manager is adamant that the service has to be part of the portal.
Thanks in advance.
Regards
Alex
A somewhat simple solution would be to use a reverse proxy. You can configure Apache quite easily to accept an HTTPS connection, fetch the requested content from another URL and return it. See mod_proxy. The catch is that the browser will necessarily see a different URL/domain (your reverse proxy), which may or may not cause problems with cookies or hardcoded links.
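For illustration, a minimal mod_proxy configuration along those lines might look like the sketch below. The hostnames and the /legacy-service/ path prefix are placeholders, mod_proxy and mod_proxy_http need to be enabled, and note that ProxyPassReverse only rewrites redirect headers, not links inside the returned HTML:

```apache
# Inside the portal's HTTPS <VirtualHost>; host names and the /legacy-service/
# path are placeholders.
ProxyRequests Off          # act as a reverse proxy only, never an open forward proxy

# Requests to https://portal.example.com/legacy-service/... are fetched from the
# plain-HTTP backend and returned to the browser over the portal's own HTTPS.
ProxyPass        /legacy-service/ http://legacy.example.net/
ProxyPassReverse /legacy-service/ http://legacy.example.net/
```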
So, I have a problem, and this may or may not be the place to ask this question, but I'm doing it anyway - since I've tried everything and nothing works …
Here goes:
I have a tracking script called CPVlab installed on a DigitalOcean server. It lets me track clicks and gives me statistics on them: it captures information about a user and their behaviour, and it can rotate landing pages for split testing. This is all done through internal redirects on the domain the script is installed on.
Let’s say it’s installed on tracker.com.
Another feature of the script is that I can create an A record in the DNS I use for some other domain, say someothername.com, and point it at the IP address of tracker.com.
This way, one can use different domains (tracking domains) so that the main installation domain isn't visible. This helps with customizing the look of different marketing campaigns (you don’t want them all to look like tracker.com/?querystuff)…
So here’s the problem: it all used to work fine without https://. But after installing Let's Encrypt (through an easyengine command, for both tracker.com and the tracking domains), the feature described above doesn’t work anymore.
When I point someothername.com at tracker.com with an A record and request it over http://, the server gives me a 404 Not Found. And when I request https://someothername.com, it tells me the connection is not secure. This is despite both domains having https certificates, and they work if I put them in the browser directly (it will show https).
However when I don’t use this tracking domain feature and just use the plain https://tracker.com domain, it works perfectly.
Maybe this question is a bit far out, but does anyone have an idea whether this is related to Let's Encrypt? I added the certificates through EE a few months ago, and I know EE uses certbot. I am thinking this problem may have something to do with Let's Encrypt not supporting wildcard certificates at the time of install. Maybe this tracking script is designed in such a way that the main domain uses the tracking domains as some sort of subdomain?
Anyone have an idea about this? I am definitely STUCK here…
Thanks, Lex
E.g. I am selling a WordPress or Joomla plugin, and after the user installs and activates it, it still doesn't work, because he needs to click a "verify" button so that the status of the plugin changes to working.
This button will trigger a function that connects to some "service" where I will previously have added his website URL, e.g. http://myclientsweb.com, plus maybe some verification code attached in the URL. If it matches the data on my server, the status of the plugin changes to activated.
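Something like this is what I have in mind on the plugin side; the endpoint URL, parameter names and the "ok" response are all made up for the sake of the example:

```php
<?php
// Hypothetical handler behind the plugin's "verify" button.
// verify.example.com, the "site"/"code" parameters and the "ok" response
// are placeholders, not a real service.
function myplugin_verify($site_url, $verification_code) {
    $endpoint = 'https://verify.example.com/check'
              . '?site=' . urlencode($site_url)
              . '&code=' . urlencode($verification_code);

    $response = wp_remote_get($endpoint, array('timeout' => 10));

    if (is_wp_error($response)) {
        return false; // server unreachable, leave the plugin deactivated
    }

    $body = wp_remote_retrieve_body($response);
    if (trim($body) === 'ok') {
        update_option('myplugin_verified', 1); // flip the plugin to "working"
        return true;
    }
    return false;
}
```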
I can basically do the programming on both sides (the client's site and the verification server), but the problem is that I need the server where the verification URLs and codes are stored to be available all the time, something like a CDN, so that even if one server is down, the client can always verify his plugin somewhere else.
So, the best solution would be some kind of CDN-like service that specializes in this. It could be free or paid. Do you know of something of that nature? Or do you suggest a better solution?
I am thinking you need something other than a CDN for that purpose. I would suggest you look to the cloud for a solution. If you want the most uptime for your verification app, with the simplicity of a CDN, you should go for something like Pagodabox or Heroku.
Those two services host your code for you, much like a regular server, but they automatically scale and handle requests. In theory this should make your app available 100% of the time, with minimal resources spent on your part.
Both services offer free plans, so you can get going and test whether it's something for you.
I hope this helps, this is my suggestion to your problem at least.
If your server host is good, they usually offer a 99% uptime guarantee. A CDN is only needed if you are getting a large number of verification requests that cause delays in processing.
Here, I would suggest: when someone tries to verify the plugin and the server is down (doesn't reply), send an email to the admin (you), so that you instantly know the server is down and can process the verification manually.
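A rough sketch of that fallback; the verification endpoint and the admin address are placeholders, purely for illustration:

```php
<?php
// $site_url is the client site asking to be verified.
$site_url = 'http://myclientsweb.com';
$response = @file_get_contents('https://verify.example.com/check?site=' . urlencode($site_url));

if ($response === false) {
    // The verification server didn't reply: notify the admin so the
    // request can be processed manually.
    mail(
        'admin@example.com',
        'Plugin verification server unreachable',
        'Verification failed for ' . $site_url . '; please handle it manually.'
    );
}
```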
CDN stands for content delivery network: a large distributed system that is mostly used to serve content, and it does not seem to fit what you are looking for. If I understand correctly, you are looking to offer 99.x% uptime so that clients can always register their plugins.
This could require setting up multiple virtual hosts / dedicated servers behind a load balancer, so that when one node goes down, the load balancer can redirect the traffic to another node to handle requests. You really need to find a good hosting company. That's all!
I have looked around and it seems that there is no way whatsoever to load external/remote URLs like http://google.com through the client browser using JavaScript without using a proxy, be it a PHP file on the server side or YQL, which essentially uses the Yahoo API as a proxy. This is due to the same-origin policy.
I am not versed in Flash, but I think it might hold an answer, because even though some people are aggressively phasing it out, it has a lot of power.
My question: is there something I missed when searching? Free hosts have restrictions on the number of requests and the load on the server per unit time, and I wouldn't like to get kicked out. Also, my site scrapes some remote site's data, so I wouldn't like to get blocked, which I would if I used a PHP proxy. So is there a simple Flash or JavaScript solution I did not see?
No, this is not possible due to the Same origin policy: http://en.wikipedia.org/wiki/Same_origin_policy
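For reference, the server-side proxy the question is trying to avoid is usually nothing more than a thin passthrough script. A minimal sketch follows; the file name proxy.php and the whitelist are made up, and a real one must restrict which hosts it will fetch so it can't be abused as an open proxy:

```php
<?php
// proxy.php?url=http://example.com/page  (illustrative only)
$url = isset($_GET['url']) ? $_GET['url'] : '';

$allowed_hosts = array('example.com'); // hypothetical whitelist
$host = parse_url($url, PHP_URL_HOST);

if (!in_array($host, $allowed_hosts, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Host not allowed');
}

header('Content-Type: text/html; charset=utf-8');
echo file_get_contents($url); // same-origin as far as the browser is concerned
```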
I have already heard about the cURL library, and it's something I'm interested in...
As I've read that there are many uses for it, can you provide me with some?
Are there any security problems with it?
One of the many useful features of cURL is interacting with web pages, which means you can send and receive HTTP requests and manipulate the data. That means you can log in to web sites and actually send commands as if you were interacting from your web browser.
I found a very good web page titled "10 awesome things to do with curl". It's at http://www.catswhocode.com/blog/10-awesome-things-to-do-with-curl
One of its big use cases is automating activities such as having your application fetch content from other websites. It can also be used to post data to another website and to download files via FTP or HTTP. In other words, it allows your application or script to act as a user accessing a website, just as they would when browsing manually.
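A rough PHP sketch of that kind of browser-like interaction; the site, login URL and field names are invented for the example:

```php
<?php
// Log in to a (hypothetical) site and fetch a page that requires the session.
$ch = curl_init('https://www.example.com/login');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'username' => 'alice',
    'password' => 'secret',
)));
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/cookies.txt');  // keep the session cookie
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// Reuse the stored cookie to request a members-only page.
$ch = curl_init('https://www.example.com/members/reports');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/cookies.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
```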
There are no inherent security problems with it, but it should be used appropriately, e.g. use HTTPS where required.
cURL Features
It's for spamming comment forms. ;)
cURL is great for working with APIs, especially when you need to POST data. I've heard that it's quicker to use file_get_contents() for basic GET requests (e.g. grabbing an RSS feed that doesn't require authentication), but I haven't tried it myself.
If you're using it in a publicly distributed script, such as a WordPress plugin, be sure to check for it with function_exists('curl_init'), as some hosts don't install the extension...
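For example, a guarded GET that falls back to file_get_contents() when the cURL extension isn't available (the feed URL is a placeholder):

```php
<?php
function fetch_feed($url) {
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $body = curl_exec($ch);
        curl_close($ch);
        return $body;
    }
    // Fallback for hosts without the cURL extension.
    return file_get_contents($url);
}

$rss = fetch_feed('https://example.com/feed.rss'); // placeholder URL
```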
In addition to the uses suggested in the other answers, I find it quite useful for testing web-service calls, especially on *nix servers where I can't install other tools and want to test the connection to a third-party web service (ensuring network connectivity, firewall rules etc.) in advance of installing the actual application that will be communicating with it. That way, if there are problems, the usual response of "something must be wrong with your application" can be avoided, and I can focus on diagnosing the network or other issues that are preventing the connection from being made.
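For instance, something along these lines from the server's shell (the endpoint is a placeholder) already tells you whether DNS, the firewall and TLS are all fine before the application is ever installed:

```sh
# Placeholder URL; -v shows the DNS lookup, TCP connect and TLS handshake.
curl -v https://thirdparty.example.com/service/endpoint

# Or just check the HTTP status code:
curl -s -o /dev/null -w "%{http_code}\n" https://thirdparty.example.com/service/endpoint
```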
It certainly can simplify programs you need to write that require higher-level protocols for communication.
I do recall a contractor, however, attempting to use it in a high-load Apache web server module, and it was simply too heavyweight for that particular application.
I have a website where most of the traffic comes from the API (http://untiny.com/api/). I use Google Analytics to collect traffic data; however, the statistics do not include the API traffic, because I couldn't include the Google Analytics JavaScript code in the API pages (including it would affect the API results). Example: http://untiny.com/api/1.0/extract/?url=tinyurl.com/123
The solution might be executing the JavaScript using a JavaScript engine. I searched Stack Overflow and found JavaScript engines/interpreters for Java and C, but I couldn't find one for PHP except an old one, "J4P5": http://j4p5.sourceforge.net/index.php
The question: will using a JavaScript engine solve the problem? Or is there another way to include the API traffic in Google Analytics?
A simple problem with this in general is that any data you get could be very misleading.
A lot of the time it is probably other servers making calls to your server. When that is the case, the location of the server in no way represents the location of the people using it, the user agent will be fake, and you can't tell how many different individuals are actually using the service. There are no referrers, and if there are, they're probably fake... etc. Not many stats are useful at all in this case.
Perhaps make a PHP back end that logs the IP and other header information; that's really all you can do. You'll at least be able to track total calls to the API and where they're made from (although, again, probably from servers, but you can tell which servers).
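A minimal sketch of that kind of logging, with made-up table and column names:

```php
<?php
// Log one row per API call; the table and columns are illustrative only.
$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');

$stmt = $pdo->prepare(
    'INSERT INTO api_calls (ip, user_agent, request_uri, referrer, called_at)
     VALUES (?, ?, ?, ?, NOW())'
);
$stmt->execute(array(
    $_SERVER['REMOTE_ADDR'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
    $_SERVER['REQUEST_URI'],
    isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '',
));
```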
I spent ages researching this and finally found an open source project that seems perfect, though totally under the radar.
http://code.google.com/p/serversidegoogleanalytics/
Will report back on results.
You would likely have to emulate all HTTP calls on the server side with whatever programming language you are using... This will not give you information on who is using it, though, unless untiny is providing client info through some kind of header.
If you want to include it purely for statistical purposes, you could try using curl (if using PHP) to request the tracking GIF yourself whenever you detect an API call on the server side:
http://code.google.com/apis/analytics/docs/tracking/gaTrackingTroubleshooting.html#gifParameters
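Roughly, that means firing the tracking GIF from PHP. The parameter names below come from the gif-parameters page linked above, but the exact set you need should be checked against that documentation; the account ID and page path are placeholders:

```php
<?php
// Server-side hit against the (legacy) __utm.gif endpoint.
// UA-XXXXX-Y and the page path are placeholders; see the linked
// gaTrackingTroubleshooting page for the full parameter list.
$params = array(
    'utmwv' => '4.4sh',                 // tracking code version
    'utmn'  => (string) mt_rand(),      // random request id to prevent caching
    'utmhn' => 'untiny.com',            // host name
    'utmp'  => '/api/1.0/extract/',     // "page" being tracked
    'utmac' => 'UA-XXXXX-Y',            // Analytics account (placeholder)
    'utmcc' => '__utma=999.999.999.999.999.1;', // minimal cookie string
);

$ch = curl_init('http://www.google-analytics.com/__utm.gif?' . http_build_query($params));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : 'api-client');
curl_exec($ch);
curl_close($ch);
```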
You can't easily do this, as the JavaScript-based Google Analytics script will not be run by the end user (unless, of course, they are including your API output exactly as-is in what they display to the end user, which would negate the need for a fully fledged API [you could just offer an iframeable snippet], pose possible security risks, and possibly fall foul of browser cross-domain JavaScript checks).
Your best solution would be either to use server-side analytics (such as Apache or IIS server logs with Analog, Webalizer or AWStats) or, since the most information you would get from an API call is the user agent, request and IP address, to just log that information in a database when the API is called.