SSL is installed correctly on my VPS. I want to use SSL on some pages of my website, and every form on these pages starts with "https://", too. But browsers don't accept it.
What are the possible reasons?
There may be a number of reasons. The last time this happened on my site, I was using an iframe with external content and a Flash widget loaded via an external JavaScript. Both were accessed via HTTP, which compromised my site's trustworthiness.
So check all your external content: JavaScript, widgets, iframes, images, stylesheets... You may be loading some of it via HTTP, which in turn can make Chrome claim the SSL certificate has a problem.
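For instance, a widget included like this on an HTTPS page is classic mixed content (example.com stands in for whichever host serves the widget):

<!-- Loaded over plain HTTP even though the page is HTTPS: mixed content -->
<script src="http://example.com/widget.js"></script>
<!-- Load it over HTTPS instead, if the host supports it -->
<script src="https://example.com/widget.js"></script>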
I would first try checking it with something like this: http://www.sslshopper.com/ssl-checker.html
You might also try running curl --verbose https://yourlink.com from the console to get a detailed printout of where the hiccups are.
Authorize.net consumes my response_url, which is served over HTTP, into their HTTPS-hosted DLL. How can I specify that their DLL should be served over HTTP, so that my CSS and JS files get pulled in correctly?
I don't have a way of getting access to an SSL host at the moment.
Edit: First, we send the user from HTTP to their HTTPS-hosted form, on their server. Then their server consumes our HTTP page and displays it in their HTTPS response DLL.
I only want their response DLL to be served over HTTP. I don't see a security issue with that, and I imagine there is a way to do it, as their service offering is meant for people without SSL enabled.
Edit2: I'm using their Simple Checkout API.
Answering my own question, based on advice to contact tech support directly. I did, and their response was:
"We can only relay the html content. We do not offer to relay other content such as images. It is necessary that any images you want to include are hosted using https."
Edit --
I attempted to include a single logo image as a base64 string inside a static HTML page. That failed with a "script timeout". I speculate this is due to the size of the HTML file after embedding that string.
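For reference, a minimal sketch of the data-URI approach I tried (logo.png is a stand-in for the actual image file):

<?php
// Embed the image directly in the markup as a base64 data URI,
// so the page needs no separate image request at all.
$data = base64_encode(file_get_contents('logo.png'));
echo '<img src="data:image/png;base64,' . $data . '" alt="Logo">';
?>

Base64 inflates the data by roughly a third, so the resulting HTML gets big quickly, which is presumably what tripped their script timeout.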
This is a total failure on the part of Authorize.Net.
I have a URL where I host my webinars, which is provided by the webinar hosting provider. I would like to change that URL to something within my domain.
For example, the webinar URL is something like
http://www.onlinemeetingnow.com/seminar/?id=d181a7640e
and I would like it to look like something within my domain, e.g.
www.mywebsite.com/webinar
Is this possible?
The simplest way of doing this would be to create a PHP script at the desired URL that simply does a readfile() of the target URL. That would give the appearance that your site is hosting the remotely hosted content.
<?php
// Fetch the remote page and send it straight through to the client.
readfile('http://www.onlinemeetingnow.com/seminar/?id=d181a7640e');
?>
This approach does require allow_url_fopen to be enabled, which it might not be for security reasons. It also has issues with things like cookies. Say you use this trick to link to a remote site that requires a login and uses cookies to implement it: people who are logged into the remote site would appear not to be logged in, because their cookie wouldn't be sent to the remote site when you readfile() it.
You could use cURL instead, as it gives you a bit more control and doesn't require allow_url_fopen. It still wouldn't be ideal, though.
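A minimal sketch of the cURL variant, using the webinar URL from the question (error handling kept to a bare minimum):

<?php
// Fetch the remote page with cURL and relay it to the client.
$ch = curl_init('http://www.onlinemeetingnow.com/seminar/?id=d181a7640e');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
$body = curl_exec($ch);
if ($body === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit('Could not reach the remote site: ' . curl_error($ch));
}
curl_close($ch);
echo $body;
?>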
If you can configure your server, you could possibly use something like ProxyPass or URL rewriting to hide the remote URL.
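For example, on Apache with mod_rewrite and mod_proxy_http enabled, something along these lines (a hypothetical mapping; adjust the path and URL to your setup) would serve the remote page under your own /webinar URL:

# Proxy /webinar to the remote webinar page (needs mod_rewrite + mod_proxy_http)
RewriteEngine On
RewriteRule ^/?webinar$ http://www.onlinemeetingnow.com/seminar/?id=d181a7640e [P,L]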
Other solutions include using an iframe to display the remote site, or using AJAX to load the remote page's markup and inject it into your page, but these approaches have their own sets of issues that you need to take into account.
In the end, is it really worth the effort and the compromises you will have to make, just to have the URL appear to be locally hosted when it isn't?
Maybe you want to create that page (or pages) on your own site, and within that page load the onlinemeetingnow URL. This can be done with an iframe or the like, or you can get the HTML from the page (with cURL or something) and then load that into your own page.
I'm developing a project using JavaScript, PHP and OpenLayers. A lot of maps are loaded using an HTTPS connection to an external OGC server.
When I try to load the maps over HTTPS, they don't load; instead, I get an "Error loading the map, try again later" message.
I think the problem is caused by the digital certificate. If I load a map directly from the server (using a WMS call) like this (note the last parameter):
https://serverurl/ogc/wms?service=WMS&version=1.1.0&request=GetMap&layers=ms1:lp_anual_250&styles=&bbox=205125.0,3150125.0,234875.0,3199875.0&width=306&height=512&srs=EPSG:4326&format=application/openlayers
the browser asks for my authorization to see it. If I accept the digital certificate, I can see the map. After that, because my browser now accepts the certificate, I can also see the map from my own application.
So, the question is: is there any way to ask for the digital certificate manually when the user accesses my site?
Thanks in advance!
PS: Solutions using PHP are welcome too, because I'm using CodeIgniter to load views.
You could try opening the WMS URL in a div or perhaps a hidden iframe; that may cause the browser to pop up its 'unknown certificate' dialogue.
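A sketch of that idea (the URL is the placeholder endpoint from the question; whether a browser actually prompts for a hidden frame varies, so treat this as an experiment):

<!-- Hidden iframe pointed at the HTTPS WMS endpoint, to force the browser
     to evaluate the server's certificate before the maps are requested -->
<iframe src="https://serverurl/ogc/wms" style="display:none"></iframe>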
I'm going to quote another user (geographika) from gis.stackexchange; I hope it can help someone with the same issue:
You can use a proxy on your server so all client requests are made to your server, which deals with the certificate, gets the request and passes it back to the client. For PHP, have a look at http://tr.php.net/manual/en/function.openssl-verify.php

If you are also using WMS software (MapServer, GeoServer) you could implement the same technique using a cascading WMS server. For details on how to do this in MapServer, see http://geographika.co.uk/setting-up-a-secure-cascading-wms-on-mapserver
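A minimal sketch of such a PHP proxy (the endpoint URL is the placeholder from the question, and passing the raw query string through is an assumption; in production you would whitelist the WMS parameters):

<?php
// The browser requests this script from our own server;
// the script fetches the map from the external OGC server and relays it.
$wmsBase = 'https://serverurl/ogc/wms'; // external WMS endpoint (placeholder)
$url = $wmsBase . '?' . $_SERVER['QUERY_STRING'];

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
// For a self-signed certificate, point CURLOPT_CAINFO at the server's cert file.
$image = curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

header('Content-Type: ' . ($type ?: 'image/png'));
echo $image;
?>

The certificate is now dealt with once, on the server, so the user's browser never sees it.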
I know the question regarding PHP web page scrapers has been asked time and time again, and through those questions I discovered SimpleHTMLDOM. After working seamlessly on my local server, I uploaded everything to my online server, only to find out something wasn't working right. A quick look at the FAQ led me to this. I'm currently using a free hosting service, so I can't edit any php.ini settings. Following the FAQ's suggestion, I tried using cURL, only to find out that this too is turned off by my hosting service. Are there any other simple solutions to scrape the contents of another web page without the use of cURL or SimpleHTMLDOM?
If cURL and allow_url_fopen are not enabled, you can try to fetch the content via
fsockopen — Open Internet or Unix domain socket connection
In other words, you have to make HTTP requests manually. See the example in the manual for how to do a GET request. The returned content can then be processed further. If sockets are enabled, you can also use any third-party lib utilizing them, for instance Zend_Http_Client.
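A sketch of such a manual GET request, close to the example in the manual (www.example.com is a placeholder):

<?php
// Open a plain TCP connection and speak HTTP by hand.
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 30);
if (!$fp) {
    exit("$errstr ($errno)");
}
$out = "GET / HTTP/1.1\r\n";
$out .= "Host: www.example.com\r\n";
$out .= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
$response = '';
while (!feof($fp)) {
    $response .= fgets($fp, 128);
}
fclose($fp);
// Split the headers from the body before further processing.
list($headers, $body) = explode("\r\n\r\n", $response, 2);
?>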
On a side note, check out Best Methods to Parse HTML for alternatives to SimpleHTMLDOM.
cURL is a specialty API. It's not the HTTP library it's often made out to be, but a generic data transfer library for FTP, SFTP, SCP, HTTP PUT, SMTP, TELNET, etc. If you want to use just HTTP, there is a corresponding PEAR library for that. Or check whether your PHP version has the official http extension enabled.
For scraping, try phpQuery or QueryPath. Both come with built-in HTTP support.
Here's a simple way to grab images when allow_url_fopen is set to false, without studying up on esoteric tools.
Create a web page in your dev environment that loads all the images you're scraping. You can then use your browser to save the images: File -> "Save Page As".
This is handy if you need a one-time solution for downloading a bunch of images from a remote server when your host has allow_url_fopen set to 0.
This worked for me after file_get_contents and curl failed.
file_get_contents() is the simplest method to grab a page without installing extra libraries (though for remote URLs it needs allow_url_fopen to be enabled).
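For completeness, a minimal sketch (www.example.com is a placeholder):

<?php
// One call fetches the whole page into a string (needs allow_url_fopen).
$html = file_get_contents('http://www.example.com/');
if ($html === false) {
    exit('Fetch failed');
}
?>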
Suddenly, my images are not showing up on my site when accessing HTTPS pages. No change in my code. My host did have to recompile their FTP service with SSL support after my request (so I could use FTPES for my site). I can't think of anything else that would affect my SSL cert. The same thing happens in FF and IE, and on different computers.
If I go to your website ( https://www.scfootball.org/ ), I don't see the images, as you said: I get a 403 (Forbidden) error for each of them. I can see this using the "Net" tab of the Firefox extension Firebug, for instance.
If I try to view an image directly, without going through the site (for instance: https://www.scfootball.org/widgets/GulloParkHeader.png ), then I can see the image.
(If you try this, make sure you copy-paste the URL into a new tab/window, and don't just click on the link.)
Which means there is some kind of trouble between the website and the access to the images, not with the images themselves.
If I disable the Referer header in Firefox (the Web Developer toolbar extension makes that easy) and refresh your website's page, the images appear. If I re-enable the Referer and refresh again, the images don't appear anymore. Which means there is something, related to the Referer, that prevents the images from being sent and returns a 403 error instead.
Just a wild guess: maybe there's a .htaccess in your widgets directory (or somewhere else) that prevents images from being served if the Referer doesn't correspond to a specific domain?
Considering the images are displayed when I access the site without HTTPS (i.e. with a URL such as http://www.scfootball.org/index2.php ), maybe there is a "protection" in place so your images are not displayed if the Referer is not that non-HTTPS website... and that "protection" was not updated when you switched to HTTPS?
(I've seen that kind of "protection" used to prevent hot-linking of images, for instance)
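A hypothetical .htaccess along these lines would produce exactly the behaviour described, because the conditions only whitelist the http:// form of the site:

RewriteEngine On
# Allow empty referers, and referers from the http:// site only
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?scfootball\.org/ [NC]
RewriteRule \.(png|jpe?g|gif)$ - [F]
# The fix would be a matching RewriteCond for https://(www\.)?scfootball\.org/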
This is an old post, but it could be hotlink protection too. If you have it turned on and allow an alias for the http:// domain, you have to allow its https:// version too; otherwise it will deny your images when they are requested from HTTPS pages.
I used the Django API and tested locally; everything was normal. After deploying SSL, the API itself is still accessible, but the images return 404.
Yeah, "not showing up" is a little vague. If the HTML is served by HTTPS and the images are still being served by HTTP, there's a little security leak inherent in the page, which your browser may deal with in one of several, largely ineffectual, ways.