I'm using the Twitter API PHP wrapper to load content from the Twitter API onto my secure (HTTPS) WordPress site, but I'm getting mixed content warnings in the Chrome console because the data it returns isn't loaded securely.
Is this something I can safely ignore or is there a way to load secure content through the Twitter API?
The Twitter API JavaScript files, or possibly an image, are being referenced via http://. Check all the files you pull from the Twitter domain and change the paths to https://. If you post the code you are using, I can help further.
If changing to https doesn't help, press F12 to open the developer tools. Any SSL/mixed-content errors will be listed in the console, which will help you find the files that are being referenced over http instead of https.
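For example, if the insecure resources turn out to be profile images returned by the API, a minimal sketch of the fix (assuming a decoded v1.1 tweet object; $tweet is a hypothetical variable name, and field availability depends on your wrapper) would be to prefer the https fields, or rewrite the scheme before output:

<?php
// $tweet is assumed to be one decoded tweet object from your wrapper (hypothetical variable).
$user = $tweet->user;

// Prefer the https field when the API provides it; otherwise rewrite the scheme.
$avatar = isset($user->profile_image_url_https)
    ? $user->profile_image_url_https
    : preg_replace('#^http://#', 'https://', $user->profile_image_url);

// Only ever echo https:// URLs so the page stays free of mixed content.
echo '<img src="' . htmlspecialchars($avatar, ENT_QUOTES) . '" alt="avatar">';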
I have been figuratively banging my head against a wall and literally ripping my hair out in frustration over this. This is the issue:
I need to tie my webpage to my name, i.e. when you search my name the page pops up. To do that I need to verify my webpage in Google Search Console.
All the methods for doing this seem to require entering code into the HTML of my front page.
That HTML cannot be edited from WordPress online (WordPress.com), so it seems I must install WordPress myself.
Installing WordPress requires FTP access, which needs an FTP username and password that WordPress.com does not want to give me.
So I am totally unable to achieve my original goal (connecting my webpage to my name). Is there a way to achieve this without having to download WordPress, which is turning into a huge pain in the ass? I use the free version of WordPress.com.
Actually, you can insert the Google Search Console verification meta tag on a WordPress.com-hosted site.
Go to settings > SEO
Direct URL: https://wordpress.com/settings/seo/
You can verify your webpage in Google Search Console using Google Analytics. It is one of the alternate methods for the verification. Google Analytics support on WordPress.com is available as a feature of the WordPress.com Business plan. Visit Settings → Analytics under My Sites to enable Google Analytics.
Another option would be to verify via DNS through your domain name provider. Detailed instructions on how to set up a CNAME record for this are available within Google Search Console. With this option, you do not need to edit the front page HTML either.
I am working on a scraping project to extract web data from a website. I made a script that goes through URLs, parses the HTML content, and saves the structured content into my database. The script was working fine, but recently it got stuck, and on investigation I found that the target site is blocking our IP.
I am using PHP/cURL for this project, and now I am getting a 403 Access Forbidden error on every web request.
This has broken the script: no pages can be retrieved, and every request comes back with an access-restriction error.
I know there is a lot of scraping etiquette to follow. Since we can't foresee how they have implemented their security features, I am unsure how to normalize my web request calls.
I'm working on an Amazon AWS instance with an Elastic IP, so I am also unsure when, or whether, they will lift the ban on my IP.
I have heard of rotating-proxy methods being used with scraping so that the target server won't block you as often, but I'm not sure about their implementation.
Any help would be highly appreciated. I can provide any additional information if necessary.
Sign up with a scraping-proxy service and get an API key.
If you send a request to the service with your API key and the target URL, it will forward the request to that URL from a random (rotating) IP address and return the response to you.
Just sign up and try it.
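To make that concrete, here is a minimal PHP/cURL sketch of the idea; the service hostname and parameter names below are placeholders (not a real provider), so check the documentation of whichever service you sign up with:

<?php
// Placeholder scraping-API endpoint and key - substitute the real values from your provider.
$apiKey    = 'YOUR_API_KEY';
$targetUrl = 'http://example.com/page-to-scrape';

// The service fetches $targetUrl from one of its rotating IPs and returns the body to you.
$requestUrl = 'https://api.example-scraper.com/?api_key=' . urlencode($apiKey)
            . '&url=' . urlencode($targetUrl);

$ch = curl_init($requestUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; MyScraper/1.0)');
curl_setopt($ch, CURLOPT_TIMEOUT, 30);

$html = curl_exec($ch);
if ($html === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);

// Be polite regardless of the proxy: pause between requests so you don't hammer the target site.
sleep(2);

The same approach works with a plain rotating proxy set via CURLOPT_PROXY instead of an HTTP API; either way, keep the request rate low.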
I'm working on using the Etsy API and have been trying to complete this online tutorial but haven't been able to load any of the data successfully:
http://www.onextrapixel.com/2012/10/01/custom-products-webpage-layout-via-the-etsy-api/
When I load the page, it creates the cache file but the page is blank.
This is my first venture into APIs and I'm not sure how to troubleshoot what the problem might be. It seems like all of the code for loading the data into the PHP webpage should work OK.
I've read about a few issues people have had using the Etsy API because of the JSON cross-domain policy, so I'm wondering if that might be the issue, or if some authentication is required.
I created a pastebin of the code from the tutorial here:
http://pastebin.com/RVDzjG4B
After checking the docs and the API, this is what I found:
API requests must be made over HTTPS.
Change your links to use https://.
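As a rough sketch of what that change looks like in tutorial-style PHP (the shop name, endpoint path, and API key below are illustrative placeholders, not values taken from your pastebin):

<?php
// Before: http://openapi.etsy.com/v2/...  (an insecure request, so the cached page ends up blank)
// After:  request the same resource over HTTPS.
$apiKey = 'YOUR_ETSY_API_KEY'; // placeholder
$url = 'https://openapi.etsy.com/v2/shops/YourShopName/listings/active?api_key=' . urlencode($apiKey);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);

if ($response === false) {
    // Surface cURL/SSL errors instead of silently caching an empty page.
    die('Request failed: ' . curl_error($ch));
}
curl_close($ch);

$data = json_decode($response);
var_dump($data); // inspect the decoded response while troubleshooting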
I'm developing a project using JavaScript, PHP and OpenLayers. A lot of maps are loaded over an HTTPS connection from an external OGC server.
When I try to load the maps over HTTPS, they don't load; instead I get an "Error loading the map, try again later" message.
I think the problem is the digital certificate. If I load a map directly from the server (using a WMS call) like this (look at the last parameter):
https://serverurl/ogc/wms?service=WMS&version=1.1.0&request=GetMap&layers=ms1:lp_anual_250&styles=&bbox=205125.0,3150125.0,234875.0,3199875.0&width=306&height=512&srs=EPSG:4326&format=application/openlayers
The browser asks for my authorization to view it. If I accept the digital certificate, I can see the map. After that, because my browser now accepts the certificate, I can also see the map from my own application.
So, the question is: is there any way to prompt the user for the digital certificate manually when they access my web application?
Thanks in advance!
PS: Solutions using PHP are welcome too, because I'm using CodeIgniter to load the views.
You could try opening the WMS URL in a div or perhaps a hidden iframe - that may cause the browser to pop up its 'Unknown cert' dialogue.
I'm going to quote another user (geographika) from gis.stackexchange. I hope it can help someone with the same issue:
You can use a proxy on your server so all client requests are made to your server, which deals with the certificate, gets the request and passes it back to the client. For PHP have a look at http://tr.php.net/manual/en/function.openssl-verify.php
If you are also using WMS software (MapServer, GeoServer) you could implement the same technique using a cascading WMS server. For details on how to do this in MapServer see http://geographika.co.uk/setting-up-a-secure-cascading-wms-on-mapserver
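A minimal sketch of that proxy idea in plain PHP (the file name proxy.php and the CA bundle path are assumptions; in CodeIgniter the same logic would live in a controller method): the script fetches the WMS image server-side, where the certificate can be validated against a CA bundle, and streams it back to the browser from your own trusted origin.

<?php
// proxy.php (hypothetical name) - forwards WMS GetMap requests through your own server.
$wmsBase = 'https://serverurl/ogc/wms';  // the external OGC server from the question
$query   = $_SERVER['QUERY_STRING'];     // pass the WMS parameters straight through

$ch = curl_init($wmsBase . '?' . $query);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Validate the server certificate against a CA bundle you trust
// (add the OGC server's CA to this file if it uses a private CA).
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_CAINFO, '/path/to/cacert.pem');

$image = curl_exec($ch);
if ($image === false) {
    http_response_code(502);
    exit('WMS request failed: ' . curl_error($ch));
}
$contentType = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

// Return the map image to the client from your own (already trusted) HTTPS origin.
header('Content-Type: ' . $contentType);
echo $image;

OpenLayers would then point its WMS layer URL at proxy.php instead of the external server, so the browser never has to accept the external certificate itself.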
I have just installed an SSL certificate on the website. However, when I open the webpage, Google Chrome does not show the green padlock, and when I look at the console it says there is mixed content causing the trouble. It is jQuery that tries to load an image over http, even though I have already changed the script to load the secure (https) version. Please see the attached image.
I have never seen anything like this before. Has anyone ever had this problem? Please advise.
Thank you