I have a website that's essentially a people directory.
Each person has a profile page. I want to enable other webmasters to take a snippet of code, paste it on their websites, and have it pull information from my page and format it in my brand colours etc., with a link back to my website. Is this possible, or is an iframe the only way?
You can do it in an iframe, but that comes with shortcomings. Alternatively, you can use jQuery/JavaScript to load the content from your site inside a div or some other container on the remote site, but then you'd face cross-domain issues due to the same-origin policy.
So you have to explicitly configure your app to allow the origins you prefer. You can do that using JSONP or CORS; JSONP only supports GET requests, while CORS is the more appropriate approach and allows any type of request.
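As a minimal sketch of the server side, assuming a hypothetical profile-embed.php endpoint on the directory site (the origin, names, and data below are placeholders), the CORS setup looks roughly like this; the partner page can then fetch the JSON with $.getJSON() and render it in your brand colours with a link back:

<?php
// profile-embed.php -- hypothetical JSON endpoint on the directory site.
// The CORS header tells browsers to permit cross-domain XHR/jQuery calls
// from the listed partner origin (use * to allow any origin).
header('Access-Control-Allow-Origin: https://partner-site.example');
header('Content-Type: application/json; charset=utf-8');

// Look up the requested profile (stubbed here for the sketch).
$profile = array(
    'name' => 'Jane Doe',
    'role' => 'Example Person',
    'url'  => 'https://your-directory.example/people/jane-doe',
);

echo json_encode($profile);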
Context
I'm running a site over https where new content (each entry has its own page) can be created and shared by users.
Each page has an image, and this image URL is present in the og:image meta tag at the top of the page.
Problem
Facebook seems slow to pick up on the og:image. When the page is first created and a user attempts to share the URL, for the first ~1-3 tries, the og:image is not scraped / rendered by Facebook (the title and description are). Afterwards, the image is clearly visible in the share dialog.
A similar issue also occurs when using Facebook's OG URL debug tool. The first time I pop in the URL, it shows no image. If I choose to fetch the page from the source again, it shows the image.
Additional Notes
At first, I thought it might have been site code initially not showing the image, but I sent a curl request and spoofed one of Facebook's user agent strings (this is important for accessing the page), and the resulting HTML contains the og:image tag with the correct image URL. I also know it's not anything to do with accessing the page, or the og:title and og:description data would not be showing (but they are).
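(The same check can be reproduced with PHP's cURL bindings; the page URL below is a placeholder, and the user agent is one of Facebook's published crawler strings.)

<?php
// Fetch the page the way Facebook's scraper would, with a spoofed
// crawler user agent, then inspect the HTML for the og:image tag.
$ch = curl_init('https://example.com/entry/123'); // placeholder URL
curl_setopt($ch, CURLOPT_USERAGENT,
    'facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
// $html should contain: <meta property="og:image" content="...">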
My only lead is that it could be an SSL or HTTPS issue. I recently set up the SSL certificate but I'm not sure why that would cause a delay over it not working at all.
For the sake of clarity, the site runs on WordPress on top of a standard LAMP stack.
The issue is apparently a fairly common one. The solution was, on content creation, to send a request to Facebook's scraper tool with the content's URL. The scraper picks up and processes the image, so the first real share already has that image cached by Facebook.
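A rough sketch of that pre-scrape request in PHP (the content URL is a placeholder; note that newer Graph API versions may also require an access token for this call):

<?php
// On content creation, ask Facebook's scraper to fetch the new URL so
// the og:image is already cached before the first real share.
$pageUrl = 'https://example.com/entry/123'; // newly created content URL

$ch = curl_init('https://graph.facebook.com/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'id'     => $pageUrl,
    'scrape' => 'true',
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);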
Yes, I've noticed this as well. It takes a long time for Facebook to cache the og:image; Tumblr does it automatically. The only reason I can imagine Facebook does this, other than poor programming, is that perhaps they have a review team scrolling through the thumbnails to block nudity and other crude images. As mentioned above, hitting the Facebook share URL manually upon creation will prompt them to cache it, hopefully before others click it too.
I analyzed this issue a year ago and had the same problem: the og:image meta tag was only picked up after several re-scrape attempts.
This re-scraping can be easily triggered on this page https://developers.facebook.com/tools/debug/
According to my old analysis, the root cause of this behavior is that the FB scraper seems to have a very short timeout. If the content page does not reply to the scraper's request quickly enough, FB doesn't take the reply into account: even if the content page serves the correct meta data and a valid HTTP 200 response, FB ignores it because "too late is too late".
I didn't find any solution for this besides the "pre-scraping" already described by Sean.
In my case I had an Azure Web App with HTTPS set up but no SSL certificate installed. Since it was in the production stage, I tested by reverting to HTTP, and all og tags were detected.
So if your SSL is not properly configured and/or Facebook reports a cURL SSL error, looking into your SSL setup might help.
I have a jQuery script in a clientDomain.com/show.php page that shows some data loaded from a serverDomain.com/echo.php page using jQuery.getJSON(). I want only allowed domains to be able to show the data, because I don't want unauthorized people installing the script on their own websites without my permission. Is there any way to restrict the response of jQuery.getJSON() to certain domains only? The solution should also prevent the client from using an iframe. In short, the data should be visible only to someone who visits the serverDomain.com/echo.php page directly or one of the allowed client domains. Thanks in advance for the support!
My request/response script works like the first example in jQuery.getJSON() | jQuery API Documentation
I can only code the client jQuery script (which will be distributed to the allowed domains) and the serverDomain.com/echo.php page (which is my property).
Don't do that. Use auth tokens instead, rotated regularly. Anybody can fake an HTTP referrer.
Here's a good answer on SO which covers RESTful API authentication: REST API Token-based Authentication
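To illustrate, a minimal sketch of what token checking in echo.php could look like (the token store and values here are purely illustrative; in practice, issue short-lived tokens per client and rotate them):

<?php
// echo.php -- reject requests that don't carry a known token.
$validTokens = array(
    'a1b2c3d4' => 'alloweddomain.com', // illustrative token => client
);

$token = isset($_GET['token']) ? $_GET['token'] : '';
if (!isset($validTokens[$token])) {
    http_response_code(403);
    exit('Forbidden');
}

header('Content-Type: application/json');
echo json_encode(array('data' => 'only for authorized clients'));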
I apologize for the title; I'm not quite sure how to word my question with brevity.
Say I make a form in knockout.js and someone else wants to use it. Instead of copying the knockout.js source file, the JavaScript containing the viewmodel, and even the HTML for the form, is it possible to use JavaScript (or something else) to call my server, which would then return all of the necessary code? I guess what I'm asking is: can I host these files and, to make it simple on their side, have them paste just a few lines that pull in everything needed to use the knockout form?
I've noticed a lot of sites have features that call into their API via JavaScript and then something gets returned (seemingly), such as Twitter and Facebook.
How about Twitter, for example?
<a href="https://twitter.com/share" class="twitter-share-button">Tweet</a>
<script>!function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0],p=/^http:/.test(d.location)?'http':'https';if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src=p+'://platform.twitter.com/widgets.js';fjs.parentNode.insertBefore(js,fjs);}}(document, 'script', 'twitter-wjs');</script>
Is this line js.src=p+'://platform.twitter.com/widgets.js' simply including the JavaScript file from Twitter's servers necessary for a tweet button, with the rest of the code just tying into their API to let it know what to return?
Is it similar to using google for jQuery?
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js" ></script>
Edit: I see this is dealing with the same-origin policy. How can my site load jQuery.js when it's hosted on Google's server and not mine?
Ordinarily, you can't read data from another domain with XMLHttpRequest because of the Same Origin Policy. Plain <script src> includes are exempt from that policy, which is why loading jQuery from Google's CDN works, and why Twitter's loader snippet can pull widgets.js from their servers.
However, for data requests you can set up CORS (Cross Origin Resource Sharing) to allow sharing; how you set it up depends on your web server software. Setting this up (per file) allows the file to be shared with specified (or all) domains.
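For example, on a LAMP-style setup, one way to apply it per file is to serve the resource through a small PHP wrapper that adds the header (the file name and path here are hypothetical):

<?php
// cors-file.php -- hypothetical wrapper that serves one file with a
// CORS header, allowing the named origin (or *) to share it.
header('Access-Control-Allow-Origin: *'); // or a specific domain
header('Content-Type: application/javascript');
readfile(__DIR__ . '/widget.js'); // hypothetical shared script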
Excuse me if the title is plain idiotic with respect to the contents.
We were debating a model for an interaction-heavy site in which there will be
site.com
api.site.com
on the same server. site.com is powered by PHP, and api.site.com will be powered by an alternative web framework. (The two domains may be answered by the same server or by different ones.)
The rendered site makes AJAX calls to api.site.com.
Securing this would be easy if the application were 'all PHP': the session feature can prevent HTTP requests that allow:
a stranger who is not logged in from accessing a user's data
a legitimately logged-in user from requesting another user's data
Question 1: How do you secure the internal API so that we can be sure about the legitimacy of each request?
I have googled AJAX and the same-origin policy, but I didn't get far with them.
I am thinking of randomly generated 'tokens' that will be acknowledged by both domains.
Question 2: Is there a specific name for this model?
You should take a look at JSONP. jQuery has a good example on it: http://api.jquery.com/jQuery.getJSON/
You need to add jsoncallback=? to the URL to make it work.
$.getJSON("http://api.flickr.com/services/feeds/photos_public.gne?jsoncallback=?"
With this, you can avoid the Same origin Policy
The jsoncallback will be a timestamp, which should be echo-ed by the PHP script which outputs the JSON like this:
jsonp1277656587731(/* rest of the JSON here */);
The number here is, of course, the randomly generated string (a timestamp, in the case of jQuery's JSONP).
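Putting that together, the PHP side could look roughly like this (a sketch; the payload is a placeholder, and the whitelist check just guards against injected callback names):

<?php
// echo.php -- wrap the JSON payload in the callback name that jQuery
// generated for jsoncallback=? (e.g. jsonp1277656587731).
$callback = isset($_GET['jsoncallback']) ? $_GET['jsoncallback'] : 'callback';
if (!preg_match('/^[A-Za-z0-9_]+$/', $callback)) {
    http_response_code(400);
    exit;
}
header('Content-Type: application/javascript');
echo $callback . '(' . json_encode(array('status' => 'ok')) . ');';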
I'm building an application, and I'd like to incorporate some stat tracking for each of the pages created. However, these pages are simply redirect pages using header() to different places, depending on what conditions have been met.
Rather than build my own stat-tracking platform and incorporate it within PHP, I'd rather send traffic data to the Google Analytics platform. However, as the page exits via a header() redirect, I cannot print the normal JavaScript code.
Is there any way I can still have the page and query-string traffic data sent to Google Analytics without using the standard script?
The user's browser must make a request to Google's 1-pixel "tracking GIF". Google has a solution for mobile web sites, where JavaScript is not available; you can see it in the tracking-code section of your Google Analytics settings pages, and it is written in PHP. However, this pure-PHP solution just inserts an <img> tag into the output. It won't work for you, since you're just making a redirect with HTTP headers.
So, there could be two solutions:
Make the redirect via a META refresh tag. That way, you'll be able to track the redirect with either JavaScript or PHP-based analytics code (see the sketch after this list).
Try to fetch that 1x1 GIF from Google server-side. However, this will break a lot of things in your Analytics: the originating IP will be wrong, so all demographics will be wrong; you won't be able to pass cookies; etc. It will be most rudimentary tracking at best.
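A sketch of the first option, assuming a PHP page (the destination URL is a placeholder); the standard Google Analytics JavaScript can then sit in the head and fire before the refresh:

<?php
// redirect.php -- emit a META refresh instead of a Location: header,
// so the normal Google Analytics snippet can run before the redirect.
$destination = 'https://destination.example/'; // placeholder target
?>
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="refresh" content="0;url=<?php echo htmlspecialchars($destination); ?>">
<!-- standard Google Analytics snippet goes here -->
</head>
<body></body>
</html>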
There's also a PHP class for this, the "Server-Side Google Analytics PHP Client":
https://code.google.com/p/php-ga/
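If you go the second, server-side route yourself, here's a very rough sketch against the classic __utm.gif endpoint (the parameter names follow Google's old non-JavaScript mobile snippet; the host name, account ID, and destination URL are placeholders, and without the utmcc cookie parameter this stays rudimentary, as noted above):

<?php
// Fire a bare-bones hit at the classic tracking GIF before redirecting.
$params = http_build_query(array(
    'utmwv' => '4.4sp',                 // tracker version string
    'utmn'  => mt_rand(0, 0x7fffffff),  // cache-busting random number
    'utmhn' => 'example.com',           // placeholder host name
    'utmp'  => $_SERVER['REQUEST_URI'], // page path being tracked
    'utmac' => 'MO-XXXXXXX-X',          // placeholder GA account ID
));
@file_get_contents('http://www.google-analytics.com/__utm.gif?' . $params);

header('Location: https://destination.example/'); // placeholder target
exit;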