I've opened a call (well, several calls, actually) with the Facebook support team about this, but never got a logical answer.
The issue is simple: pictures disappear from a Facebook page after some time.
The problem is on one page, namely Discovery Publisher. Towards the bottom of the page (i.e. posts dated 2013 and earlier), some of the pictures don't display anymore. They displayed fine after upload, but stopped being displayed after a while.
We created these posts. Each picture was uploaded individually from a personal computer.
An initial investigation shows that these pictures are located on one of Facebook's servers (see screenshots below). So we logged a call with the support team. Their conclusion is that the issue resides on our side, i.e. on our website.
This does not seem to be the case, however, because (1) it worked before and we did not change anything in the meantime; (2) the original pictures were never located on our server; and (3) the initial investigation shows that the pictures are actually stored on one of Facebook's servers, and not on our website.
Take, for instance, this post, whose picture isn't displayed on Facebook anymore:
The link of the post is here, which is working fine.
The picture loaded at the time was this one:
It was loaded from a computer and not pulled from our website.
According to the screenshot below, the image is located on one of Facebook's servers:
which is here, but opening it in a browser triggers the following error: "An error occurred while processing your request. Reference #50.3c0edd58.1448968164.3378a460"
However, if I now do the same with an image that is displayed correctly on our Facebook page:
The image URL, according to this screenshot:
is this one, which displays just fine right now.
So, where is the issue: on our page, on our server (i.e. Discovery Publisher), or on Facebook's server?
Below is the answer given by Facebook; can anyone make sense of it in this context?
Firstly, the url http://www.discoverypublisher.com/publication/james-hilton-lost-horizon/ does not contain any Open Graph tags. These are essential if your client wants the correct image, title, description to appear when they share on Facebook.
Please refer to the following links: the Webmasters documentation [1] and the URL debugger [2].
Here you will be able to see exactly what our crawler sees and what will display when your url is shared to Facebook.
https://developers.facebook.com/docs/sharing/webmasters
https://developers.facebook.com/tools/debug/
Images are cached asynchronously, so the image may not render the first time someone shares content. This sounds like the blank image problem that you are experiencing. This can be avoided by following the instructions in the caching images documentation [3].
https://developers.facebook.com/docs/sharing/webmasters/optimizing#cachingimages
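For reference, the Open Graph tags that the reply above refers to are just meta tags in the page's head. A minimal sketch might look like this; the type, title, description, and image URL below are illustrative placeholders, not the page's actual values:

<meta property="og:url" content="http://www.discoverypublisher.com/publication/james-hilton-lost-horizon/" />
<meta property="og:type" content="article" />
<meta property="og:title" content="Lost Horizon, by James Hilton" />
<meta property="og:description" content="A one- or two-sentence description of the publication." />
<meta property="og:image" content="http://www.discoverypublisher.com/images/lost-horizon-cover.jpg" />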
Following this, the call was closed, and any attempt to re-open it was immediately followed by the call being closed again.
The fact that the server responds telling you that
An error occurred while processing your request.
and even gives you a unique code for it shows that it's a problem with the Facebook servers. It could be that the servers are (and have been) having problems over the last few weeks/days/hours, but as said in a comment, it's more likely a migration problem.
I have written a PHP-based blog for the company I work for, not using any frameworks. I am having trouble tracking users who come from my Facebook page's posts to my blog (not WordPress).
I have created a shortlink URL. Let's say it is sample.co, and it redirects traffic to sample.com. Everything seems fine up to this point. The problem starts here.
I am logging every user's IP and user agent. But even if I get 500 visits, my code adds something like 3,000 visits. Facebook stats and Analytics show similar numbers (~500 visits). I see that the IPs added to MySQL are all different. It usually happens with Android users. I have read somewhere that Facebook sometimes renders the actual URL to its users when it shows the post; I mean that instead of the widget, Facebook loads the whole page. I am not quite sure about that, to be honest.
To solve this problem, I created and added a jQuery script to my page that listens for the users' scroll event. It worked great; I'm no longer seeing too much traffic. But this time the problem is that I am counting fewer users. Even when I get 500 users from Facebook and Analytics shows similar results, my script adds only 200-300 to MySQL.
Does anyone know a better way to track real traffic? Or are you aware of such a problem?
Thanks
It should be filtered on the basis of the user agent.
https://developers.facebook.com/docs/sharing/webmasters/crawler
how to detect search engine bots with php?
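A minimal PHP sketch of that user-agent filter (the two strings checked are the user agents Facebook documents for its crawler, facebookexternalhit and Facebot; adjust the list if the documentation changes):

<?php
// Don't count hits from Facebook's crawler as real visitors.
function is_facebook_crawler($userAgent) {
    return stripos($userAgent, 'facebookexternalhit') !== false
        || stripos($userAgent, 'Facebot') !== false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (!is_facebook_crawler($ua)) {
    // Record the visit in MySQL as you already do.
}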
Identifying users through their IP is a good idea, but if the IP keeps changing, it's better to use cookies.
http://php.net/manual/en/function.uniqid.php
If the cookie does not exist, you should treat it as a new user.
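A rough sketch of that cookie approach in PHP (the cookie name and lifetime are arbitrary choices, and setcookie() has to run before any output is sent):

<?php
// Identify a visitor with a cookie instead of relying on the IP alone.
$cookieName = 'visitor_id';

if (empty($_COOKIE[$cookieName])) {
    // No cookie yet: treat this as a new user and count the visit.
    $visitorId = uniqid('', true);
    setcookie($cookieName, $visitorId, time() + 60 * 60 * 24 * 365, '/');
    // Insert the new visit into MySQL here.
} else {
    // Cookie already present: a returning user, don't count a new visit.
    $visitorId = $_COOKIE[$cookieName];
}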
I have found the answer. The problem is called preview (prefetch). Here is the link:
https://www.facebook.com/business/help/1514372351922333
Simply put, Facebook preloads everything when it shows the thumbnail to the visitor, in order to speed up your page's load time. These requests send an X-Purpose: preview header, so you can simply check whether the HTTP_X_PURPOSE header's value is preview or not. If it is, do not count the request as a visitor.
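In PHP, that check can be as small as this (HTTP_X_PURPOSE is how PHP exposes the X-Purpose request header; comparing case-insensitively is just a precaution):

<?php
// Facebook's prefetch requests carry an "X-Purpose: preview" header.
$purpose = isset($_SERVER['HTTP_X_PURPOSE']) ? $_SERVER['HTTP_X_PURPOSE'] : '';

if (strcasecmp($purpose, 'preview') === 0) {
    // Prefetch by Facebook: serve the page, but skip the visit counter.
} else {
    // Normal request: count the visit as usual.
}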
Here are more detailed descriptions:
http://inchoo.net/dev-talk/mitigating-facebook-x-fb-http-engine-liger/
http://inchoo.net/dev-talk/magento-website-hammering-facebook-liger/
Context
I'm running a site over https where new content (each entry has its own page) can be created and shared by users.
Each page has an image, and this image url is present in the og:image meta tag at the top of the page.
Problem
Facebook seems slow to pick up on the og:image. When the page is first created and a user attempts to share the URL, for the first ~1-3 tries, the og:image is not scraped / rendered by Facebook (the title and description are). Afterwards, the image is clearly visible in the share dialog.
A similar issue also occurs when using Facebook's OG URL debug tool. The first time I pop in the URL, it shows no image. If I choose to fetch the page from the source again, it shows the image.
Additional Notes
At first, I thought it might have been the site code initially not showing the image, but I sent a curl request spoofing one of Facebook's user agent strings (this is important for accessing the page) and the resulting HTML contains the og:image tag with the correct image URL. I also know it's not anything to do with accessing the page, or the og:title and og:description data would not be showing (but they are).
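For anyone wanting to reproduce that check, here is a rough PHP/cURL sketch; the user agent string is one of Facebook's documented crawler strings, and the URL is a placeholder for the page being tested:

<?php
// Fetch a page the way Facebook's crawler would and check for the og:image tag.
$url = 'https://example.com/some-entry/'; // placeholder: the page being tested

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT,
    'facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)');
$html = curl_exec($ch);
curl_close($ch);

if ($html !== false && strpos($html, 'og:image') !== false) {
    echo "og:image tag is present in the served HTML\n";
} else {
    echo "og:image tag is missing, or the request failed\n";
}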
My only lead is that it could be an SSL or HTTPS issue. I recently set up the SSL certificate but I'm not sure why that would cause a delay over it not working at all.
For the sake of clarity, the site runs on WordPress on top of a standard LAMP stack.
The issue is apparently a fairly common one. The solution was to send a request to Facebook's scraper tool with the content's URL at content-creation time. The scraper then picks up and processes the image, so the first share already has that image cached by Facebook.
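A hedged sketch of that pre-scrape step in PHP: it posts the content URL to the Graph API with scrape=true, which forces Facebook to (re)scrape the page; depending on the Graph API version an app access token may also be required.

<?php
// Ask Facebook to scrape a URL right after the content is created,
// so the og:image is already cached before the first real share.
function prescrape_on_facebook($contentUrl, $accessToken = null) {
    $params = array('id' => $contentUrl, 'scrape' => 'true');
    if ($accessToken !== null) {
        $params['access_token'] = $accessToken;
    }

    $ch = curl_init('https://graph.facebook.com/');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($params));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);

    return $response; // JSON describing what was scraped, or false on failure
}

// Example: call this from whatever hook fires when a new entry is published.
// prescrape_on_facebook('https://example.com/new-entry/');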
Yes, I've noticed this as well. It takes a long time for Facebook to cache the og:image; Tumblr does it automatically. The only reason I can imagine for Facebook doing this, other than poor programming, is that perhaps they have a review team scrolling through the thumbnails to block nudity and other crude images. As mentioned above, hitting the Facebook share URL manually upon creation will prompt them to cache it, hopefully before anyone else clicks.
I analyzed this issue a year ago. I had the same problem. The og:image meta tag was only updated after several re-scrape attempts.
This re-scraping can be easily triggered on this page https://developers.facebook.com/tools/debug/
According to my old analysis, the root cause of this behavior is that the FB scraper seems to have a very very short timeout. If the content page does not reply to the scraper request very quickly, FB doesn't take this reply into account. Even if the content page serves the correct meta data and a valid HTTP/200 reply, FB ignores it because "too late is too late".
I didn't find any solution for this besides "prescraping" as has been already described by Sean.
In my case, I had an Azure Web App with HTTPS set up but without an SSL certificate installed. As it was in the production stage, I tested by reverting back to HTTP, and all "og" tags were then detected.
So, if your SSL is not properly configured and/or Facebook reports a cURL SSL error, looking into SSL might help.
I'm using the Facebook Debugger for the URLs I try to share:
https://developers.facebook.com/tools/debug/og/object/
"Show existing scrape information" gives me "Error parsing input URL, no data was cached, or no data was scraped.", and when the URL is shared, no thumbnail is shown.
However, after using "Fetch new scrape information", I get proper output and the sharing works fine.
This is probably not because of caching, because it also happens when appending a random suffix to the URL.
How do I fix the damn thing? The situation is weird, as the URL's title and description are parsed fine.
Facebook doesn't scrape information from your website each and every time you share a link. Facebook saves a cached copy of the information on its servers.
When you click "Fetch new scrape information" it forces Facebook to ignore the cached information and scrape your website immediately.
It sounds like you (or somebody) may have tried to share your URL on Facebook earlier, which has returned an error. Facebook remembered this error until you forced it to fetch new information.
It's perfectly normal, so I wouldn't worry about it. Facebook would have refreshed its cache of your page sooner or later and fixed it.
I have an app that creates dynamic pages for users with like buttons.
The button works on all of the pages, but when I refresh the pages some of them don't "remember" the like count (or that I liked the page a second ago), while others work perfectly.
Here's an example of a page that remembers the like count: www.teespring.com/teespring
And one that doesn't work: www.teespring.com/brownrugby
The problem lies in the value of the fb:admins meta tag. Here is what you have published -
"102628019845885" is not a valid Facebook user ID. Please correct it and your users will be able to "Like" your page.
Well, you can debug such issues yourself; just go to the Facebook debug tool: http://developers.facebook.com/tools/debug
I figured it out earlier today after spending a few hours debugging a similar issue for my app, http://www.jokeshive.com.
If you monitor your network traffic while you click the like button, you can see an XHR request to Facebook that creates the Like for the user.
You will see that when you click the like button, Facebook makes this request and returns a JSON string with the status. Yours actually fails; here's the relevant part of the returned response:
"payload":{"requires_login":false,
"error_info":{"brief":"App ID does not match domain",
"full":"The app ID specified within the \"fb:app_id\" meta tag is
not allowed on this domain. You must setup the Connect Base Domains
for your app to be a prefix of http:\/\/teespring.com\/brownrugby.","errorUri":"\/connect\/connect_to_node_error.php?
title=App+ID+does+not+match+domain&body=The+app+ID+specified+within+the+\u002522fb\u00253Aapp_id\u002522+meta+tag+is+not+allowed+on+this+domain.+You+must+setup+the+Connect+Base+Domains+for+your+app+to+be+a+prefix+of+http\u00253A\u00252F\u00252Fteespring.com\u00252Fbrownrugby.&hash=AQAacTBYi-g6Czel"},
From this response, it seems like there's an issue with the domain configuration of your application, or the app id configuration of your open graph object pages.
Hopefully this helps and points you in the right direction.
Today I noticed that a few Google advertisements in an AdSense block on one of my pages were trying to display a file called "/pagead/badge/checkout_999999.gif" from my server. I did a bit of investigating and found out that the companies behind these adverts use Google Checkout, and "checkout_999999.gif" is supposed to be a tiny shopping cart icon with a tooltip which reads "This site accepts Google Checkout".
My problem is that "/pagead/badge/checkout_999999.gif" doesn't exist on my server. What do you do to handle this on your website? E.g.:
Save the logo on your server in the place Google expects it?
Use a mod_rewrite rule to redirect the request? To where though?
Find an AdSense option to turn off Google Checkout-enabled adverts? (I looked but couldn't see one.)
Ignore the issue and get on with something important
Back-story - please ignore unless very bored: Page 2 of the search page on our site suddenly stopped working and I didn't know why. It turned out to be related to Google AdSense. We use PHP session variables to save search criteria over different pages, which worked fine for a while but then randomly stopped working. Random bugs are the worst! I was trying to work out what else was random on the page and decided that the Google ads were the only other random thing. Sure enough, sometimes a particular advert seemed to clear the session variables and break the search. What was actually happening was that the advert was requesting an image from our server ("checkout_999999.gif") which didn't exist, and Apache was behind-the-scenes redirecting to the site homepage, which unfortunately clears the session variables needed for the search - hence the non-obvious breakage. I'm a bit worried that Google ads can request arbitrary files from my server. I'd prefer it if they could only use absolute URLs if they want to include logos or other media.
Sounds like a bug with Google AdSense delivery. Filing a bug with them is your best bet for a long-term fix.
After playing around with Apache mod_rewrite for a while I have found a rule that seems to fix my Google Adsense issue:
RewriteRule ^/pagead/badge/checkout_999999\.gif$ http://pagead2.googlesyndication.com/pagead/badge/checkout_999999.gif [R=301,L]
The problem is that I'm not sure how to stop a similar thing happening in the future if Google decides to hotlink to a different file.
As Google doesn't care about the problem (the previous posts were sent in June and we are now in October... and the bug was reported directly to Google), I decided to put an image on my server that suggests the user click on the ad!
Clicks then increased!!
As this is my server, I can do what I want, and it's not my problem if Google requests files from my server. They cost me money by using my bandwidth and connecting to my server; now I'm getting paid for that!!
You should do the same...