Facebook Connect blank page - php

I have a strange problem with Facebook Connect on one of my sites. It was working perfectly until this morning. I've checked all recent changes, and nothing seems to be related to this issue.
Note: I'm using FBML, and deleting cookies for my site and for Facebook with each test.
When I click the Facebook Connect Button, the login form appears correctly. The blank page occurs when Facebook checks the permissions at this URL:
https://www.facebook.com/dialog/permissions.request?[list of params...]
I have no Javascript errors, and no HTML is returned.
I tried setting the requested permissions to perms="email" only, in case the permissions changed in the past few days, but I have the same problem.
Does anyone have a clue about this situation?

We were already scheduled to remove all FBML tags on this site, and replacing them with JS corrected the issue we had!
So Facebook Connect using FBML seems to no longer be supported, even though it's supposed to be.

Related

Session is empty when link is opened from another tab with Laravel

The title may not be 100% clear, but I have an issue to which I cannot find a solution... Let me (try to) explain:
I have a Laravel application (v5.5) and when I sign in, everything works fine. I can open a new tab, and my session is found so I do not need to sign in again, which is perfectly normal.
Except (!!!) when I click on a link (e.g. from an email) from another tab: when I do this, I need to sign in.
It seems that the session data is completely empty and cannot be retrieved through the cookie. But if I open a new tab and try accessing the URL with a simple copy/paste in the browser, I'm successfully logged in.
Moreover, after asking for my login/password after clicking the link from the other tab, the application does not redirect me to the requested URL as it's supposed to, but rather redirects me to the home page.
Has anyone faced this issue? And maybe successfully solved it?
Thanks!
I've found out what was wrong: I had configured the "same_site" option to "strict" instead of "lax"... It works fine now!
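For reference, a minimal sketch of that setting in config/session.php (key names as in Laravel 5.5+; the rest of the file is omitted):

<?php
// config/session.php (excerpt) -- only the option relevant here.
return [
    // ... other session options ...

    // "strict" stops the session cookie from being sent on requests that
    // originate from another site (e.g. a link clicked in an email client),
    // which is why the session looked empty. "lax" still sends the cookie
    // on top-level GET navigations such as clicking a link.
    'same_site' => 'lax',
];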

bypass tumblr privacy message in api calls (old api)

I have a script to browse through Tumblr blogs without looking at all the blogs individually. I use the old, simple Tumblr API. It worked perfectly until Tumblr started showing one of these new privacy messages first, which one has to click through to continue.
I guess that after the click Tumblr sets a cookie, but I can't find a way to get this cookie working for my PHP script...
If you put this in the browser:
http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo
you usually get JSON with lots of data that can then be used. But now this privacy warning comes first. Once you have clicked "OK", the API call works in the browser; in fact, right after the "OK" you are redirected and the JSON is delivered. With the next call the warning page does not appear anymore (unless you delete cookies).
Now I use this in a PHP script:
<?php
$testread = file_get_contents('http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo');

// If the privacy interstitial is returned instead of JSON, show it.
if (strpos($testread, 'Before you continue') !== false) {
    echo $testread;
}
?>
I thought clicking the "OK" on the echoed page would produce the cookie.
But it doesn't work.
Does anybody have an idea how I can make Tumblr recognize that my PHP script has seen and accepted the page?
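For what it's worth, the usual way to let a PHP script keep cookies across requests is cURL with a cookie jar. A rough sketch of that idea is below, assuming the interstitial only relies on a cookie (and not on a real form submission); as the answer that follows explains, switching to the v2 API turned out to be the simpler fix.

<?php
// Sketch: fetch the API URL twice through cURL, persisting any cookies
// Tumblr sets in a local cookie jar between the two requests.
$url = 'http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo';
$jar = __DIR__ . '/tumblr_cookies.txt';

function fetchWithCookies($url, $jar) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,   // follow the redirect after consent
        CURLOPT_COOKIEJAR      => $jar,   // write received cookies here
        CURLOPT_COOKIEFILE     => $jar,   // send them back on later requests
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$first  = fetchWithCookies($url, $jar);  // may return the "Before you continue" page
$second = fetchWithCookies($url, $jar);  // retried with whatever cookies were stored

echo (strpos($second, 'Before you continue') === false) ? $second : 'still blocked';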
I have faced exactly the same problem since yesterday – I am using JavaScript, though. After several attempts to solve this using the old API I failed, and so I finally just switched to the new API (v2). I also hesitated to do so at first because I am not too familiar with dealing with OAuth keys either. But it turned out to be quite easy – and it worked instantly. I used this guide as a starting point:
http://www.developerdrive.com/2014/05/how-to-get-started-with-the-tumblr-api-part-1/
http://www.developerdrive.com/2014/05/how-to-get-started-with-the-tumblr-api-part-2/
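For a PHP equivalent, here is a minimal sketch of the same kind of call against the v2 API; it assumes you have registered an application and use its OAuth consumer key as api_key (which is enough for public photo posts):

<?php
// Sketch: read photo posts via the Tumblr v2 API instead of the old /api/read.
// YOUR_CONSUMER_KEY is a placeholder for your application's OAuth consumer key.
$blog   = 'nakedworldofmars.tumblr.com';
$apiKey = 'YOUR_CONSUMER_KEY';

$url = 'https://api.tumblr.com/v2/blog/' . $blog . '/posts/photo'
     . '?api_key=' . urlencode($apiKey)
     . '&offset=0&limit=5';

$response = file_get_contents($url);
$data     = json_decode($response, true);

// The v2 response wraps the posts in a "response" object.
foreach ($data['response']['posts'] as $post) {
    echo $post['post_url'], "\n";
}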

Facebook - Error parsing input URL, no data was cached, or no data was scraped

After some research I found that a lot of people are facing the same issue, but so far I don't have a solution. This happened after I switched my server to linode.com.
Let's take an example: www.acemark2u.com is one of the websites hosted on the Linode server.
When I try to debug it in https://developers.facebook.com/tools/debug/og/object/, it just can't fetch the scraped information correctly, and if I try with one of the pages, www.acemark2u.com/about-us, it just shows me the error "Error parsing input URL, no data was cached, or no data was scraped."
Weird things happen: when I try to debug using the IP address 106.187.35.114/~acemark2, everything goes smoothly. It fetches nicely, with no 404 errors for pages.
I suspect it might be caused by the "gethostbyaddr" function (ref: http://www.gearhack.com/Forums/DisplayComments.php?file=Computer/Network/Internet/Preventing_Your_Web_Server_From_Blocking_Facebook_Share) but so far I don't have a solution.
For people experiencing the same issue but from different causes: I discovered a few interesting things about how Facebook "scrapes" pages by checking the server logs while doing some trials.
First of all: if you have never tried to share a page with FB, FB has never tried to scrape it, and it will not try to do so if you only put the URL in the Debug tool.
That's the first reason you get the error: it just states that FB has no information on the page; you must "force" it to scrape the page.
The first time you try to share a page, FB scrapes it (it requests the first 40 KB of the page from your server and analyses the Open Graph tags).
What can happen is that you do not see the image: Facebook Share Dialog does not display thumbnails on first load.
The reason is that FB is still scraping your page and caching the image behind the scenes. The next time, in fact, you also get the image.
How to solve it? Pre-caching: https://developers.facebook.com/docs/sharing/best-practices#precaching
or simply add:
<meta property="og:image:width" content="450"/>
<meta property="og:image:height" content="298"/>
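If you need to warm the cache programmatically, the pre-caching document linked above boils down to a POST against the Graph API with scrape=true. A rough PHP sketch of that call (the page URL and access token below are placeholders):

<?php
// Sketch: ask Facebook to (re)scrape a URL so the Open Graph cache is warm
// before anyone shares it. APP_ACCESS_TOKEN is a placeholder.
$pageUrl     = 'http://www.example.com/about-us';
$accessToken = 'APP_ACCESS_TOKEN';

$ch = curl_init('https://graph.facebook.com/');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'id'           => $pageUrl,
        'scrape'       => 'true',
        'access_token' => $accessToken,
    ]),
]);
$result = curl_exec($ch);
curl_close($ch);

echo $result; // JSON describing what the scraper found (og:title, og:image, ...)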
I found the solution at last.
In my default DNS A/AAAA records I had not removed these few IPs:
2400:8900::f03c:91ff:fe73:a95d Default
mail 2400:8900::f03c:91ff:fe73:a95d Default
www 2400:8900::f03c:91ff:fe73:a95d Default
That's why some of the users were pointed to the above IP when they accessed the site via the proper web address.
This question already has an accepted answer, but in case that answer doesn't work for anyone, here is what worked for me.
The URL I provided in og:url was a protected URL, i.e. only signed-in users could view the page it pointed to. When I changed the URL to point to my homepage, which can be viewed by both signed-in and signed-out users, viz. http://www.ercafe.com, everything worked fine.
We had a similar issue on one of our sites.
We resolved this by disabling Apache mod_security while we used the Facebook object debug tool to "fetch new scrape information".
For me the solution was changing the DNS A records from
example.sk 3600 1.2.3.4
www.example.sk 3600 1.2.3.4
to
example.sk 3600 1.2.3.4
*.example.sk 3600 1.2.3.4

Problems logging into Magento Admin

Occasionally I run into a problem logging into the Magento admin panel. The username and password I enter are correct, and the URL in the browser window tells me that I have logged in correctly (i.e. I see domain.com/index.php/admin/areallylongstring); however, the login window is displayed again. No error message telling me that the login details are incorrect is displayed; I am just routed back to the login window. Has anyone come across this before, and can anyone please suggest a solution?
Thanks!
Try starting a private browser session and see if you can log in; if so, clear your cookies for the website and you should be able to log in.
I occasionally get this problem; next time I get it I will dig into the code with my debugger to see what is actually going on.
This problem arises for multiple reasons, and the cheapest solution is to comment out the following lines in one of Magento's core files.
File: app/code/core/Mage/Core/Model/Session/Abstract/Varien.php
// set session cookie params
session_set_cookie_params(
    $this->getCookie()->getLifetime(),
    $this->getCookie()->getPath()//,
    // depending on which version of Magento you are using, you may comment these out as well
    //$this->getCookie()->getDomain(),
    //$this->getCookie()->isSecure(),
    //$this->getCookie()->getHttponly()
);
Find the above code in that file and comment out those three cookie parameters as shown above.
But as I said, it is the cheapest and easiest solution that you can go with. For more information, check out this link
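If you would rather diagnose than patch core files, this login loop is very often a cookie domain/lifetime mismatch. A read-only sketch for Magento 1.x (assuming it is run from the Magento root) that just prints the relevant settings so you can compare them with the host you use for /admin:

<?php
// Sketch: print the cookie settings that commonly cause the admin login loop
// in Magento 1.x. Run from the Magento root; read-only, changes nothing.
require_once __DIR__ . '/app/Mage.php';
Mage::app();

echo 'Cookie domain:   ', var_export(Mage::getStoreConfig('web/cookie/cookie_domain'), true), "\n";
echo 'Cookie path:     ', var_export(Mage::getStoreConfig('web/cookie/cookie_path'), true), "\n";
echo 'Cookie lifetime: ', var_export(Mage::getStoreConfig('web/cookie/cookie_lifetime'), true), "\n";

// If the configured domain does not match the host you use for /admin
// (e.g. you browse via an IP or "localhost"), the session cookie is never
// sent back and the login form simply reappears.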

How to retrieve Facebook cookies with cross-domain and Safari

I've been working on this problem all day long, so I really need your help.
I'm trying to create a multi-site login system with Facebook Connect and unfortunately I can't retrieve the cookies.
Here's a little more details:
I have a website (www.first.com) which has an iframe to www.second.com, which displays the Facebook Connect button. I have to use this method because a Facebook app is only valid for one website, and I will need to use it on multiple sites.
When the user clicks on the button and logs into Facebook, he is redirected to www.second.com, which saves values in a database that are later retrieved on www.first.com.
Everything works fine in Firefox, and IE 7/8 works fine too since I've added the P3P header.
The problem is that I can't make it work in Safari, which requires some kind of interaction from the user with the iframe.
I found some code (http://anantgarg.com/2010/02/18/cross-domain-cookies-in-safari/) but I'm not sure how to use it; I've tried every possible way (I think), and nothing. I guess it doesn't work because I would need to use it on Facebook's server (which I obviously can't ;) ).
Does anyone have an idea?
Sorry for the huge block of text ;) let me know if you need more information.
Cross-domain cookies sound like a security bug.
Are you sure you're not violating the same-origin policy?
In fact, AFAIK, you can't access cookies for first.com from second.com.
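As a side note on the P3P header mentioned in the question: for IE 7/8 it is just a compact policy string that has to be sent before any output on the page inside the iframe. A minimal sketch on the www.second.com side (the CP value below is a generic placeholder policy, not anything Facebook-specific):

<?php
// Sketch: send a compact P3P policy so IE 7/8 accepts the cookie set by the
// page inside the cross-domain iframe. Must run before any output is sent.
header('P3P: CP="IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT"');

// ...then start the session / set cookies as usual on www.second.com.
session_start();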
