bypass tumblr privacy message in api calls (old api) - php

I have a script to browse through Tumblr blogs without looking at all the blogs individually. I use the old, simple Tumblr API. It worked perfectly until Tumblr started showing one of these new privacy messages first, which you have to click through to continue.
I guess that after the click Tumblr sets a cookie, but I can't find a way to get this cookie working for my PHP script...
If you put this in the browser:
http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo
you usually get JSON with lots of data that can then be used. But now this privacy warning comes first. Once you have clicked "OK", the API call works in the browser; in fact, right after the "OK" you are redirected and the JSON is delivered. On subsequent calls the warning page no longer appears (unless you delete your cookies).
Now I use this in a PHP script:
<?php
// Fetch the old-API JSON feed; with the new consent flow this now returns
// the "Before you continue" privacy page instead of the JSON data.
$testread = file_get_contents('http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo');
if (strpos($testread, 'Before you continue') !== false) {
    // We got the privacy interstitial rather than the API response.
    echo $testread;
}
?>
I thought clicking the "OK" on the echoed page would produce the cookie, but it doesn't work.
Does anybody have an idea how I can make Tumblr understand that my PHP script has seen and accepted that page?
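For what it's worth, even switching to cURL with a cookie jar would not by itself get past the consent page, because nothing ever submits the consent form, so the jar never receives the cookie that the "OK" click would set. A sketch of that variant (the cookie-jar path is just a placeholder):
<?php
// cURL with a cookie jar (the cookies.txt path is only a placeholder).
// The jar only ever holds whatever the privacy page itself sets -
// actually accepting the consent form is the missing step.
$ch = curl_init('http://nakedworldofmars.tumblr.com/api/read/json?start=0&num=5&type=photo');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/tumblr-cookies.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/tumblr-cookies.txt');
$testread = curl_exec($ch);
curl_close($ch);
if (strpos($testread, 'Before you continue') !== false) {
    echo 'Still getting the privacy page';
}
?>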

I have been facing exactly the same problem since yesterday – I am using JavaScript, though. After several attempts to solve this with the old API I failed, and thus finally just switched to the new API (v2). I also hesitated to do so at first because I am not too familiar with dealing with OAuth keys either. But it turned out to be quite easy – and it worked instantly. I used this guide as a starting point:
http://www.developerdrive.com/2014/05/how-to-get-started-with-the-tumblr-api-part-1/
http://www.developerdrive.com/2014/05/how-to-get-started-with-the-tumblr-api-part-2/
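For a read-only request like the one in the question, the v2 posts endpoint only needs your OAuth consumer key passed as an api_key parameter, no full OAuth signing. A minimal PHP sketch (YOUR_CONSUMER_KEY is a placeholder for the key you get when registering an app on Tumblr):
<?php
// v2 equivalent of the old /api/read/json call: 5 photo posts, offset 0.
$url = 'https://api.tumblr.com/v2/blog/nakedworldofmars.tumblr.com/posts/photo'
     . '?api_key=YOUR_CONSUMER_KEY&offset=0&limit=5';
$data = json_decode(file_get_contents($url), true);
if ($data !== null && $data['meta']['status'] === 200) {
    foreach ($data['response']['posts'] as $post) {
        echo $post['post_url'] . "\n";   // or whatever fields you need
    }
}
?>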

Related

Making a comment on facebook with curl without API

I know there is a ready-to-use API for Facebook, but since I last tried it the API has changed too much; now you need permissions and checks for everything you want to do ...
This project is only for personal usage, therefore I don't mind giving my Facebook password.
My goal is to post a comment on a picture on a Facebook page, in this test Avicii's.
I created my first git repo https://github.com/fritz-net/facebook-AntiAPI because I have 4 files (for login etc.). I hope this messy testing code is readable.
The problem is that Facebook shows an error saying that something went wrong (no access, wrong page, out of date, ... - German wording), even though the HTTP headers are the same and the POST data too. There is no reason why Facebook should not save this comment like any other. The login also worked well.
I'm using the mobile version because I had no real success with the ajax/json version either.
I hope I haven't forgotten anything about my problem in my tiredness. I tried to find other code from people who had done this, but everyone uses the API, and I also tried to solve the problem myself for the last 6 hours (from about 2am till now, 8am). Please excuse my bad English, dirty code and missing things.
Maybe someone can give me a hint. Good night.
It turned out that the solution was very easy.
In api.class.php there is a regex where I read out the form action. I added the following: $action = str_replace("&amp;", "&", $action); and everything worked.
The problem was the HTML entities in the action URL of the form.
EDIT: I committed the changes to GitHub for anyone interested
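In other words, the action scraped out of the form markup still contains entities such as &amp;, which have to be decoded back into a plain & before the URL can be posted to. A rough sketch of the idea (the regex and helper name are illustrative, not the actual code from api.class.php):
<?php
// Illustrative helper - not the actual code from api.class.php.
function extract_form_action($html) {
    if (preg_match('/<form[^>]+action="([^"]+)"/i', $html, $m)) {
        // Decode HTML entities like &amp; back into a usable URL.
        return html_entity_decode($m[1]);
        // minimal equivalent fix: str_replace('&amp;', '&', $m[1]);
    }
    return null;
}
?>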

PHP does not read my cookie set with javascript

Apologies if this question duplicates some other question, but I can't find one exactly like it in S.O.
I am writing a remotely hosted app, the kind that runs when you put a JavaScript tag on your own website page with src="some remote javascript.js". The script operates by doing every operation as a JSONP AJAX call. A lot of JSONP housekeeping, but otherwise it works surprisingly well.
In the main remote JS script, I set a user cookie when the user logs in. It works fine: the cookie is set for a year, and when you return to the page it continues to recognize you.
However, when I try to output the cookie (even after it has been set) using PHP, my PHP code does not see it for some reason.
If I alert(document.cookie); the cookie is displayed.
If I do a var_dump($_COOKIE); php returns array(0) { }.
This isn't a "you have to reload the page after setting the cookie with javascript" problem.
As far as I know, when I use the Firefox Web Developer extension to View Cookie Information, it is all happening on the same domain.
Looking over many other examples, it is clear that PHP should be able to read a cookie, even if set by javascript.
Even as I write this, a glimmer of what the problem is is starting to form in my head: possibly a JSONP'd PHP script isn't going to see a cookie set by JavaScript.
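That glimmer is most likely right: document.cookie in the remotely hosted script runs in the embedding page's context, so the cookie belongs to the customer's domain, while the JSONP request goes to the remote server and only carries cookies belonging to that remote domain. A small sketch of what the remote endpoint actually sees (the file name and callback handling are just an example):
<?php
// remote-server.example/jsonp-endpoint.php (hypothetical name)
// A cookie written with document.cookie by the embedded script is stored
// against the embedding page's domain, so the browser never attaches it to
// the <script src="..."> request that reaches this server - $_COOKIE stays empty.
header('Content-Type: application/javascript');
$callback = preg_replace('/[^\w.]/', '', isset($_GET['callback']) ? $_GET['callback'] : 'cb');
echo $callback . '(' . json_encode(array('cookies_seen_by_php' => $_COOKIE)) . ');';
?>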

Block facebook from my website

I have a secure link-redirection service I'm running (expiringlinks.co). If I change the headers in PHP to redirect my visitors, then Facebook is able to show a preview of the website I'm redirecting to when users send links to one another via Facebook. I want to avoid this. Right now I'm using an AJAX call to get the URL and JavaScript to redirect, but that causes problems for users who don't have JavaScript.
Here are a number of ways I'd like to block Facebook, but can't get working:
I've tried blocking the Facebook bot (facebookexternalhit/1.0 and facebookexternalhit/1.1), but it's not working; I don't think they use it for this functionality.
I'm thinking of blocking the Facebook IP addresses, but I can't find all of them, and I don't think it will work unless I get all of them.
I've thought of using a CAPTCHA or even a button, but I can't bring myself to do that to my visitors. Not to mention I don't think anyone would use the site then.
I've searched the Facebook docs for a meta tag that would "opt me out", but haven't found one, and I doubt I would trust it if I had.
Any creative ideas, or any idea how to implement the ones above? Thank you so much in advance!
Try this - it works for me ...
<?php
// Facebook's link-preview crawler identifies itself as "facebookexternalhit".
$ua = $_SERVER['HTTP_USER_AGENT'];
if (preg_match('/facebookexternalhit/si', $ua)) {
    // Send the crawler to a harmless page instead of the real destination.
    header('Location: no_fb_page.php');
    die();
}
?>
You could also grab your web server's log file and search it for unusual user agents (maybe containing "facebook").
Or take the logs and delete every line containing Internet Explorer/Firefox/Opera... Then you should have only bot user agents left, and you can look for the Facebook one.
All you need to do is appropriately set up robots.txt.
http://www.robotstxt.org/robotstxt.html
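For example, a robots.txt at the site root along these lines asks Facebook's crawler to stay away - assuming the crawler actually honors robots.txt, which is worth verifying:
User-agent: facebookexternalhit
Disallow: /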
You could try using a meta refresh instead of a javascript redirect. They work for all browsers and because the page still returns a 200 response any crawler should stop resolving there.
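A sketch of that idea in PHP, assuming the destination has already been looked up into $targetUrl (a hypothetical variable, not part of the original code): the page itself returns 200, real browsers follow the meta refresh, and a crawler that does not act on the refresh only ever sees this stub.
<?php
// $targetUrl is assumed to hold the already-resolved destination URL.
$targetUrl = 'https://example.com/destination';
?>
<!DOCTYPE html>
<html>
<head>
    <!-- 0-second meta refresh instead of a Location header or JS redirect -->
    <meta http-equiv="refresh" content="0;url=<?php echo htmlspecialchars($targetUrl); ?>">
</head>
<body>
    <p>Redirecting...</p>
</body>
</html>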

Extract fan database from Facebook page

I am trying to export Facebook Page Fans.
The closest I found was this article.
It states:
Getting fans from a Facebook page is not yet supported by the Facebook API. Luckily, the Facebook Web interface uses a simple AJAX/JSON call to supply the data when you view the page.
And he explains what he does like this:
My strategy to set this data free was to sniff the network traffic with the Wireshark tool, then replay the HTTP calls with a ruby script.
I don't know anything about Ruby, so I started trying with a PHP script left in one of the comments, the one by "Etienne Bley".
The script says you can download Charles Proxy to find these variables:
$cookie
$node_id
$post_form_id
$fb_dtsg
When I use Charles Proxy and log in as administrator, I get this:
And from there I get what I guess is the cookie variable:
BTW, is it safe to share the whole cookie? Is it helpful? (If it is, I'll edit ASAP.)
The script also says:
// set settings in these 4 lines from results of charles when getting the 2nd page of "Get All Fans" in FB ( you need to be admin of fan page to do this )
I can't understand what he means by "getting the 2nd page".
So, my questions:
1) What are these variables?
2) What are their values? How should/can I get them?
3) Is setting these variables correctly the only thing I need for this script to work?
I hope the question is clear enough; if not, please ask whatever you need!
Thanks in advance!
I don't know about the Charles Proxy software, but I used Chrome's excellent Inspector to trace the request.
Steps:
Use Chrome to navigate to the Facebook Page you're interested in
Open up the Inspector (CTRL+Shift+J on Windows), go to the "Resources" tab and "Enable Resource Tracking".
On the Facebook page, click "See all" in the Fans box on the left side of the page.
Scroll to the bottom of the fan list, and click "Next"
In the Resources tab, you'll have a request to /ajax/social_graph/fetch.php. Click on that, and in the Headers tab you'll see what you need. In my example:
I'm sure you can do that with a hundred different other programs, I find it easier to use Chrome since it's already there :)
Alright, so it seems this is all fairly simple. I recommend getting a copy of Fiddler to inspect this yourself.
I opened up a fan page, went to view the fans, and hit next page. I saw a POST request for http://www.facebook.com/ajax/social_graph/fetch.php?__a=1. What I got back was a really nice JSON array, containing all of the fans.
If we inspect the variables posted, it becomes obvious...
edge_type = fan
page = 1
limit = 100
node_id = 123123123123123123123 (ID of the fan page I'm assuming)
class = FanManager
post_form_id = 97823498723498 (No idea, but I bet you can get this from the dialog)
fb_dtsg = a1s3d5f (No idea)
lsd =
post_form_id_source = AsyncRequest
Anyway, what you are interested in is page and limit. I bet if you set page to 0 and limit to 500 or whatever, you will get what you are looking for. In the event you can't change limit reliably, just leave it at 100 and keep incrementing page. Also, I have my cookies in there, with the session information. How you will get those and post from PHP I don't know, but I hope this gives you some things to go on.
Again, get Fiddler, inspect what happens when you browse the page.
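To replay that captured request from PHP, something along these lines should work. This is only a rough sketch: the node_id, post_form_id, fb_dtsg and Cookie values are placeholders that you would copy out of Fiddler/Charles for your own session, and Facebook may change or remove this internal endpoint at any time.
<?php
// Replay the captured social_graph fetch; every value below is a placeholder.
$fields = array(
    'edge_type'           => 'fan',
    'page'                => 1,       // increment this to walk through the fan list
    'limit'               => 100,
    'node_id'             => '123123123123123',     // ID of the fan page
    'class'               => 'FanManager',
    'post_form_id'        => 'PASTE_FROM_CAPTURE',
    'fb_dtsg'             => 'PASTE_FROM_CAPTURE',
    'lsd'                 => '',
    'post_form_id_source' => 'AsyncRequest',
);
$ch = curl_init('http://www.facebook.com/ajax/social_graph/fetch.php?__a=1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Cookie: PASTE_SESSION_COOKIES_FROM_CAPTURE'));
$response = curl_exec($ch);
curl_close($ch);
// Facebook's AJAX responses are typically prefixed with "for (;;);" - strip it first.
$json = json_decode(preg_replace('/^for \(;;\);/', '', $response), true);
var_dump($json);
?>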

Using javascript history.back() fails in Safari .. how do I make it cross-browser?

I am using a plain history.back() link ("Back") to provide a back-to-previous-page link. It works fine on Windows (IE/Mozilla) but fails in Safari on both Windows and Mac.
Is there a way to make it work on all of the systems/browsers (cross-browser/platform)?
If it's not possible, is there any other way using PHP etc?
it should be history.go(-1); return false;
or
history.go(-1); event.preventDefault();
You should consider doing it like this instead:
Back
Try this instead. It should work across IE, FF, Safari and Chrome.
<a href="#" onclick="if(document.referrer) {window.open(document.referrer,'_self');} else {history.go(-1);} return false;">Cancel<a>
The below code is working.
Back
If anyone is still struggling with this issue, try removing the href attribute from the link you want to use window.history.back() with. I'm not sure whether this workaround complies with HTML standards, but it worked out fine for me.
I've faced the same issue recently, and although I'm not exactly sure why, this is the solution that worked for me:
If the user is on iOS:
history.go(-2)
If not:
history.go(-1)
I faced a similar issue on an e-commerce site I have been building for one of my customers. I initially went the route of calling:
window.history.back();
when the button was clicked. I encountered the same problem you are having with cross compatibility issues.
To answer your question about
If it's not possible, is there any other way using PHP etc?
My opinion is you should not invoke a method on the server to do a client operation. This is unnecessary overhead on your app and in my opinion, poor design/implementation.
Now to answer your main question:
Is there a way to make it work on all of the systems/browsers (cross-browser/platform)?
To solve the issue I found a client cookie library produced by Mozilla (https://developer.mozilla.org/en-US/docs/Web/API/Document/cookie) from another StackOverflow post (my apologies to the author - I don't have the link to your post).
Using the library I create a cookie with a key of 'back-url' when the user navigates to the part of my app where I want them to be able to go back:
$('#id-of-button-clicked').click(function() {
    // MDN's docCookies.setItem takes (name, value, end, path, domain), so the
    // path comes before the domain; null for "end" leaves it a session cookie.
    docCookies.setItem("back-url", window.location.href, null, "/", ".myDomain.com");
});
This code sets a cookie with the key 'back-url' and the current URL as its value, and makes it accessible from the root directory of myDomain.com.
Next, on the page where I want the user to be able to navigate back to the URL set in the cookie, I call the following code:
$('#id-of-back-button').click(function() {
    // Send the user back to the URL stored in the cookie.
    window.location.href = docCookies.getItem('back-url');
});
This code sets the window location by getting the value of 'back-url'.
Disclaimer: I am no professional js developer - just trying to put in my two cents from some lessons I have learned.
Challenges to this answer:
This answer uses cookies; many people don't like the use of cookies. My customer's requirements allow me to use cookies.
I'm not encrypting the cookie - some may consider this bad practice. I am still in the early implementation phase of our development. I am, however, restricting access to the cookie to only within our domain.
