PHP Query into cURL - php

I have a simple function which should query the USGN number from the table and insert it into a cURL request, but it's giving a white page...
Unfortunately I cannot provide an error; it's just a blank page.
Here is the script:
function getUSGNavatar($id) {
    $usgn = mysql_query("SELECT * FROM cm_users WHERE USGN = '$usgn' AND ID = '$id'") or die(mysql_error());
    return mysql_fetch_assoc($usgn);
    // CURL
    $ch = curl_init('http://www.unrealsoftware.de/getuserdata.php?id='$usgn'&data=avatar');
    curl_exec($ch);
    if(!curl_errno($ch))
    {
        $info = curl_getinfo($ch);
    }
    curl_close($ch);
    // CLOSE function
}
Thanks for the help.
Something is messed up with the query part, I'm sure; cURL works okay, and when I didn't use MySQL it worked fine.
I will rewrite the MySQL code to PDO or MySQLi, so please don't mention it.

You are returning early, i.e. return mysql_fetch_assoc($usgn);
and you should not be, as this finishes the function at that point, so the cURL code below it never runs.
Oh, and your string building in
$ch = curl_init('http://www.unrealsoftware.de/getuserdata.php?id='$usgn'&data=avatar'); also had a problem.
function getUSGNavatar($id) {
    $usgn = mysql_query("SELECT *
                           FROM cm_users
                          WHERE ID = '$id'");
    if ( ! $usgn ) {
        echo mysql_error();
    }
    $row = mysql_fetch_assoc($usgn);
    // CURL
    $ch = curl_init('http://www.unrealsoftware.de/getuserdata.php?id=' .
                    $row['USGN'] . '&data=avatar');
    curl_exec($ch);
    if (!curl_errno($ch)) {
        $info = curl_getinfo($ch);
    }
    curl_close($ch);
    // now you probably want to return something from the function
    return $info;
}
Oh, and as per your request I will not nag you about the use of the deprecated mysql_ database extension, because we believe you will rewrite this code once it's working, don't we?
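Since you plan to move to PDO or MySQLi anyway, here is a minimal sketch of what the same function could look like with MySQLi prepared statements. It assumes an existing mysqli connection in $conn, the same cm_users table with ID and USGN columns, and the mysqlnd driver for get_result(); treat it as a starting point, not a drop-in replacement:
// Sketch only: assumes an existing mysqli connection in $conn and mysqlnd for get_result().
function getUSGNavatar(mysqli $conn, $id) {
    $stmt = $conn->prepare('SELECT USGN FROM cm_users WHERE ID = ?');
    $stmt->bind_param('i', $id);
    $stmt->execute();
    $row = $stmt->get_result()->fetch_assoc();
    $stmt->close();
    if (!$row) {
        return null; // no matching user
    }
    $ch = curl_init('http://www.unrealsoftware.de/getuserdata.php?id=' . urlencode($row['USGN']) . '&data=avatar');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // capture the response instead of printing it
    $avatar = curl_exec($ch);
    curl_close($ch);
    return $avatar;
}
The prepared statement also removes the SQL injection risk of interpolating $id directly into the query string.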

You're getting the blank page because of this:
id='$usgn'&data
Change this to:
id='.$usgn.'&data
You're missing the "." concatenation operator.

Related

PHP calling another PHP page for MySQL Query (returning JSON data)

I would like to find out how a PHP page can call another PHP page that returns JSON data.
I am working with a PHP file (UsersView.php) to display the contents of a website. However, I have separated the MySQL queries into another PHP file (Get_Users.php).
In Get_Users.php, I have a MySQL statement that queries the database for data. The result is then encoded as JSON and echoed out.
In UsersView.php, I call Get_Users.php in order to retrieve the users' JSON data. The data is then used to populate a "Users Table".
The thing is, I do not know how to call Get_Users.php from UsersView.php in order to get the data.
Part of UserView.php
$url = "get_user.php?id=" . $id;
$json = file_get_contents($url);
$result = json_decode($json, true);
I am trying to call the file, which is in the same directory, but this does not seem to work.
Whole of Get_Users.php
<?php
$connection = mysqli_connect("localhost", "root", "", "bluesky");

// Test if connection succeeded
if (mysqli_connect_errno()) {
    die("Database connection failed: " . mysqli_connect_error() . " (" . mysqli_connect_errno() . ") " .
        "<br>Please retry your last action. " .
        "<br>If the problem persists, please follow the instruction manual strictly and restart the system.");
}

$valid = true;
if (!isset($_GET['id'])) {
    $valid = false;
    $arr = array('success' => 0, 'message' => "No User ID!");
    echo json_encode($arr);
}
$id = $_GET['id'];

if ($valid == true) {
    $query = "SELECT * FROM user WHERE id = '$id'";
    $result = mysqli_query($connection, $query);
    if (mysqli_num_rows($result) == 1) {
        $row = mysqli_fetch_assoc($result);
        $arr = array('success' => 1, 'type' => $row['type'], 'user_id' => $row['id'], 'email' => $row['email'], 'name' => $row['name'], 'phone' => $row['phone'], 'notification' => $row['notification']);
        echo json_encode($arr);
    } else {
        $arr = array('success' => 0, 'message' => "Invalid User ID!");
        echo json_encode($arr);
    }
}
mysqli_close($connection);
?>
You have a couple of different ways to accomplish this:
You should be able to first set the actual id and then include the Get_Users.php file like this. Notice that Get_Users.php should not echo its output; instead it should only return the encoded JSON data using return json_encode($arr);:
// set the id in $_GET super global
$_GET['id'] = 1;
// include the file and catch the response
$result = include_once('Get_Users.php');
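If Get_Users.php ends with return json_encode($arr); as suggested above, a small usage sketch for UsersView.php could then decode the returned string (the 'success' and 'name' keys come from the question's own code):
// Sketch: decode the returned JSON string into an associative array.
$user = json_decode($result, true);
if ($user !== null && $user['success'] == 1) {
    echo $user['name'];
}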
You can also create a function that can be called from UserView.php:
// Get_Users.php
<?php
function get_user($id) {
    // connect to and query database here
    // then return the result as json
    return json_encode($arr);
}
?>
// In UserView.php you first include the above file and call the function
include_once('Get_Users.php');
$result = get_user(1);
You could also use file_get_contents(). Notice that you need to make sure that allow_url_fopen is enabled in your php.ini file for this to work:
$result = file_get_contents('http://example.com/Get_Users.php?id=1');
To enable allow_url_fopen you need to open your loaded configuration file, set allow_url_fopen=1, and finally restart your webserver.
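If you are not sure whether the setting is on, a small hedged check before the remote call avoids a confusing silent failure (ini_get() reads the current configuration value):
// Sketch: check the current configuration value before attempting the remote call.
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled in php.ini; enable it or use the cURL approach below.');
}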
You could also use curl to achieve the same result:
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, 'http://example.com/Get_Users.php?id=1');
$result = curl_exec($ch);
curl_close($ch);
An ajax request could also be made to get the result. This example uses jQuery:
$(document).ready(function() {
    $.get({
        url: 'Get_Users.php',
        data: 'id=1',
        success: function(response) {
            // response contains your json encoded data
            // in this case you **must** use echo to transfer the data from `Get_Users.php`
        }
    });
});
Change UsersView.php to something like this:
$actual_link = 'http://'.$_SERVER['HTTP_HOST'].$_SERVER['CONTEXT_PREFIX'];
$url = "get_users.php?id=" . $id;
$url = $actual_link.$url;
$json = file_get_contents($url);
$result = json_decode($json, true);
This will work fine.

Retry a statement inside an array loop in PHP, but with the next value of the array

I'm here again, learning more and more about PHP, but I still have some problems with my scenario. Most of it has been programmed and solved without problems, but I found an issue, and to understand it I need to explain it first:
I have a PHP script which can be invoked by any client. Its job is to receive a request and ping a proxy from a list which I define manually, to know whether the proxy is available. If it is available, I proceed to retrieve a response using cURL with a POST method. The logic is like this:
$proxyList = array('192.168.3.41:8013' => 0, '192.168.3.41:8023' => 0, '192.168.3.41:8033' => 0);
$errorCounter = 0;
foreach ($proxyList as $key => $value) {
    if (!isUrlAvailable($key)) { // It means it is NOT available, so I count errors
        $errorCounter++;
    } else {                     // It means it is AVAILABLE
        $result = callThisProxy($key);
    }
}
The isUrlAvailable function uses fsockopen to check whether the proxy is available. If it is, I make a POST request with cURL as mentioned before; the callThisProxy() function looks something like this:
$ch = curl_init($proxyUrl);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS,'xmlQuery='.$rawXml);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$info = curl_exec ($ch);
if($isDebug){echo 'Info in the moment: '.$info.'<br/>';}
curl_close ($ch);
But we're testing some scenarios: what happens if I turn off the proxy between the availability check and the call? I mean:
foreach ($proxyList as $key => $value) {
    if (!isUrlAvailable($key)) { // It means it is NOT available, so I count errors
        $errorCounter++;
    } else {                     // It means it is AVAILABLE
        $result = callThisProxy($key); // What happens if I kill the proxy while the result is being processed?
    }
}
I tested it, and when I do that, $result comes back as an empty string ''. The problem is that I lose that request, and my goal is to retry it with the next $key, which is another proxy. So I've been thinking of a do/while around the call, but I'm not sure whether that is OK or whether there's a better way to do it, so please help me with this issue. Thanks in advance for your time; any answer is welcome.
Maybe something like:
$result = "";
while ($result == "")
{
foreach ($proxyList as $key => $value)
{
if (!isUrlAvailable($key))
{
$errorCounter++;
}
else
{
$result = callThisProxy($key);
}
}
}
// Now check $result, which should contain the first successful callThisProxy()
// result, or nothing if none of the keys worked.
You could just keep a list of proxies that you still need to try. When you hit the error or get a valid response then you remove the proxy from the list of proxies to try. If you do not get a good response then keep it in the list and try it again later.
$proxiesToTry = array_keys($proxyList); // just the proxy addresses
$i = 0;
while (count($proxiesToTry) != 0) {
    // reset to beginning of array
    if ($i >= count($proxiesToTry))
        $i = 0;
    $proxy = $proxiesToTry[$i];
    if (!isUrlAvailable($proxy)) { // It means it is NOT available, so I count errors
        $errorCounter++;
        unset($proxiesToTry[$i]);
        $proxiesToTry = array_values($proxiesToTry); // re-index after removal
    } else {                       // It means it is AVAILABLE
        $result = callThisProxy($proxy);
        if ($result != "") {       // If we got a response, remove it from the array of proxies to try.
            unset($proxiesToTry[$i]);
            $proxiesToTry = array_values($proxiesToTry); // re-index after removal
        }
    }
    $i++;
}
NOTE: You will never break out of this loop if you don't ever get a valid response from some proxy.
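If that is a concern, one hedged variant is to cap the number of attempts per proxy so the loop always terminates. This sketch reuses the isUrlAvailable()/callThisProxy() helpers from the question; the $maxAttempts limit is a hypothetical value you would pick yourself:
$maxAttempts  = 3;   // hypothetical cap per proxy
$errorCounter = 0;
$result       = '';
foreach ($proxyList as $proxy => $value) {
    for ($attempt = 0; $attempt < $maxAttempts; $attempt++) {
        if (isUrlAvailable($proxy)) {
            $result = callThisProxy($proxy);
            if ($result !== '') {
                break 2; // got a real response, stop trying proxies
            }
        }
    }
    $errorCounter++; // this proxy never produced a usable response
}
// $result is either the first successful response or '' if every proxy failed.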

Using PHP to Detect All Links (Including those that don't go to a file)

I'm trying to detect broken links. The following PHP, accessing a MySQL table, seems to work great (though it is slow due to fopen) for almost everything:
function fileExists($path){
    return (@fopen($path, "r") == true);
}

$status = "";
$result = mysql_query(" SELECT id, title, link from table ");
while ($row = mysql_fetch_array($result)) {
    $id    = $row{'id'};
    $title = $row{'title'};
    $link1 = $row{'link1'};
    // etc.
    if ($link) {
        if (fileExists($link) != TRUE) {
            $status = 'BROKEN_LINK';
        }
    }
    // Here do something if the status gets set to broken
}
BUT the problem is links like this:
torrentfreak.com/unblocking-the-pirate-bay-the-hard-way-is-fun-for-geeks-120506
Here the URL doesn't point at a file; it goes somewhere and serves content. So what is the best way to correctly detect these situations when they are not on your own domain?
Thanks!
Mordak
You can try using the cURL method:
function fileExists(&$pageScrape, $path) { // The cURL handle is passed in by reference.
    curl_setopt($pageScrape, CURLOPT_URL, $path);            // Set URL path.
    curl_setopt($pageScrape, CURLOPT_RETURNTRANSFER, true);  // Don't output the scraped page directly.
    curl_exec($pageScrape);                                  // Execute cURL call.
    $status = curl_getinfo($pageScrape, CURLINFO_HTTP_CODE); // Get the HTTP status code of the page, load into variable $status.
    if ($status >= 200 && $status <= 299) { // Checking for page success.
        return true;
    } else {
        return false;
    }
}
$pageScrape = curl_init();
$status = "";
$result = mysql_query(" SELECT id, title, link from table ");
while ($row = mysql_fetch_array($result)) {
    $id    = $row{'id'};
    $title = $row{'title'};
    $link1 = $row{'link1'};
    // etc.
    if ($link) {
        if (fileExists($pageScrape, $link) != TRUE) {
            $status = 'BROKEN_LINK';
        }
    }
    // Here do something if the status gets set to broken
}
curl_close($pageScrape);
You can fine tune the status check by looking over the list of HTTP status codes: Wikipedia link
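Since the original fopen() approach is slow partly because it downloads content, another hedged option is to issue HEAD requests with CURLOPT_NOBODY so no response body is transferred. Note that some servers answer HEAD requests differently than GET, so this is a speed trade-off rather than a guaranteed improvement:
// Sketch: reusable handle, HEAD-only check, success = any 2xx status.
function linkIsAlive($ch, $url) {
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // send a HEAD request, skip the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't print anything
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // count redirected pages as alive
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    return $status >= 200 && $status <= 299;
}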

Twitter Code "Not Following Back" PHP

I have this code to get followers and friends (people you are following). However, I'm completely lost; I want to build a "Who's not following back" page. What's the best way to go about this? Should I manually search for strings, or . . .
Pseudo: $friends - $followers
$user = $query[1];
if (!$user) {
    user_ensure_authenticated();
    $user = user_current_username();
}
$folwers = API_URL."statuses/followers/{$user}.xml";
$folwin  = API_URL."statuses/friends/{$user}.xml";
$tl      = lists_paginated_process($request);
$content = theme('followers', $tl);
theme('page', 'Followers', $content);
}
Thanks
This might not be the most optimal answer out there, but you might want to read the following link:
https://dev.twitter.com/docs/api/1/get/friendships/exists
Since you're able to get all profiles that you're following, you could simply loop over the array of usernames and cURL the "friendship verification" page. Check out the snippet below:
<?php
$link = 'http://api.twitter.com/1/friendships/exists.xml?screen_name_a=php_net&screen_name_b=';
foreach ($usernames as $username) {
    // Create a curl handler and make sure we return to a variable
    $ch = curl_init($link . $username);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $status = strip_tags(curl_exec($ch)); // Should return: true
    curl_close($ch);
    // Output based on the results
    if ($status == 'true') echo $username, ' isn\'t following you. <br />';
    else echo $username, ' is following you. <br />';
}
?>
It might at least be worth a shot, although I'm pretty sure that there are better ways to accomplish what you're looking for. Good luck!
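For the "$friends - $followers" pseudo-code in the question, one possible sketch is a plain set difference once you have both lists of screen names in arrays. The example data below is hypothetical, and parsing the statuses/friends and statuses/followers responses into those arrays is left out:
// Hypothetical arrays, e.g. built by parsing the followers and friends XML responses.
$friends   = array('alice', 'bob', 'carol'); // people you follow
$followers = array('alice', 'carol');        // people who follow you
$notFollowingBack = array_diff($friends, $followers);
foreach ($notFollowingBack as $username) {
    echo $username, ' isn\'t following you back. <br />';
}
This avoids one API request per username, which may matter with rate limits.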

Grabbing Twitter Friends Feed Using PHP and cURL

So, in keeping with my last question, I'm working on scraping the friends feed from Twitter. I followed a tutorial to get this script written, pretty much step by step, so I'm not really sure what is wrong with it, and I'm not seeing any error messages. I've never really used cURL before, except from the shell, and I'm extremely new to PHP, so please bear with me.
<html>
<head>
    <title>Twitcap</title>
</head>
<body>
<?php
function twitcap()
{
    // Set your username and password
    $user = 'osoleve';
    $pass = '****';

    // Set site in handler for cURL to download
    $ch = curl_init("https://twitter.com/statuses/friends_timeline.xml");

    // Set cURL's options
    curl_setopt($ch, CURLOPT_HEADER, 1);                 // We want to see the header
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);               // Set timeout to 30s
    curl_setopt($ch, CURLOPT_USERPWD, $user.':'.$pass);  // Set uname/pass
    curl_setopt($ch, CURLOPT_RETURNTRANSER, 1);          // Do not send to screen

    // For debugging purposes, comment when finished
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);

    // Execute the cURL command
    $result = curl_exec($ch);

    // Remove the header
    // We only want everything after <?
    $data = strstr($result, '<?');

    // Return the data
    $xml = new SimpleXMLElement($data);
    return $xml;
}

$xml = twitcap();
echo $xml->status[0]->text;
?>
</body>
</html>
Wouldn't you actually need everything after "?>" ?
$data = strstr($result,'?>');
Also, are you using a free web host? I once had an issue where my hosting provider blocked access to Twitter due to people spamming it.
Note that if you use strstr, the returned string will actually include the needle string, so you would have to strip the first two characters off the result.
I would rather recommend a combination of substr and strpos.
Anyway, I think SimpleXML should be able to handle this header, meaning I think this step is not necessary.
Furthermore, if I open the URL I don't see a header like that, and if strstr doesn't find the string it returns false, so you don't have any data in your current script.
Instead of $data = strstr($result, '<?'); try this:
if (strpos($result, '?>') !== false) {
    $data = strstr($result, '?>');
} else {
    $data = $result;
}
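Another hedged option, since the script already sets CURLOPT_HEADER, is to cut the response at the header length cURL reports instead of searching for '<?' or '?>' in the text. It assumes curl_exec() returns the response as a string (i.e. CURLOPT_RETURNTRANSFER is actually in effect):
// Sketch: CURLINFO_HEADER_SIZE gives the exact number of header bytes.
$result     = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$data       = substr($result, $headerSize); // body only, headers stripped
$xml        = new SimpleXMLElement($data);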
