I am new here and hoping for your kind advice. Thanks in advance.
I have written an HTTP API to send SMS using curl. Everything is working fine, except that I am failing to loop and post the curl request in batches of phone numbers. For example: a user uploads 50,000 phone numbers via an Excel sheet on my site, I fetch all the mobile numbers from the database, and then post them through curl.
Now, the SMS gateway I send the request to accepts a maximum of 10,000 numbers at once via its HTTP API.
So from the 50,000 fetched numbers I want to split them into groups of 10,000, loop over the groups, and send a curl post for each.
Here is my code
//have taken care of sql injection on live site
$resultRestore = mysql_query("SELECT * FROM temptable WHERE userid = '".$this->user_id."' AND uploadid='".$uploadid."' ");
$rowRestoreCount = mysql_num_rows($resultRestore);
#mysql_data_seek($resultRestore, 0);
$phone_list = "";
while($rowRestore = mysql_fetch_array($resultRestore))
{
$phone_list .= $rowRestore['recphone'].",";
}
$url = "http://www.smsgatewaycenter.com/library/send_sms_2.php?UserName=".urlencode($this->param[userid])."&Password=".urlencode($this->param[password])."&Type=Bulk&To=".urlencode(substr($phone_list, 0, -1))."&Mask=".urlencode($this->sendname)."&Message=Hello%20World";
//echo $url;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$curl_scraped_page = curl_exec($ch);
curl_close($ch);
Now, from $phone_list, I need to loop over every 10,000 numbers. How can I achieve this?
It's been 2 days; I have tried several things and am not getting the result.
Kindly help...
NOTE: I'm going to start off with the obligatory warning about using mysql functions. Please consider switching to mysqli or PDO.
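For reference, a minimal sketch of the same SELECT using mysqli prepared statements (untested; it assumes you have a $mysqli connection object, and the credentials shown are placeholders):
// Hypothetical connection; adjust credentials to your environment
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
// A prepared statement removes the need to build the query string by hand
$stmt = $mysqli->prepare("SELECT recphone FROM temptable WHERE userid = ? AND uploadid = ?");
$stmt->bind_param('ss', $this->user_id, $uploadid);
$stmt->execute();
$stmt->bind_result($recphone);
$numbers = array();
while ($stmt->fetch()) {
    $numbers[] = $recphone;
}
$stmt->close();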
There are a number of different ways you could do this. Personally, I would reconfigure your script to fetch only 10,000 numbers at a time from the database and put that inside a loop. It might look something like this (note that for simplicity I am not updating your mysql_* calls to mysqli_*). Keep in mind I haven't actually run this, since most of your code I can't test:
// defines where the query starts from
$offset= 0;
// defines how many to get with the query
$limit = 10000;
// set up base SQL to use over and over updating offset
$baseSql = "SELECT * FROM temptable WHERE userid = '".$this->user_id."' AND uploadid='".$uploadid."' LIMIT ";
// get first set of results
$resultRestore = mysql_query($baseSql . $offset . ', '. $limit);
// now loop
while (mysql_num_rows($resultRestore) > 0)
{
$rowRestoreCount = mysql_num_rows($resultRestore);
$phone_list = "";
while($rowRestore = mysql_fetch_array($resultRestore))
{
$phone_list .= $rowRestore['recphone'].",";
}
$url = "http://www.smsgatewaycenter.com/library/send_sms_2.php?UserName=".urlencode($this->param[userid])."&Password=".urlencode($this->param[password])."&Type=Bulk&To=".urlencode(substr($phone_list, 0, -1))."&Mask=".urlencode($this->sendname)."&Message=Hello%20World";
//echo $url;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$curl_scraped_page = curl_exec($ch);
curl_close($ch);
// now update for the while loop
// increment by value of limit
$offset += $limit;
// now re-query for the next 10000
// this will continue until there are no records left to retrieve
// this should work even if there are 50,123 records (the last loop will process 123 records)
$resultRestore = mysql_query($baseSql . $offset . ', '. $limit);
}
You could also achieve this without using OFFSET and LIMIT in your SQL query. This might be a simpler approach for you:
// define our maximum chunk here
$max = 10000;
$resultRestore = mysql_query("SELECT * FROM temptable WHERE userid = '".$this->user_id."' AND uploadid='".$uploadid."' ");
$rowRestoreCount = mysql_num_rows($resultRestore);
#mysql_data_seek($resultRestore, 0);
$phone_list = "";
// hold the current number of processed phone numbers
$count = 0;
while($rowRestore = mysql_fetch_array($resultRestore))
{
$phone_list .= $rowRestore['recphone'].",";
$count++;
// when count hits our max, do the send
if ($count >= $max)
{
$url = "http://www.smsgatewaycenter.com/library/send_sms_2.php?UserName=".urlencode($this->param[userid])."&Password=".urlencode($this->param[password])."&Type=Bulk&To=".urlencode(substr($phone_list, 0, -1))."&Mask=".urlencode($this->sendname)."&Message=Hello%20World";
//echo $url;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$curl_scraped_page = curl_exec($ch);
curl_close($ch);
// now reset count back to zero
$count = 0;
// and reset phone_list
$phone_list = '';
}
}
// if we don't have # of phones evenly divisible by $max then handle any leftovers
if ($count > 0)
{
$url = "http://www.smsgatewaycenter.com/library/send_sms_2.php?UserName=".urlencode($this->param[userid])."&Password=".urlencode($this->param[password])."&Type=Bulk&To=".urlencode(substr($phone_list, 0, -1))."&Mask=".urlencode($this->sendname)."&Message=Hello%20World";
//echo $url;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$curl_scraped_page = curl_exec($ch);
curl_close($ch);
}
I notice that you are retrieving the information in $curl_scraped_page. In either of these scenarios above, you will need to account for the new loop if you're doing any processing on $curl_scraped_page.
Again, please consider switching to mysqli or PDO, and keep in mind that there are likely more efficient and flexible ways to achieve this than what you are doing here. For example, you might want to log successful sends in case your script breaks, and incorporate that into your script by selecting from the database only those numbers that have not yet received this text. This would allow you to re-run your script but only send to those who did NOT yet receive the text, rather than hitting everyone again (or maybe your SMS gateway handles that for you?).
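As a rough sketch of that logging idea (the sms_sent_log table and its columns are made up for this example; adapt them to your schema), you could record each successfully posted number and exclude logged numbers on a re-run:
// After a successful curl_exec() for a batch, log the numbers that were sent
foreach (explode(',', $phone_list) as $sent_number) {
    mysql_query("INSERT INTO sms_sent_log (uploadid, recphone) VALUES ('".$uploadid."', '".$sent_number."')");
}
// On a re-run, select only numbers that have not been logged yet
$resultRestore = mysql_query("SELECT t.* FROM temptable t
    LEFT JOIN sms_sent_log l ON l.uploadid = t.uploadid AND l.recphone = t.recphone
    WHERE t.userid = '".$this->user_id."' AND t.uploadid = '".$uploadid."' AND l.recphone IS NULL");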
EDIT
Another approach would be to load all the retrieved numbers into a single array, then chunk the array into pieces and process each chunk.
$numbers = array();
while ($rowRestore = mysql_fetch_array($resultRestore))
{
$numbers[] = $rowRestore['recphone'];
}
// split into chunks of 10,000
$chunks = array_chunk($numbers, 10000);
// loop and process the chunks
foreach ($chunks AS $chunk)
{
// $chunk will be an array, so implode it with comma to get the phone list
$phone_list = implode(',', $chunk);
// note that there is no longer a need to substr -1 the $phone_list because it won't have a trailing comma using implode()
$url = "http://www.smsgatewaycenter.com/library/send_sms_2.php?UserName=".urlencode($this->param[userid])."&Password=".urlencode($this->param[password])."&Type=Bulk&To=".urlencode($phone_list)."&Mask=".urlencode($this->sendname)."&Message=Hello%20World";
//echo $url;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$curl_scraped_page = curl_exec($ch);
curl_close($ch);
}
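If you do want to inspect each gateway response (see the note about $curl_scraped_page above), one simple option, shown here only as a sketch, is to keep the responses per chunk:
$responses = array();
foreach ($chunks as $i => $chunk) {
    $phone_list = implode(',', $chunk);
    // ... build $url and run the same curl calls as above ...
    // then keep this chunk's gateway response for later checking
    $responses[$i] = $curl_scraped_page;
}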
Related
I am looking to collect the titles of all the posts on a subreddit, and I wanted to know what would be the best way of going about this.
I've looked around and found some stuff talking about Python and bots. I've also had a brief look at the API and am unsure which direction to go in.
As I do not want to commit only to find out 90% of the way through that it won't work, I'm asking if someone could point me in the right direction regarding language and extras, like any software needed (for example, pip for Python).
My own experience is in web languages such as PHP, so I initially thought a web app would do the trick, but I'm unsure if this would be the best way and how to go about it.
So, as my question stands:
What would be the best way to collect the titles (in bulk) of a
subreddit?
Or, if that is too subjective:
How do I retrieve and store all the post titles of a subreddit?
Preferably it needs to:
do more than 1 page of (25) results
save to a .txt file
Thanks in advance.
PHP; in 25 lines:
$subreddit = 'pokemon';
$max_pages = 10;
// Set variables with default data
$page = 0;
$after = '';
$titles = '';
do {
$url = 'http://www.reddit.com/r/' . $subreddit . '/new.json?limit=25&after=' . $after;
// Set URL you want to fetch
$ch = curl_init($url);
// Set curl option of header to false (we don't need headers)
curl_setopt($ch, CURLOPT_HEADER, 0);
// Set curl option of nobody to false as we need the body
curl_setopt($ch, CURLOPT_NOBODY, 0);
// Set curl timeout of 5 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
// Set curl to return output as string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Execute curl
$output = curl_exec($ch);
// Get HTTP code of request
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
// Close curl
curl_close($ch);
// If http code is 200 (success)
if ($status == 200) {
// Decode JSON into PHP object
$json = json_decode($output);
// Set after for next curl iteration (reddit's pagination)
$after = $json->data->after;
// Loop through each post and collect its title
foreach ($json->data->children as $k => $v) {
$titles .= $v->data->title . "\n";
}
}
// Increment page number
$page++;
// Loop while current page number is less than maximum pages
} while ($page < $max_pages);
// Save titles to text file
file_put_contents(dirname(__FILE__) . '/' . $subreddit . '.txt', $titles);
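One small refinement (untested): reddit returns a null after token once you reach the last page, so you can break out of the loop early instead of requesting empty pages. Inside the do/while, right after reading $json->data->after:
if (empty($after)) {
    // no more pages to fetch
    break;
}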
I have about 15 locations in a MySQL table with lat and long information.
Using PHP and the Google Maps API, I am able to calculate the distance between 2 locations.
function GetDrivingDistance($lat1, $lat2, $long1, $long2)
{
$url = "https://maps.googleapis.com/maps/api/distancematrix/json?origins=".$lat1.",".$long1."&destinations=".$lat2.",".$long2."&mode=driving&language=en-US";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_PROXYPORT, 3128);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$response = curl_exec($ch);
curl_close($ch);
$response_a = json_decode($response, true);
$dist = $response_a['rows'][0]['elements'][0]['distance']['text'];
$time = $response_a['rows'][0]['elements'][0]['duration']['text'];
return array('distance' => $dist, 'time' => $time);
}
I want to select one location as fixed, e.g. row 1, given its lat and long:
$query = "SELECT lat, long FROM table WHERE location=1";
$locationStart = $conn->query($query);
I want to calculate the distance to all other locations in the table (the other rows) and return the outcome sorted by distance.
I tried to calculate each one separately and ended up with very long code that takes too long to fetch via the API, and I'm still not able to sort the results this way!
Any hint?
Disclaimer: This is not a working solution, nor have I tested it; it is just a quick example done off the top of my head to provide a sort of code sample to go with my comment.
My brain's still not fully warmed up, but I believe the code below should at least act as a guide to the idea I was making in my comment. I'll try to answer any questions you have when I'm free. Hope it helps.
<?php
define('MAXIMUM_REQUEST_STORE', 5); // Store 5 requests in each multi_curl_handle
function getCurlInstance($url) {
$handle = curl_init();
curl_setopt($handle, CURLOPT_URL, $url);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
return $handle;
}
$data = []; // Build up an array of Endpoints you want to hit. I'll let you do that.
// Initialise Variables
$totalRequests = count($data);
$parallelCurlRequests = [];
$handlerID = 0;
// Set up our first handler
$parallelCurlRequests[$handlerID] = curl_multi_init();
// Loop through each of our curl handles
for ($i = 0; $i < $totalRequests; ++$i) {
// We want to create a new handler/store every 5 requests. -- Goes off the constant MAXIMUM_REQUEST_STORE
if ($i % MAXIMUM_REQUEST_STORE == 0 && $i > 0) {
++$handlerID;
}
// Create a Curl Handle for the current endpoint
// ... and store it in an array for later use.
$curl[$i] = getCurlInstance($data[$i]);
// Add the Curl Handle to the Multi-Curl-Handle
curl_multi_add_handle($parallelCurlRequests[$handlerID], $curl[$i]);
}
// Run each Curl-Multi-Handler in turn
foreach ($parallelCurlRequests as $request) {
$running = null;
do {
curl_multi_exec($request, $running);
} while ($running);
}
$distanceArray = [];
// You can now pull out the data from the request.
foreach ($curl as $response) {
$content = curl_multi_getcontent($response);
if (!empty($content)) {
// Build up some form of array.
$json = json_decode($content);
$location = $json->someObject[0]->someRow->location;
$distance = $json->someObject[0]->someRow->distance;
$distanceArray[$location] = $distance;
}
}
natsort($distanceArray);
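To tie that back to your table, here is a rough sketch (untested; the table is called locations here and the column names are illustrative) of how the $data array of Distance Matrix endpoints could be built from the fixed starting row, so each request's result can later be keyed by location:
// Fixed starting point (row 1) and every other location
$start = $conn->query("SELECT lat, `long` FROM locations WHERE location = 1")->fetch_assoc();
$others = $conn->query("SELECT location, lat, `long` FROM locations WHERE location <> 1");
$data = [];        // endpoint URLs consumed by the multi-curl code above
$locationIds = []; // remembers which location each request belongs to
while ($row = $others->fetch_assoc()) {
    $data[] = 'https://maps.googleapis.com/maps/api/distancematrix/json'
        . '?origins=' . $start['lat'] . ',' . $start['long']
        . '&destinations=' . $row['lat'] . ',' . $row['long']
        . '&mode=driving&language=en-US';
    $locationIds[] = $row['location'];
}
When you fill $distanceArray, $locationIds[$i] tells you which location request $i belongs to, and the natsort() call above then orders the array by distance while keeping those keys.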
I'm new to PHP, and I want to get the latitude and longitude of a place and then add them to a MySQL database.
I'm using the Google Geocoding API to get them; this is what I do right now:
for ($i = 0; $i<1000; $i++) {
$sql = mysql_query("SELECT place_address FROM place_locator WHERE place_id =".$i, $this->db) or die('invalide request : ' . mysql_error());
if (mysql_num_rows($sql)) {
while ($place = mysql_fetch_assoc($sql)) {
//Encode the place string I got, to get rid of space
$encodePlace = str_replace(" ", "%20", $place["place_address"]);
//Use Google API
$url = 'http://maps.googleapis.com/maps/api/geocode/json?address='.$encodePlace.'&sensor=false';
//Use Curl to send the request
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response = curl_exec($ch);
$obj = json_decode($response, true);
$updateSql = mysql_query("UPDATE `place_locator`.`place_locator` SET
`latitude` = '".$obj["results"][0]["geometry"]["location"]["lat"]."',
`longitude` = '".$obj["results"][0]["geometry"]["location"]["lng"]."' WHERE `place_locator`.`place_id` = ".$i, $this->db) or die('Invalide : ' . mysql_error());
curl_close($ch);
}
}
} // end of the for loop
It works for a loop of 10, but when going to 1000 it takes a lot of time and many results don't get updated in the database.
I think maybe multi-threading could help, but I don't really know how it works. Please help me.
Thanks in advance
I had the same problem. Google limits the frequency of the requests! Try a sleep(1); in the loop and it will work, but it will need much more time.
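A rough sketch of that inside your existing while loop (the OVER_QUERY_LIMIT check is based on the status field the Geocoding API returns when requests come in too quickly):
$obj = json_decode($response, true);
// Back off and retry once if Google says we are going too fast
if ($obj['status'] == 'OVER_QUERY_LIMIT') {
    sleep(2);
    $response = curl_exec($ch);
    $obj = json_decode($response, true);
}
// Throttle between requests to stay under the rate limit
sleep(1);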
I want to find a number of links or URLs on http://public-domain-content.com,
store them in an array, and then randomly select any one from the array and display or echo it.
How can I do that in PHP?
If I understood what you're asking, you can achieve this using file_get_contents();
After using file_get_contents($url), which gives you a string, you can loop through the result string searching for spaces to tell the words apart. Count the number of words, and store the words in an array accordingly. Then just choose a random element from the array using array_rand()
However, file_get_contents() sometimes cannot fetch remote URLs (for example, when allow_url_fopen is disabled on the server).
You can work around this with the following curl-based function:
function get_url_contents($url)
{
$crl = curl_init();
$timeout = 5;
curl_setopt ($crl, CURLOPT_URL,$url);
curl_setopt ($crl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt ($crl, CURLOPT_CONNECTTIMEOUT, $timeout);
$ret = curl_exec($crl);
curl_close($crl);
return $ret;
}
http://php.net/manual/en/function.curl-setopt.php <--- Explanation about curl
Example code:
$url = "http://www.xxxxx.xxx"; //Set the website you want to get content from
$str = file_get_contents($url); //Get the contents of the website
$built_str = ""; //This string will hold the valid URLs
$strarr = explode(" ", $str); //Explode string into array(every space a new element)
for ($i = 0; $i < count($strarr); $i++) //Start looping through the array
{
$current = @parse_url($strarr[$i]); //Attempt to parse the current element of the array
if ($current) //If parse_url() did not fail (URL looks valid)
{
$built_str .= $strarr[$i] . " "; //Add the valid URL to the new string with " "
}
else
{
//URL invalid. Do something here
}
}
$built_arr = explode(" ", $built_str); //Same as we did with $strarr. This is why we added a space to $built_str every time the URL was valid: so we could use it now to split the string into an array
echo $built_arr[array_rand($built_arr)]; // Display a random element from our built array
There is also a more extended version to checking URLs, which you can explore here:
http://forums.digitalpoint.com/showthread.php?t=326016
Good luck.
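Side note: splitting on spaces only catches URLs that appear as plain text. If what you actually want are the page's links (the <a href> values), a small DOMDocument-based sketch like the following is usually more reliable (untested; it reuses the get_url_contents() function from above):
$html = get_url_contents('http://public-domain-content.com');
$dom = new DOMDocument();
@$dom->loadHTML($html); // suppress warnings caused by imperfect markup
$links = array();
foreach ($dom->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    if ($href !== '') {
        $links[] = $href;
    }
}
if (!empty($links)) {
    echo $links[array_rand($links)]; // echo one random link
}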
I have an array containing the contents of a MySQL table. I need to put each of these contents into curl multi handles so that I can execute them all simultaneously.
Here is the code for the array, in case it helps:
$SQL = mysql_query("SELECT url FROM urls") or die(mysql_error());
while($resultSet = mysql_fetch_array($SQL)){
$urls[] = $resultSet;
}
So I need to be able to send data to each URL at the same time. I don't need to get any data back, and in fact I'll be having them time out after two seconds. It only needs to send the data and then close.
My code prior to this was executing them one at a time. Here is that code:
$SQL = mysql_query("SELECT url FROM shells") or die(mysql_error());
while($resultSet = mysql_fetch_array($SQL)){
$ch = curl_init($resultSet['url'] . $fullcurl); //load the urls and send GET data
curl_setopt($ch, CURLOPT_TIMEOUT, 2); //Only load it for two seconds (Long enough to send the data)
curl_exec($ch);
curl_close($ch);
}
So my question is: How can I load the contents of the array into curl_multi_handle, execute it, and then remove each handle and close the curl_multi_handle?
You still call curl_init and curl_setopt. Then you load it into a multi_handle, and keep calling execute until it's done. This is based on the documentation at curl_multi_init. Since you're timing out in two seconds, and not processing responses, I think you can just sleep for two seconds at a time. curl_multi_select might be better if you actually need to process the responses.
$SQL = mysql_query("SELECT url FROM shells") ;
$mh = curl_multi_init();
$handles = array();
while($resultSet = mysql_fetch_array($SQL)){
//load the urls and send GET data
$ch = curl_init($resultSet['url'] . $fullcurl);
//Only load it for two seconds (Long enough to send the data)
curl_setopt($ch, CURLOPT_TIMEOUT, 2);
curl_multi_add_handle($mh, $ch);
$handles[] = $ch;
}
// Create a status variable so we know when exec is done.
$running = null;
//execute the handles
do {
// Call exec. This call is non-blocking, meaning it works in the background.
curl_multi_exec($mh,$running);
// Sleep while it's executing. You could do other work here, if you have any.
sleep(2);
// Keep going until it's done.
} while ($running > 0);
// For loop to remove (close) the regular handles.
foreach($handles as $ch)
{
// Remove the current array handle.
curl_multi_remove_handle($mh, $ch);
}
// Close the multi handle
curl_multi_close($mh);
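If you would rather not hard-code the sleep(2), the exec loop can be driven by curl_multi_select instead, along these lines (a sketch based on the curl_multi_* documentation):
$running = null;
do {
    curl_multi_exec($mh, $running);
    // Wait up to one second for activity on any handle
    // instead of sleeping for a fixed amount of time
    curl_multi_select($mh, 1.0);
} while ($running > 0);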
If I were you, I would write a MySQL class and a curl class.
That keeps things much cleaner overall.
First I would create a method which returns all URLs from a passed MySQL result.
Something like:
public function getUrls($mysql_fetch_array)
{
$urls = array();
foreach($mysql_fetch_array as $result)
{
$urls[] = $result["url"];
}
return $urls;
}
Then you could write a method like curlSend($url, $param):
// remember you will have to adapt this; I don't know your full code, so this is just
// one way you could do it
public function curlSend($url, $param = "")
{
$ch = curl_init($url . $param); //load the url and send GET data
curl_setopt($ch, CURLOPT_TIMEOUT, 2); //Only load it for two seconds (Long enough to send the data)
curl_exec($ch);
curl_close($ch);
}
public function send()
{
$urls = $this->getUrls($this->mysql->result($sql));
foreach($urls as $url)
{
$this->curlSend($url);
}
}
Now this is how you could do it.