We're interested in using Google Custom Search in our project, mostly because it's amazing at handling conjugation and correcting misspelled words.
We know that it can return data in JSON or XML, and we're fine with that. But we can't find an answer to this question:
Can we use that conjugation and mistake correction to search our own database/API?
For example, if you typed drnks with no alcohol it would automatically be corrected to drinks with no alcohol, and then our database would be searched like this:
http://example.com?search=drinks&alcohol=0, and it could respond like this:
{
    "coke": {
        "alcohol": 0,
        "calories": 300,
        "taste": "awesome"
    },
    "pepsi": {
        "alcohol": 0,
        "calories": 300,
        "taste": "meh"
    }
}
And then it would return these two results, in some form.
Solutions using the paid version are fine.
If it's possible to do this, could you provide me with a simple example?
Google provides a REST API for its Custom Search service. You can query it from your server to find out whether there is a better spelling for the search terms, and then use that corrected spelling to query your internal database.
In my code I'm using Guzzle, an HTTP client library, to avoid cURL's ugly and verbose boilerplate, but feel free to use cURL if you really need to.
// Composer's autoloader to load the REST client library
require "vendor/autoload.php";

$api_key = "...";       // Google API key, looks like random text
$search_engine = "..."; // search engine ID, looks like "<numbers>:<text>"
$query = "drnks with no alcohol"; // the original search query

// REST client object with some defaults,
// avoids specifying them each time we make a request
$client = new GuzzleHttp\Client([
    "base_url" => "https://www.googleapis.com",
    "defaults" => [
        "query" => [
            "key" => $api_key,
            "cx" => $search_engine,
            "fields" => "spelling(correctedQuery)",
        ],
    ],
]);

try {
    // the actual request, with the search query
    $resp = $client->get("/customsearch/v1", ["query" => ["q" => $query]])->json();

    // whether Google suggests an alternative spelling
    if (isset($resp["spelling"]["correctedQuery"])) {
        $correctedQuery = $resp["spelling"]["correctedQuery"];
        // now use that corrected spelling to query your internal DB
        // or do anything else really, the query is yours now
        echo $correctedQuery;
    } else {
        // Google doesn't have any corrections, use the original query then
        echo "No corrections found";
    }
} catch (GuzzleHttp\Exception\TransferException $e) {
    // Something bad happened, log the exception but act as if
    // nothing is wrong and process the user's original query
    echo "Something bad happened";
}
Here are some instructions to obtain your API key, and the custom search engine ID can be obtained from the control panel.
If you look carefully you can see I've specified the fields query parameter to request a partial response containing only the eventual spelling suggestion, which should (hopefully) give better performance since we don't need anything else from the response. Feel free to change or remove it if you do need the complete response.
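For reference, the request with that fields filter looks roughly like this (the key, cx and q values are placeholders), and the trimmed-down JSON that comes back contains nothing but the spelling block:

GET https://www.googleapis.com/customsearch/v1?key=API_KEY&cx=SEARCH_ENGINE_ID&q=drnks+with+no+alcohol&fields=spelling(correctedQuery)

{
    "spelling": {
        "correctedQuery": "drinks with no alcohol"
    }
}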
Note that Google has no clue about what's in your database, so the spelling corrections will only be based on the public data Google has about your website. I don't think there is a way to make Google aware of your internal DB, and it probably wouldn't be a good idea anyway.
Finally, make sure to handle rate limits and API failures gracefully by still giving the user the possibility to search using their original query: just act as if nothing went wrong, and only log the error for later review.
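As a minimal sketch of that fallback (the correctQuery() helper name is mine, not part of any library; it just wraps the same Guzzle call shown above), a failure or a missing suggestion simply falls back to the original query:

// Returns Google's suggested spelling, or the original query if there is
// no suggestion or the API call fails (rate limit, network error, ...).
function correctQuery(GuzzleHttp\Client $client, $query)
{
    try {
        $resp = $client->get("/customsearch/v1", ["query" => ["q" => $query]])->json();
        if (isset($resp["spelling"]["correctedQuery"])) {
            return $resp["spelling"]["correctedQuery"];
        }
    } catch (GuzzleHttp\Exception\TransferException $e) {
        // log the failure for later review, but don't block the search
        error_log("Spelling lookup failed: " . $e->getMessage());
    }
    return $query;
}

// $client is the Guzzle client configured earlier
$term = correctQuery($client, "drnks with no alcohol");
// ... now query your own database with $term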
I am trying to get a reply text message from Clickatell using their REST API. When their system posts to my page and I call the parseReplyCallback function, the result seems to be null, or I am not sure how to get the variables it is returning. What I would like to do is insert all of the returned variables into a SQL database so I can use them elsewhere.
I have tried quite a few ways of getting the variables, such as $_POST, $results['text'], $results->text, and so forth, but each time I can't seem to get any information out of it. I can't just var_dump or anything because I can't see any backend or console, so I am pretty much flying blind, hoping someone else is using this system and has it working fine.
require __DIR__.'/clickatell/src/Rest.php';

use clickatell\ClickatellException;
use clickatell\Rest;

$Rest = new Rest("j8VKw3sJTZuVfQGVC7jdhA");

// Incoming traffic callbacks (MO/Two Way callbacks)
$Rest->parseReplyCallback(function ($result) {
    //mysqli_query($con,"INSERT INTO `SMSCHAT` (`text`) VALUES ('$result')");
    $mesageId       = mysqli_real_escape_string($con, $result['messageId']);
    $text           = mysqli_real_escape_string($con, $result['text']);
    $replyMessageId = mysqli_real_escape_string($con, $result['replyMessageId']);
    $to             = mysqli_real_escape_string($con, $result['toNumber']);
    $from           = mysqli_real_escape_string($con, $result['fromNumber']);
    $charset        = mysqli_real_escape_string($con, $result['charset']);
    $udh            = mysqli_real_escape_string($con, $result['udh']);
    $network        = mysqli_real_escape_string($con, $result['network']);
    $keyword        = mysqli_real_escape_string($con, $result['keyword']);
    $timestamp      = mysqli_real_escape_string($con, $result['timestamp']);
    //do mysqli_query
});
I'd like it to break the result into individual variables (because I plan on doing other things such as an auto-reply, etc.) and insert it into the SQL database properly escaped.
My first test, where I put the whole result into the text field, either doesn't create the table entry at all or gives me a blank one.
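(For what it's worth, the kind of scrubbed insert described above would look roughly like the sketch below; the $con connection and every column name except text are assumptions on my part, not something from Clickatell's documentation.)

$Rest->parseReplyCallback(function ($result) use ($con) {
    // a prepared statement does the escaping for us
    $stmt = mysqli_prepare(
        $con,
        "INSERT INTO `SMSCHAT` (`message_id`, `text`, `from_number`, `to_number`) VALUES (?, ?, ?, ?)"
    );
    mysqli_stmt_bind_param(
        $stmt,
        "ssss",
        $result['messageId'],
        $result['text'],
        $result['fromNumber'],
        $result['toNumber']
    );
    mysqli_stmt_execute($stmt);
    mysqli_stmt_close($stmt);
});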
From a Clickatell point of view, although we understand what you're asking - it's unfortunately outside the scope of support that we offer on our products.
If you would like more information on our REST API functionality, please feel free to find it here: https://www.clickatell.com/developers/api-documentation/rest-api-reply-callback/
If you don't succeed in setting up the callbacks, please feel free to log a support ticket here: https://www.clickatell.com/contact/contact-support/ and one of our team members will reach out and try to assist where possible.
I've generated a PHP client library for Magento2 using swagger-codegen. I'm able to connect to Magento and am just trying some methods to see how usable the generated client is. It seems like I'm either missing something or the swagger spec published by Magento is not quite there yet.
In particular, invoking the various list operations seems to be limited by design, and broken in the generated swagger client. Take for example the operation to list products, /V1/products. Swagger UI indicates this can be parameterized with GET parameters (and in fact it seems you must - when I try calling it with no parameters, Magento returns an HTTP 400). Here's the sample code from the Markdown documentation that Swagger generated along with the client library:
try {
    $result = $api_instance->catalogProductRepositoryV1GetListGet(
        $search_criteria_filter_groups_filters_field,
        $search_criteria_filter_groups_filters_value,
        $search_criteria_filter_groups_filters_condition_type,
        $search_criteria_sort_orders_field,
        $search_criteria_sort_orders_direction,
        $search_criteria_page_size,
        $search_criteria_current_page
    );
    print_r($result);
} catch (Exception $e) {
    echo 'Exception when calling CatalogProductRepositoryVApi->catalogProductRepositoryV1GetListGet: ',
        $e->getMessage(), "\n";
}
The first thing I notice is that these parameters only allow a single entry for each field, when the API actually allows you to define multiple filter_groups, multiple filters per filter_group, and so on. This great blog post helped me understand how the API is supposed to work.
Stepping back, though, and supposing that a limit of one filter_group with one filter is acceptable, I tried taking the generated client on faith and putting together a simple call:
// Fetch all products with a contrived like query
$oMageClient = new Swagger\Client\Api\CatalogProductRepositoryVApi($oApiClient);
$result = $oMageClient->catalogProductRepositoryV1GetListGet('name', '%', 'like');
Magento complains with an HTTP 400, and it's because of the query parameters the generated client sends:
searchCriteria[filterGroups][][filters][][field]=name&searchCriteria[filterGroups][][filters][][value]=%&searchCriteria[filterGroups][][filters][][conditionType]=like
What it's done is split each parameter into a different filter_group... Sure enough, when I look at the generated Swagger\Client\Api\CatalogProductRepositoryVApi::catalogProductRepositoryV1GetListGetWithHttpInfo method, I find the culprit where the query params are set. By changing
// query params
if ($search_criteria_filter_groups_filters_field !== null) {
$queryParams['searchCriteria[filterGroups][][filters][][field]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_field);
}// query params
if ($search_criteria_filter_groups_filters_value !== null) {
$queryParams['searchCriteria[filterGroups][][filters][][value]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_value);
}// query params
if ($search_criteria_filter_groups_filters_condition_type !== null) {
$queryParams['searchCriteria[filterGroups][][filters][][conditionType]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_condition_type);
}
to
// query params
if ($search_criteria_filter_groups_filters_field !== null) {
$queryParams['searchCriteria[filterGroups][0][filters][0][field]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_field);
}// query params
if ($search_criteria_filter_groups_filters_value !== null) {
$queryParams['searchCriteria[filterGroups][0][filters][0][value]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_value);
}// query params
if ($search_criteria_filter_groups_filters_condition_type !== null) {
$queryParams['searchCriteria[filterGroups][0][filters][0][conditionType]'] = $this->apiClient->getSerializer()->toQueryValue($search_criteria_filter_groups_filters_condition_type);
}
I'm able to get a response back from Magento. So I have a couple of questions:
Is there an issue with the swagger JSON Magento is publishing that is causing the generated Swagger code to be buggy? Or is there some other step I've messed up in generating the client?
It feels like something's not right, because if you look at the blog article and the generated Swagger documentation, Swagger suggests the filter_groups parameter is a string, when really it should be an array of objects.
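As an aside, if the generated client keeps getting in the way, a workaround (just a sketch, not part of the generated library) is to build the searchCriteria query string yourself with http_build_query, which produces the explicit numeric indices Magento expects and also lets you send several filters or filter groups:

// Build Magento-style searchCriteria parameters by hand.
$params = [
    'searchCriteria' => [
        'filterGroups' => [
            [   // filters inside one group are OR-ed together, groups are AND-ed
                'filters' => [
                    ['field' => 'name', 'value' => '%', 'conditionType' => 'like'],
                ],
            ],
        ],
        'pageSize'    => 20,
        'currentPage' => 1,
    ],
];

// Produces searchCriteria[filterGroups][0][filters][0][field]=name&... (URL-encoded)
$queryString = http_build_query($params);
echo urldecode($queryString);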
Information
I've started using the Asana API to build our own task overview in our CMS. I found an API wrapper on GitHub which helps me a great deal with this.
As I've mentioned in an earlier question, I wanted to get all tasks for a certain user. I've managed to do this using the code below.
public function user($id)
{
    if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) &&
        ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest')) {
        $this->layout = 'ajax';
    }

    $asana = new Asana(array(
        'apiKey' => 'xxxxxxxxxxxxxxxxxxxx'
    ));

    $results = json_decode($asana->getTasksByFilter(array(
        'assignee'  => $id,
        'workspace' => 'xxxxxxxxxx'
    )));

    if ($asana->responseCode != '200' || is_null($results)) {
        throw new \Exception('Error while trying to connect to Asana, response code: ' . $asana->responseCode, 1);
    }

    $tasks = array();

    foreach ($results->data as $task) {
        $result = json_decode($asana->getTaskTags($task->id));
        $task->tags = $result->data;
        $tasks[] = $task;
    }

    $user = json_decode($asana->getUserInfo($id));

    if ($asana->responseCode != '200' || is_null($user)) {
        throw new \Exception('Error while trying to connect to Asana, response code: ' . $asana->responseCode, 1);
    }

    $this->render("tasks", array(
        'tasks' => $tasks,
        'title' => 'Tasks for ' . $user->data->name
    ));
}
The problem
The above works fine, except for one thing. It is slower than a booting Windows Vista machine (very slow :) ). If I include the tags, it can take up to 60 seconds before I get all results. If I do not include the tags it takes about 5 seconds which is still way too long. Now, I hope I am not the first one ever to have used the Asana API and that some of you might have experienced the same problem in the past.
The API itself could definitely be faster, and we have some long-term plans around how to improve responsiveness, but in the near-to-mid-term the API is probably going to remain the same basic speed.
The trick to not spending a lot of time accessing the API is generally to reduce the number of requests you make and only request the data you need. Sometimes, API clients don't make this easy, and I'm not familiar with the PHP client specifically, but I can give an example of how this would work in general with just the plain HTTP queries.
So right now you're doing the following in pseudocode:
GET /tasks?assignee=...&workspace=...
foreach task
GET /task/.../tags
GET /users/...
So if the user has 20 tasks (and real users typically have a lot more than 20 tasks - if you only care about incomplete tasks and tasks completed in the last, say, week, you could use ?completed_since=<DATE_ONE_WEEK_AGO>), you've made 22 requests. And because it's synchronous, you wait a few seconds for each and every one of those requests before you start the next one.
Fortunately, the API has a parameter called ?opt_fields that allows you to specify the exact data you need. For example, let's suppose that for each task, all you really want is to know the task ID, the task name, the tags it has and their names. You could then request:
GET /tasks?assignee=...&workspace=...&opt_fields=name,tags.name
(Each resource included always brings its id field)
This would allow you to get, in a single HTTP request, all the data you're after. (Well, the user lookup is still separate, but at least that's just 1 extra request instead of N). For more information on opt_fields, check out the documentation on Input/Output Options.
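To make that concrete in PHP, the single request above could be issued with plain cURL roughly like this (the legacy API-key Basic auth matches what the wrapper in the question appears to use; the IDs are placeholders):

$apiKey      = 'xxxxxxxxxxxxxxxxxxxx'; // same placeholder key as in the question
$assigneeId  = '...';                  // placeholder user ID
$workspaceId = '...';                  // placeholder workspace ID

// One request that returns every assigned task together with its tags.
$url = "https://app.asana.com/api/1.0/tasks?" . http_build_query(array(
    'assignee'   => $assigneeId,
    'workspace'  => $workspaceId,
    'opt_fields' => 'name,tags.name',
));

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// legacy API keys used Basic auth: the key as username, empty password
curl_setopt($ch, CURLOPT_USERPWD, $apiKey . ':');
$response = json_decode(curl_exec($ch), true);
curl_close($ch);

foreach ($response['data'] as $task) {
    // $task['name'], $task['tags'][0]['name'], ... all fetched in one round trip
}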
Hope that helps!
Apologies in advance, since I may not know the terminology for the Salesforce API. I just started programming a connector to interact with Salesforce and I am stuck.
I have a requirement where, each time a new entry is added to the Leads section, I have to retrieve a couple of fields (Firstname and Product Code) and pass them to a different piece of software that runs on PHP.
<?php
require "conf/config_cleverbridge_connector.inc.php";
require "include/lc_connector.inc.php";

// Start of Main program

// Read basic parameters
if ($LC_Username === "")
{
    $LC_Username = readParam("USER");
}
if ($LC_Password === "")
{
    $LC_Password = readParam("PASSWORD");
}

$orderID    = "";
$customerID = substr(readParam("PURCHASE_ID"), 0, 10);
$comment    = readParam("EMAIL")."-".readParam("PURCHASE_ID");

// Create product array
$products = array();
$itemID   = readParam("INTERNAL_PRODUCT_ID");
$quantity = 1;
if (!ONCE_PER_PURCHASED_QUANTITY)
{
    $quantity = readParam("QUANTITY");
}

// Add product to the product array
$products[] = array(
    "itemIdentification" => $itemID,
    "quantity"           => $quantity,
);

// Create the order
$order = array(
    "orderIdentification"    => $orderID,
    "customerIdentification" => $customerID,
    "comment"                => $comment,
    "product"                => $products,
);

// Calling webservice
$ticket = doOrder($LC_Username, $LC_Password, $order);
if ($ticket)
{
    Header("HTTP/1.1 200 Ok");
    Header("Content-Type: text/plain");
    print TICKET_URL.$result->order->ticketIdentification;
    exit;
}
else
{
    $error = "No result from WSConnector_doOrder";
    trigger_error($error, E_USER_WARNING);
    printError(500, "Internal Error.");
    exit;
}
// End of Main program
?>
This is the code that I was given and have to work with, and it is hosted on a different remote server.
I am very, very new to Salesforce and I am not really sure how to trigger calling this PHP file on a remote site.
The basic idea is:
1. New entry in Lead is created.
2. Immediately 2 fields (custID and prodID) are sent to this PHP file I have pasted above (some of the variables are different)
3. This does its processing and sends 2 fields back to salesforce.
Any help or guidance is appreciated. Even links to read up on is okay as I am completely clueless.
PS: I have another example where it makes use of JSON Messages if that may make any difference.
Thanks
I'll repost the links from my comment :)
https://salesforce.stackexchange.com/questions/23977/is-it-possible-to-get-the-record-id
Web hook in salesforce?
If your PHP endpoint is visible on the open web (not part of some intranet or just your own localhost), then the simplest thing to do would be to send an Outbound Message from Salesforce. No coding required on the Salesforce side, just an XML document you'll have to parse on the PHP side. Plus it will automatically attempt to resend the messages if the host is unreachable...
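As a very rough sketch of the PHP side of that (the field elements available depend on which fields you select when defining the outbound message, so treat the element names below as assumptions):

// Outbound messages arrive as a SOAP POST; read and parse the raw XML.
$raw = file_get_contents('php://input');
$dom = new DOMDocument();
$dom->loadXML($raw);

// Grab the fields we care about, ignoring namespaces.
$getText = function ($tag) use ($dom) {
    $nodes = $dom->getElementsByTagNameNS('*', $tag);
    return $nodes->length ? $nodes->item(0)->nodeValue : null;
};

$leadId    = $getText('Id');          // record ID of the Lead
$firstName = $getText('FirstName');   // only present if selected in the OM definition
$sessionId = $getText('SessionId');   // can be reused to call back into the APIs

// Salesforce expects an ACK response, otherwise it keeps retrying.
header('Content-Type: text/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">
      <Ack>true</Ack>
    </notificationsResponse>
  </soapenv:Body>
</soapenv:Envelope>';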
If your app can't be accessed from SF servers, then I think your PHP app will have to be the "actor": querying SF every X minutes for new Leads, or maybe subscribing to the Streaming API... This means you'd have to store credentials to SF in your PHP app and remember to either change the password periodically or tick the "password never expires" checkbox on the integration user's profile.
So you're getting the notification, you generate your tickets, and it's time to send them back. Do you want to pretend the update of the Lead was done by the person that created it, or do you want to see "last modified by: Integration User"? The outbound message can contain a session id which you can use to act as the person who initiated the action (created the lead and fired the workflow) - at least until they log out or the session times out.
For the message back you can use the SOAP or REST Salesforce APIs - read the docs to figure out how to send an update command (and, if you want to make it clear it was done by a special user associated with this PHP app, how to log in to the APIs). I think the user's profile must have "API enabled" ticked before you could reuse somebody's session, so maybe it's better to have a dedicated account for integrations like that...
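For the REST route, the update itself boils down to a PATCH against the Lead's record URL with a JSON body; in this sketch the instance URL, API version and the Ticket__c field are placeholders (Ticket__c is borrowed from the query further down):

// Update a Lead via the Salesforce REST API using a session id / access token.
$instanceUrl = 'https://yourInstance.salesforce.com';  // placeholder
$leadId      = '00Qxxxxxxxxxxxxxxx';                   // placeholder Lead record id
$sessionId   = '...';                                  // session id from the OM, or an OAuth token
$ticketUrl   = 'https://tickets.example.com/123';      // value to write back

$ch = curl_init($instanceUrl . '/services/data/v37.0/sobjects/Lead/' . $leadId);
curl_setopt_array($ch, array(
    CURLOPT_CUSTOMREQUEST  => 'PATCH',
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $sessionId,
        'Content-Type: application/json',
    ),
    CURLOPT_POSTFIELDS     => json_encode(array('Ticket__c' => $ticketUrl)),
));
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // 204 means the update succeeded
curl_close($ch);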
Another thing to keep in mind with outbound messages is to ignore the messages sent from sandboxes, so that if somebody makes a test environment you will not call your "production" database of tickets. You can also remember to modify the outbound message and remote site setting every time a sandbox is made, so you'll have "prod talking to prod, test talking to test". I know you can include the user's session id in the OM - so maybe you can also add the organization's id (for production it'll stay the same; every new sandbox will have a new id).
The problem with this approach is that it might not scale. If 1000 leads are inserted in one batch (for example with Data Loader), you'll get spammed with 1000 outbound messages. Your server must be able to handle such load... and it also means you're using 1 API request to send every single update back. You can check the limit of API requests in Setup -> Company Information. Developer Edition has this limit very low, sandboxes are better, production is best (it also depends on how many user licenses you have bought). That's why I've asked about batching them up.
More coding but also more reliable would be to ask SF for changes every X minutes (Streaming API? A normal query? Check the "web hook" answer) and send an update of all these records in one go: SELECT Id, Name FROM Lead WHERE Ticket__c = null (note there's nothing about AND LastModifiedDate >= :lastTimeIChecked)...
I want to use an idea that I have seen on another website where I enter a "keyword", press Enter, and it then takes the client to a specific page or website.
I have seen something like this on http://qldgov.remserv.com.au. On the right side there is a field called "My Employer"; type in "health" for example and you will be provided with relevant content.
Essentially I have client-branded mini sites, and we want to assign a "keyword" to each client brand so all of their employees can go to their site by entering this one keyword, without all of them needing individual logins. I want to be able to link to a URL that I can define in some manner.
I have looked at the source code of the site mentioned above and see they are using a form, but I am not sure how they have assigned the keywords or if it's even possible to do this without a database or anything like that. I'm trying to keep it as simple as possible as I am not a PHP/Java expert by any means.
Any help would be appreciated, even if its not code but an idea of the direction I need to go in to make this work. Thanks in advance!! :-)
The easiest way in my eyes would be to define an array that contains all of the keywords and their respective URLs client-side (in JS). For example:
var array = { 'health' : '/health.php', 'sport' : '/swimming.php' };
You would then get the user input on submit and, if it exists in the array, modify window.location appropriately.
if (array[user_input] !== undefined) {
    window.location = array[user_input];
} else {
    alert('not found');
}
If the user supplies health they will be redirected to /health.php; if they supply sport they will be redirected to /swimming.php (JSFiddle). Alternatively you can handle the request server-side (PHP, Java), but this may not be worth the effort.
Good luck.
By using PHP (rather than JavaScript), you're not relying on JavaScript being enabled, and it's more SEO-friendly.
Firstly, you're going to need either some sort of database or a list of keywords/URLs:
$keywords = array('keyword1' => 'path/to/load.php', 'another keyword' => 'another/path');
Then you'll need a basic form
<form action="loadkeyword.php">
<input name="query">
<button type="submit">Go</button>
</form>
Then in loadkeyword.php
$keywords = array('keyword1' => 'path/to/load.php', 'another keyword' => 'another/path');
$query = $_GET['query'];
if (isset($keywords[$query])) {
    $url = $keywords[$query];
    header("HTTP/1.0 301 Moved Permanently");
    header('Location: '.$url);
    exit;
} else {
    header("HTTP/1.1 404 Not Found");
    die('unable to locate keyword');
}
If you have a large list of keywords, I would suggest using a database instead of an array to keep track of your keywords.
The site you link to is doing it server-side, either via a keyword-list that matches content or a search function (I suspect the latter).
There are a few different ways you could achieve your goal, all of them to do with matching keywords to content and then redirecting, either with an array, a list, or a database - the principle will be the same.
However, I would respectfully suggest this may not be the best solution anyway. My reasoning is that (based upon the example you give) you're effectively making your users guess which keyword matches which minisite (even if you have many keywords for each site). Why not just have some kind of menu to choose from (i.e. a selector with a list of minisites)?