Making an Apex trigger call a web service - is it possible? - php

I'm trying to write a trigger on the Salesforce Opportunity object that has to send the updated StageName to an external web service, using a predefined username and password.
Here is some sample code:
trigger isStageChanged on Opportunity (after update) {
    for (Opportunity o : Trigger.new) {
        Opportunity beforeUpdate = System.Trigger.oldMap.get(o.Id);
        if (beforeUpdate.StageName != o.StageName) {
            // Call WS
        }
    }
}
"Call ws" means to use link like:
http://mydomain.com/webservice/index.php?username=MY_USERNAME&password=MY_PASS&stage=opportunityNewStage
The question is: where can I store MY_USERNAME and MY_PASS if I want to use this trigger for different Salesforce customers and let them configure the values after installation?
When I read the Apex Code Developer's Guide, I read that I can use the Web Services API to deploy my trigger, and that this way I could store the username and password on my server. Is that possible?
On my server I'm using PHP.
I hope my question is clear.

You should probably store the username and password using custom settings. This will allow each customer's administrator to easily configure the username and password with the values they provide.
Note that you can't make a callout from inside a trigger, as it would hold up the database operation; instead you need to put your callout in a public/global static method of another class, using the @future annotation:
@future(callout=true)
public static void doMyCallout(List<Id> opportunityIds) {
    // load your Opportunities if need be, read the stored credentials,
    // then do the callout
}
@future indicates that the method runs asynchronously: you can call it from the trigger, but it will run later in its own context and so won't hold up the database operation in progress.
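On the PHP side, the index.php endpoint from the question could be as simple as the following sketch; the parameter names come from the URL above, while the credential check and what you do with the stage are placeholders:
<?php
// index.php - minimal sketch of the receiving endpoint.
// The credential check and the handling of the stage are placeholders.
if (($_GET['username'] ?? '') !== 'MY_USERNAME'
    || ($_GET['password'] ?? '') !== 'MY_PASS') {
    http_response_code(401);
    exit('Invalid credentials');
}

$stage = $_GET['stage'] ?? '';
// ... persist or process the new StageName here ...
echo 'OK';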

Related

How can I do a partial integration test (phpunit)?

I am working on an extension (app) for nextcloud (which is based on Symfony). I have a helper class to extract data from the request that is passed by the HTTP server to PHP. A much-reduced version could look something like this (to get the point across):
<?php
namespace OCA\Cookbook\Helpers;

class RequestHelper {
    public function getJson() {
        if ($_SERVER['Request_Method'] === 'PUT') { // Notice the typo: should be REQUEST_METHOD
            $raw = file_get_content('php://input'); // another intentional typo: should be file_get_contents
            return json_decode($raw, true);
        } else { /* ... */ }
    }
}
Now I want to test this code. Of course, I can do some unit testing and mock the $_SERVER variable. Potentially I would have to extract the file_get_content call into its own method and do a partial mock of that class. I get that. The question is: how much is such a test worth?
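For illustration, a PHPUnit-9-style partial mock of that kind might look roughly like this, once the typos are fixed and the raw read is extracted into a hypothetical readRawInput() method:
<?php
use PHPUnit\Framework\TestCase;
use OCA\Cookbook\Helpers\RequestHelper;

class RequestHelperTest extends TestCase {
    public function testGetJson(): void {
        $_SERVER['REQUEST_METHOD'] = 'PUT';

        // Partial mock: only the hypothetical readRawInput() method
        // (the extracted file_get_contents call) is replaced.
        $helper = $this->getMockBuilder(RequestHelper::class)
            ->onlyMethods(['readRawInput'])
            ->getMock();
        $helper->method('readRawInput')
            ->willReturn('{"name":"My recipe"}');

        $this->assertSame(['name' => 'My recipe'], $helper->getJson());
    }
}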
If I just mimic the behavior of that class in my test cases (white-box testing), I might even copy and paste the typo I intentionally included here. And while this code is an MWE, real code gets more complex and should be compatible with different HTTP servers (like Apache, nginx, lighttpd, etc.).
So, ideally, I would like to do some automated testing in my CI process that uses a real HTTP server with different versions/programs to see if the integration is working correctly. Welcome to integration testing.
I could now run the nextcloud server with my extension included in a test environment and test some real API endpoints. This is more like functional testing as everything is tested (server, NC core, my code and the DB):
phpunit <---> HTTP server <---> nextcloud core <---> extension code <---> DB
                                                           ^
                                                           |
                                                           +--> RequestHelper
Apart from speed, I would have to take care to cover all possible paths through the class RequestHelper (the device under test, DUT). This seems a bit brittle to me in the long run.
All I could think of was adding a simple endpoint only for testing the functionality of the DUT, something like a pure echo endpoint. I would not feel comfortable having something like that lying around in production.
I am therefore looking for an integration test with a partial mock of the app (mocking the business logic + DB) to test the route between the HTTP server and my DUT. In other words, I want to test the integration of the HTTP server, nextcloud core, my controller, and the DUT above without any business logic of my app.
How can I realize such test cases?
Edit 1
As I gathered from the comments that the problem statement was not obvious enough, I will try to explain a bit more, at the cost of the simplicity of the use case.
There is the nextcloud core, which can be seen as a framework from the perspective of the app. There can be controller classes that serve as targets for URL/API endpoints. For example, a GET on /apps/cookbook/recipe/15 will fetch the recipe with id 15. Similarly, a PUT to the same URL can carry JSON to update that recipe.
Inside the corresponding controller, the structure looks like this:
class RecipeController extends Controller {
    /* The PUT /apps/cookbook/recipe/{id} endpoint is routed here */
    public function update($id) {
        $json = $this->requestHelper->getJson(); // Call to helper

        // Here comes the business logic,
        // aka calls to other classes that will save and update the state
        // and perform the DB operations
        $this->service->doSomething($json);

        // Return an answer if the operation terminated successfully
        return new JsonResponse(['state' => 'ok'], 200);
    }
}
I want to test the getJson() method against different servers. Here I want to mock at least $this->service->doSomething($json) to be a no-op. Ideally, I would like to spy on the resulting $json variable and make my assertions on exactly that.
No doubt, in my test class it would look something like this:
class TestResponseHandler extends TestCase {
    public function setUp(): void { /* Set up the HTTP daemon as a system service */ }

    public function testGetJson() {
        // Create a Guzzle client
        $client = new Client([
            'base_uri' => 'http://localhost:8080/apps/cookbook',
        ]);

        // Run the API call
        $headers = ...;
        $body = ...;
        $response = $client->put('recipe/15', ['headers' => $headers, 'body' => $body]);

        // Check the response body
        // ....
    }
}
Now, I have two PHP interpreters running: first, the one (A) that runs phpunit (and makes the HTTP request); second, the one (B) associated with the HTTP server listening on localhost:8080.
As the code above with the call to getJson() runs inside PHP interpreter (B), outside the phpunit instance, I cannot mock it directly, as far as I understand. I would have to change the main app's code if I am not mistaken.
Of course, I could provide (more or less) useful data in the test function and let the service->doSomething() method do its job, but then I would no longer be testing only a subset of functions; I would be doing functional or system testing. It also makes it harder to write well-aimed test cases when all these side effects have to be taken into account.

Laravel 7 - Use REST API instead of a database

I am using a REST API to store/retrieve my data, which lives in a Postgres database. The API is not Laravel; it's an external service!
Now I want to create a website with Laravel (framework version 7.3.0) and I'm stuck on how to implement the API calls correctly.
For example: I want to have a custom user provider with which users can log in on the website, but the validation of the provided credentials is done by the API, not by Laravel.
How do I do that?
Just make a Registration controller and a Login controller with "php artisan make:controller ControllerName" and write the authentication logic there.
In previous versions of Laravel there was a command, "php artisan make:auth", that generated everything needed for these operations. In Laravel 7.0 you instead need to install a package called laravel/ui.
Run "composer require laravel/ui" to install the package.
Then run "php artisan ui bootstrap --auth".
This command scaffolds the whole registration (signup) and login system for you.
And in order to work with REST, you need to know the REST (HTTP) verbs. Learn about GET, POST, PUT, PATCH, and DELETE requests and how to make those requests with PHP and Laravel. Learn about JSON parsing, encoding, and decoding. Then you can work with REST easily, and without any template code from other packages.
Thank you so much. I hope this answer gives you some new information. Thanks again.
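For the log-in part specifically, a minimal sketch of a login action that delegates credential validation to the external API might look like this; the endpoint URL and response shape are assumptions, and Laravel 7's built-in Http client is used:
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Http;

class LoginController extends Controller
{
    public function login(Request $request)
    {
        $credentials = $request->validate([
            'email' => 'required|email',
            'password' => 'required',
        ]);

        // Let the external API validate the credentials.
        // https://api.example.com/login is a placeholder endpoint.
        $response = Http::post('https://api.example.com/login', $credentials);

        if (!$response->ok()) {
            return back()->withErrors(['email' => 'Invalid credentials.']);
        }

        // Keep whatever the API returns (e.g. a token) in the session so
        // later requests to the API can be authenticated with it.
        $request->session()->put('api_token', $response->json()['token'] ?? null);

        return redirect()->intended('/');
    }
}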
Edit:
This might not be the best way, but this is what I did at the time. I tried curl and Guzzle to build the request with the session cookie and everything in the header to make it look like a request from a web browser, but couldn't make it work.
I used the web socket's channel id for the browser I wanted the changes to happen in, concatenated it with the other things, then encrypted it with encrypt($string). After that, I used the encrypted string to generate a QR code.
The mobile app (which was already logged in as an authenticated user) scanned it and made a POST request with that QR string and other data. Passport took care of the authentication part of this request. After decrypting the QR string I had the web socket's channel id.
Then I broadcast on that channel with the proper event and data, caught that broadcast in the browser, and reloaded the page with JavaScript.
/*... processing other data ...*/
$broadcastService = new BroadcastService();
$broadcastService->trigger($channelId, $eventName, encrypt($data));
/*... returned response to the mobile app ...*/
My BroadcastService:
namespace App\Services;

use Illuminate\Support\Facades\Log;
use Pusher\Pusher;
use Pusher\PusherException;

class BroadcastService {
    public $broadcast = null;

    public function __construct() {
        $config = config('broadcasting.connections.pusher');
        try {
            $this->broadcast = new Pusher($config['key'], $config['secret'], $config['app_id'], $config['options']);
        } catch (PusherException $e) {
            Log::info($e->getMessage());
        }
    }

    public function trigger($channel, $event, $data) {
        $this->broadcast->trigger($channel, $event, $data);
    }
}
In my view:
<script src="{{ asset('assets/js/pusher.js') }}"></script>
<script src="{{ asset('assets/js/app.js') }}"></script>
<script>
    <?php
    $channel = 'Channel id';
    ?>
    Echo.channel('{{ $channel }}')
        .listen('.myEvent', data => {
            // processing data
            window.location.reload();
        });
</script>
I used Laravel Echo for this.
Again this is not the best way to do it. This is something that just worked for me for that particular feature.
There may be a lot of better ways to do it. If someone knows a better approach, please let me know.
As far as I understand, you want to implement user creation and authentication over REST, and then retrieve data from the database. Correct me if I'm wrong.
And I'm guessing you already know how to communicate with an API using a token; you are just stuck on how to implement it in Laravel.
You can use Laravel Passport for the authentication part. It has really good documentation.
Also, make use of this Medium article. It will walk you through the process step by step.

Add read replicas for SilverStripe website

I've managed to get stable load-balanced front-end servers that scale horizontally quite well; however, the next bottleneck would be the DB. There was a blog post discussing scaling DBs horizontally, but with very little detail. I'm currently using PostgreSQL, so the only plugin I've found wouldn't work.
Are my only options creating my own HAProxy setup or rewriting the PostgreSQL plugin to allow connections to read replicas?
I'm using AWS for all my hosting.
Firstly - I'd love to be corrected on this!
Having only had a quick look through some of the ORM classes in a SilverStripe 3.5 site, it looks like the ORM does support multiple database connections (see DB::get_conn with an argument for the name), but it was designed with specific use cases in mind. That is to say, you may have a module that needs to write to a specific database, and this would allow it to.
What you want is native and automatic support for this within the framework, so that all reads go to your slave(s) and writes go to your master. Unfortunately, it doesn't look like this comes out of the box. You might be able to achieve it by overloading a couple of the core SQL classes using the injector.
If you were to try it, this answer outlines how you could separate select statements out from the rest and run them through a different database connector.
As a quick example of how you might go about achieving this with SQLSelect: you will notice that it is injectable, which means you can easily overload it.
File: mysite/_config/injector.yml
Injector:
  SQLSelect:
    class: ReadOnlySQLSelect
You need to register a new database connection with the DB class:
File: mysite/_config.php
$readDatabaseConfig = array(/** define your DB credentials here, as with the default $databaseConfig **/);
if (!DB::connect($readDatabaseConfig, 'default_read')) {
    user_error('Failed to connect to read replica DB!', E_USER_ERROR);
}
Now, overload the SQLSelect class and replace the parts of it that call the DB class methods. This class inherits from SQLExpression, which is the class that contains the methods you actually care about in this instance:
File: mysite/code/ReadOnlySQLSelect.php
class ReadOnlySQLSelect extends SQLSelect
{
    public function sql(&$parameters = array())
    {
        // Changed from SQLExpression: third parameter passed as connection name
        $sql = DB::build_sql($this, $parameters, 'default_read');

        if (empty($sql)) {
            return null;
        }

        if ($this->replacementsOld) {
            $sql = str_replace($this->replacementsOld, $this->replacementsNew, $sql);
        }

        return $sql;
    }

    public function execute()
    {
        $sql = $this->sql($parameters);

        // Changed from SQLExpression: skip DB::prepared_query since it doesn't allow
        // you to provide the connection name - replace it with its contents instead.
        $conn = DB::get_conn('default_read');

        return $conn->preparedQuery($sql, $parameters);
    }
}
Note: SQLSelect::unlimitedRowCount should technically be replaced where it calls DB::prepared_query, since the prepared query method calls DB::get_conn with no arguments and so will always return the default connection. You could replace the DB::prepared_query line in the same way as above:
$conn = DB::get_conn('default_read');
$result = $conn->preparedQuery($sql, $innerParameters);
If you implement the above method, also change new SQLSelect() to SQLSelect::create(), otherwise you'll end up with some queries that still hit the master server because they bypass your class by not using the injector.
There's also an instance in SQLConditionalExpression that you should replace too (::toSelect) but that is likely to affect query transformations from other child implementations of that class, and you won't be able to do much about it without either (A) PRing a fix to the framework or (B) overloading all the other SQL* classes.
At this point you should have everything you need to route select queries to your default_read connection.
Infrastructure
On the infrastructure side, you should be able to set up read replicas through the RDS console. When you do, it will provide you with a DNS endpoint for your replica node(s), which you can use in your _config.php to configure the connection to the read replica database.
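For example, the read connection registered earlier might end up looking something like this; the endpoint and credentials below are placeholders:
// mysite/_config.php - hypothetical values; use the endpoint shown in the RDS console
$readDatabaseConfig = array(
    'type'     => 'PostgreSQLDatabase',
    'server'   => 'myapp-replica.xxxxxxxxxxxx.eu-west-1.rds.amazonaws.com',
    'username' => 'ss_read',
    'password' => getenv('SS_READ_DB_PASSWORD'),
    'database' => 'SS_mysite',
);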
If this works for you, you should create a module for it and put it up on GitHub - this would definitely be useful for others in future!
You may also consider making pull requests to the framework to add additional arguments to methods like DB::prepared_query to accept a connection name.
Also worth noting is that if you're using the mysqlnd database adapter you may be able to take advantage of read/write splitting, implemented with some sort of injector overloading but all handled at a lower level than the application layer.

Laravel Log useFiles method is making Log write in multiple files

I am using the Laravel Log facade in my app, and I have several services like Mandrill, Twilio, Stripe, etc. that need to be logged to separate files. But when I use Log::useFiles() to set a separate file for one of the service wrapper classes, like this:
class Mailer
{
    public static function init()
    {
        Log::useFiles(storage_path('logs/mandrill-'.date('Y-m-d').'.log'));
    }

    public static function send()
    {
        // some code here...
        Log::error("Email not sent");
    }
}
I end up with the log being written both to the Laravel log file and to this Mandrill log file.
Is there a way to tell Log to write logs to only one file?
It's generally strange that it does this, because when I use Monolog directly, it writes to only one file, as it should. As far as I know, the Log facade uses Monolog.
First of all, keep in mind that if you change the log handlers in your Mailer class you'll change them for the whole application.
Secondly, the reason you get logs written to 2 files after your change is that useFiles() does not overwrite the default log handler but adds a new handler to the list Monolog uses. You therefore just add a second handler, and both of them handle the log message by saving it to their respective files.
Thirdly, Laravel's Log facade does not provide a way to replace the default handler - if you want to use it you need to use Monolog directly. You can access it by calling Log::getMonolog().
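As a sketch of that approach (assuming the stock Monolog that ships with Laravel), you could give the Mailer wrapper its own Monolog instance so the default handler never sees those messages:
use Monolog\Handler\StreamHandler;
use Monolog\Logger;

class Mailer
{
    /** @var Logger */
    private static $log;

    public static function init()
    {
        // A dedicated Monolog instance with a single handler: messages
        // sent here never reach Laravel's default log file.
        self::$log = new Logger('mandrill');
        self::$log->pushHandler(
            new StreamHandler(storage_path('logs/mandrill-'.date('Y-m-d').'.log'))
        );
    }

    public static function send()
    {
        // some code here...
        self::$log->error('Email not sent');
    }
}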

Dependency injection php website

The more I read about dependency injection, the more confused I get. I know what it is for; that is not the problem. Trying to do some design on paper, this is what I came up with, and somehow it seems to me I am overlooking something.
First I imagined building an actual server that would accept incoming requests and returns responses to the user.
class Server {
    private $responseBuilder;

    public function __construct($responseBuilder) {
        $this->responseBuilder = $responseBuilder;
    }

    public function run() {
        // create socket, receive request
        $response = $this->responseBuilder->build($request);
        // send response
    }
}
class Response {
    private $method;
    private $message;
    private $url;

    // getters & setters
}
class ServerBuilder {
    public function build() {
        // construction logic
        return new Server(new ResponseBuilder());
    }
}
Since Apache is used to handle server requests, we could replace the server with something that just sends the response.
$bldr = new ResponseBuilder();
$response = $bldr->build();
// send response some way
Note that ResponseBuilder has direct access to the request ($_SERVER['..']), and so it has everything it needs to choose the right response.
PHP, however, allows us to build and send responses inline. So we could have a Controller object for each page, or something else that sends the response, and have a builder for that.
$bldr = new ControllerBuilder();
$controller = $bldr->build();
$controller->run();
class ExampleController implements Controller {
    public function run() {
        header("HTTP/1.1 404 Not Found");
        echo 'sorry, page not found';
    }
}
This all makes sense to me. But let's look at the server example again.
It calls $responseBuilder->build() and gets a response back. But this would mean that the builder (or other builders, if we split it up) is also responsible for anything else that might occur, like authenticating a user, writing to the database, ... and I can't get my head around the fact that writing to a database would be part of the object graph construction.
It would be like: send me your request. Oh, you want the homepage? I will build you your response, and while I'm at it I will also do some things that have nothing to do with building it, like logging what I just did, saving some of your data in a cookie, and sending a mail to the administrator that you are the first visitor on this page ever, ...
You should decouple them. You have a few assumptions that I think are a bit strange. Let's start with them.
The main purpose of an incoming http request is to give back some html
I have built PHP backends that only return JSON instead of HTML. I had a really strong border between back end and front end. I only used the backend to give me data from the database, or to add/edit data in the database. The front end was just a PHP script that would build the pages any way I wanted.
Since it is the web there is in theory no use for setters since
everything can be injected in the constructor
You could use the constructor, but you don't have to. You can use setters. Dependency injection is actually just turning the flow around.
You are on the right track though. You want some class that is responsible for building your pages. So make it only responsible for building your pages, and take the other responsibilities out. Things like logging, authentication, etc. should live outside of it.
For instance, if you want logging, you could have your builder create your page, and your logger could then listen to all the things your builder is doing (with the observer pattern, for instance). So if your builder says "I created the home page", you can log it with your logger, which is actually listening to your builder.
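As a minimal sketch of that observer idea (all names here are hypothetical):
interface BuilderObserver {
    public function notify(string $event): void;
}

class LoggingObserver implements BuilderObserver {
    public function notify(string $event): void {
        error_log('[builder] ' . $event);
    }
}

class PageBuilder {
    /** @var BuilderObserver[] */
    private $observers = [];

    public function attach(BuilderObserver $observer): void {
        $this->observers[] = $observer;
    }

    public function buildHomePage(): string {
        $page = '...'; // build the page, nothing else
        foreach ($this->observers as $observer) {
            $observer->notify('I created the home page');
        }
        return $page;
    }
}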
Authentication, for instance, should happen even before your builder starts. You don't want your builder to go to work if you can already figure out that a user is not supposed to be on a page. You could use a database for that, and whitelist every usertype/pagerequest combination.
Then, for data handling, I would create a backend that only handles requests that are supposed to give back data or save it. The front end could then communicate with it to pull its content.
I hope this clears up a few things, but I'll be happy to answer more in-depth questions.
