how to access Symfony's data / business layer from 3rd party apps - php

I have the following architectural scenario, which doesn't depend on me and which I can't change:
One server/machine hosts several PHP applications and a Postgres database: each application uses its own schema in order to separate the applications' data logically. I'm developing a new application (the one represented by A) using Symfony, which manages and stores data that should be partially accessible by other apps, especially by an old app that was not developed with Symfony or any other framework. To keep it simple, imagine app A storing movies sent from clients via its REST API. I would like B and C to access the movies' data: B occasionally needs to fetch all the movies a certain actor acted in, and C would like to update the owner studio (or vice versa). Of course that would be A's job, since the Symfony app was born exactly for that purpose. So I thought I have two ways, represented by the arrows:
A exposes an API that B can call in some way. That way I don't have to duplicate business logic and data persistence, maybe by exposing Doctrine's entities or a controller somehow. Of course I would have to load at least Doctrine, the container, the configuration and the HttpKernel component, and when it comes to C I guess this solution would be unfeasible, because I would have two apps using and loading most of the same classes without any kind of isolation between the two. Another (extreme?) solution I thought of is not using Symfony at all to expose A's functionality to the other apps: I could write an API class that connects to the DB and does some logic without any of Symfony's support (PDO and friends; no services, no components, nothing). Luckily what I have to expose is little, so that wouldn't be that big a problem.
Finally, I would avoid calling A via the shell (i.e. `php app/console getMovie`) because it consumes tons of resources and I don't think it's that fast. My server is really small and couldn't live up to that.
B and the other apps could access A's schema directly, but that way I might have to duplicate some business logic, and it seems messy to me. Maybe app C could simply use a bundle I could write, explicitly designed to expose the A functionality that third-party apps need.
These are the two solutions I've found, but I'd appreciate hearing how you think I should design this.
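To be concrete, the framework-free option I have in mind would be roughly a class like this (table and method names are invented for the movie example):

```php
<?php
// Thin, framework-free API over A's schema (PDO only; names are hypothetical)
class MovieApi
{
    private $pdo;

    public function __construct($dsn, $user, $password)
    {
        $this->pdo = new PDO($dsn, $user, $password, [
            PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        ]);
    }

    /** Movies a given actor acted in (what B needs). */
    public function findMoviesByActor($actor)
    {
        $stmt = $this->pdo->prepare(
            'SELECT m.* FROM app_a.movie m
             JOIN app_a.movie_actor ma ON ma.movie_id = m.id
             WHERE ma.actor = :actor'
        );
        $stmt->execute(['actor' => $actor]);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }

    /** Update the owner studio (what C needs). */
    public function updateStudio($movieId, $studio)
    {
        $stmt = $this->pdo->prepare(
            'UPDATE app_a.movie SET studio = :studio WHERE id = :id'
        );
        $stmt->execute(['studio' => $studio, 'id' => $movieId]);
    }
}
```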

It is possible to share data between schemas by using views in a common place (usually the public schema).
This has several advantages:
it lets you use your schema structure the way you want.
it exposes the data you want.
it lets you manage write access the way you want (yes, views may be writable)
it lets you deal with GRANT/REVOKE easily.
it lets the applications leverage Postgres' LISTEN/NOTIFY feature to communicate.
There are downsides:
Updating a view means deploying new model files to all the applications using it. This can be a project of its own that the other applications depend on through Composer.
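For illustration, a minimal sketch of such a shared view (table, column and role names are made up; note that in recent Postgres versions a simple single-table view like this is automatically updatable):

```sql
-- In app A's schema: the authoritative table (names are hypothetical)
CREATE TABLE app_a.movie (
    id     serial PRIMARY KEY,
    title  text NOT NULL,
    studio text NOT NULL
);

-- Shared view in the public schema exposing only what B and C need
CREATE VIEW public.shared_movie AS
    SELECT id, title, studio FROM app_a.movie;

-- Fine-grained access: B may only read, C may also update the studio column
GRANT SELECT ON public.shared_movie TO app_b_role;
GRANT SELECT, UPDATE (studio) ON public.shared_movie TO app_c_role;
```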

Related

Large scale Laravel App performance when multiple models

I am building a large-scale PHP Laravel app. It is an app for developers which consists of many smaller apps/modules.
For example a Bookmarks app similar to PinBoard, a Notebook app similar to Evernote, Code Snippets app, RSS Feed Reader, and about 30 more little apps all in 1 app in which each app acts as a module.
My question is about performance. With this many models, controllers, etc. being loaded, can I still expect good performance with a large user base for this app?
I realize there are too many unknown factors, like server setup and more, but what I am really asking is: are things like models all loaded on app init, or only as needed? In most requests, many of the apps/modules will not be used.
I have seen some packages which add modules, where each module gets its own controllers, models, routes, views, and everything you would see in the app folder. However I am not sure I want to go that route, so right now I have it all under the main app as usual.
The simple answers are: yes, the app only loads the models/controllers you're using for a given request; and no, the number/size of models and controllers you have (including those in packages and modules) won't negatively affect your app's performance.
The more complex answer is, as always, that it depends on how you've written your code: models calling into other models for some piece of business logic, eager-loading several relationships' worth of data without caching, or using an absurd amount of dependency injection in every single controller... but all of that is pretty bad app design anyway and not specific to Laravel.
The biggest impact on performance, in my experience, comes from Eloquent code that tries to be too fancy or fetches data out of the database inefficiently. Check out the documentation to learn about ways to avoid that:
https://laravel.com/docs/5.3/eloquent-relationships#eager-loading
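As a quick sketch of what that documentation covers (model names are hypothetical), the classic N+1 problem and its eager-loading fix look like this:

```php
<?php
// N+1 problem: one query for the books, then one extra query per book
$books = App\Book::all();
foreach ($books as $book) {
    echo $book->author->name; // lazy-loads the author on every iteration
}

// Eager loading: two queries total, regardless of the number of books
$books = App\Book::with('author')->get();
foreach ($books as $book) {
    echo $book->author->name; // already loaded, no extra query
}
```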

Split API into micro-services

I have one question about architecture.
We have an API for a mobile backend, and now we are implementing some new features, like user messaging.
For now the API uses one database, and I want to have a separate API and database for messages, like micro-services: api.somedomain.com, messages.somedomain.com, etc.
The main API is guarded by access keys. In the micro-services' databases I need some data from the main database, like user info, profile info, etc.
Does anyone have ideas on how to implement such a mechanism?
Maybe master-slave replication, with the slave being the database that needs information from the main database?
Maybe you can just create some endpoints in your main API to get the information you want, and consume that API from your new micro-services. I think it's a clean design and makes your application more agnostic and extensible.
Perhaps map each database involved as a separate entity manager.
You can map the new services as new container services and new controllers.
Symfony and Doctrine allows you to access different entity managers very easily.
Replication is a good resource, but sometimes it's cumbersome: maintaining a replica set requires additional effort and hardware.
You should evaluate whether the database is too big or might have performance issues due to the micro-services consuming the main database.
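As a sketch, the multiple-entity-manager setup mentioned above looks roughly like this in a Symfony app's Doctrine configuration (connection names, bundles and parameters are invented):

```yaml
# app/config/config.yml — names and parameters are hypothetical
doctrine:
    dbal:
        default_connection: default
        connections:
            default:
                driver:   pdo_pgsql
                host:     '%database_host%'
                dbname:   main_api
                user:     '%database_user%'
                password: '%database_password%'
            messages:
                driver:   pdo_pgsql
                host:     '%messages_host%'
                dbname:   messages
                user:     '%database_user%'
                password: '%database_password%'
    orm:
        default_entity_manager: default
        entity_managers:
            default:
                connection: default
                mappings: { AppBundle: ~ }
            messages:
                connection: messages
                mappings: { MessagesBundle: ~ }
```

In a controller, `$this->getDoctrine()->getManager('messages')` then gives you the second entity manager.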

Symfony2 : Share Bundles between 2 projects

Our company decided to move to Symfony2 and now we are trying to re-write our application. I'm still new to Symfony2 and trying to figure out the best way to build the app.
Our app consists of 2 parts, each part on a different server:
A contains all the logic required to get/store/update records in the database
B is where users' requests go; it processes the request and sends another request to A for any database interaction
I'm planning to use Doctrine in A, so I created all the required entities, but I feel I need to share A's entities with B since it's easier to create/validate forms using entities.
Am I going in the right direction, or do I not have to share the entities?
If I share the entities and B doesn't have any database connection, will that create any problems for me?
If I'm going to take this approach (separating logic and database calls into 2 separate bundles, each on a different server), are there any consequences? Where should I build services: in A, in B, or would I also have to share the services?
Thanks
I did not have the exact problem you described, but I wanted to share/extend some entities between applications in a single project. However, I was recommended by the good folks on SO to branch the entire project.
I suppose you will be using Git, so you'll have two branches: A and B. Both will have common files (entities in your case), but will certainly contain different controllers/services as necessary.
Now, since you've mentioned two servers, not only will you have separated code, but you will also be able to set up different deployment scripts for A and B.
The downside: I believe some intermediate knowledge of Git branching will be necessary.
Is that acceptable for you?

Can Doctrine be used with a Postgres DB & MSSQL DB at the same time?

At my work we have two separate sites that are very closely related. One of them is a ASP/MSSQL site and the other is a PHP/Postgres site.
I want to create a REST API that everything from now on is built on top of. I would like it to be tied to both DBs so that it can be a single point of retrieving and setting data.
I was thinking of using a DBAL/ORM like Doctrine to keep from writing queries in two different syntaxes. Within the same system, is it possible to tie parts of Doctrine to MSSQL and other parts to Postgres?
If so, how? Any other thoughts on design are welcomed.
Within your application framework, you need to configure two separate entity managers, each of which will connect to a different database. More on entity managers at http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/configuration.html
The core architectural pattern is that your models are plain PHP objects, and the entity manager (Data Mapper) will read the mapping configuration to know how to map the models to a database.
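Outside a full framework, a rough standalone sketch of the two entity managers with plain Doctrine 2 could look like this (hosts, credentials and paths are made up):

```php
<?php
require 'vendor/autoload.php';

use Doctrine\ORM\EntityManager;
use Doctrine\ORM\Tools\Setup;

// One shared mapping configuration; entities are plain PHP objects
$config = Setup::createAnnotationMetadataConfiguration([__DIR__ . '/src/Entity']);

// One entity manager per backend; DBAL drivers hide the SQL dialect differences
$postgresEm = EntityManager::create([
    'driver'   => 'pdo_pgsql',
    'host'     => 'pg.example.com',
    'dbname'   => 'php_site',
    'user'     => 'web',
    'password' => 'secret',
], $config);

$mssqlEm = EntityManager::create([
    'driver'   => 'pdo_sqlsrv',
    'host'     => 'mssql.example.com',
    'dbname'   => 'asp_site',
    'user'     => 'web',
    'password' => 'secret',
], $config);
```

Your REST layer then decides which manager to use for each resource.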
If you're writing a REST-based API it shouldn't really matter what the DB backend is. For example, if you wrote your API in a combination of Django and Tastypie, you could simply swap out a settings.py config to work with both Postgres and MySQL... or even MongoDB if you so chose.
The point is, a REST API is a generic solution that can be consumed by a multitude of languages; you should choose a framework that gives you the same flexibility in DB backends for the implementation.

How to implement MVC across the LAN?

I don't want to reinvent the wheel, so I'm asking how others do this...
I have 2 upcoming projects, one in Delphi and one in PHP (plus maybe one in Java), so info specific to those would be welcome, but a generic answer is also acceptable.
So, I have a bunch of PCs and a database server (ODBC) and want to develop an MVC app.
I guess that the model is on the DB server and that the view is on each individual PC. Where is the controller? One on the DB server, or one copy on each PC?
When writing data, I imagine it is enough to lock the relevant DB table(s). But how do I update all of those views and tell them that there is new data, or that the data they are working on has been modified or deleted by another user?
Any code is welcome as are URLs or book recommendations ... thanks
Suggest you start with this
http://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller
It's a regular intranet app, right? Why do you want to split everything up physically? Models, views and controllers are just the respective files where you do the relevant work; they don't reside physically in separate locations. I think you have got the concept of MVC wrong: MVC means splitting and layering the code based on function, not keeping the parts physically separate.
To be clearer, in layman's terms, your models, views and controllers are just directories in your application. Your view files go into your view directory (not necessarily; this is changeable), your models, i.e. your DB-related operation files, go into the models directory, and the classes that control and drive your app go into your controller directory.
Running it on an intranet is quite simple. All you have to do is host your files on a system that has a static IP or name within your network, then from the other systems point your browser at
http://myserver/myapp or http://192.168.100.100/myapp
I don't know anything about Delphi, but what I said above stands for PHP and many other languages.
So, taking in all the above points: you are not going to update the views separately on each system; all the files live on your central server, and any changes made to them are automatically reflected when requested from the clients.
From your question I assume you are completely new to web development, or at least to MVC, so first have a look at a simple MVC framework like CodeIgniter in PHP. CI has very good docs, so you can come up to speed quickly.
Hope I have answered all your questions.
As far as I understand MVC, there is no rule about the location of the controller itself.
Please take into consideration that MVC is an architectural pattern, not a hardware or even a logical design. Even if, as in any n-tier architecture, it could make sense to have the DB on a dedicated computer (for performance and maintainability/backup reasons), controllers and views can be... everywhere...
MVC is a pattern, that is, more a way of modeling/interfacing objects and classes than a way of distributing your application into modules. In fact, you can (and IMHO should) share MVC code among clients and servers, while the objects still implement Model/View/Controller in a separated way.
A "classic" MVC implementation (e.g. the one used by RoR, or by DoR and Relax in the Delphi world, both in draft status) uses directories or files to split views and controllers. But this is just one implementation design of this pattern.
You can have a pure object MVC orientation, as we implemented for instance in our ORM framework. You can in fact have objects everywhere, to handle your model, to access the DB, to handle your business services.
In our mORMot Open Source framework, for Delphi 6-XE, you can follow this development pattern:
Data Tier is either SQLite3 and/or an internal very fast in-memory database, and/or any other external database (via OleDB or OCI), most SQL queries are created on the fly by the ORM kernel;
Logic Tier is performed by pure ORM aspect: you write Delphi classes which are mapped by the Data Tier into the database, and you can write your business logic in both Client or Server side, just by adding some events or methods to the classes; a Service-Oriented-Architecture (DataSnap-like) is also available, and can be used without any object;
Presentation Tier is either a Delphi client, an AJAX application, or any other front end able to communicate using RESTful JSON over HTTP/1.1; in this case, PHP or Java clients are perfect candidates.
I suggest you take the time to download and take a look at the mORMot documentation. It's far from perfect, but it tries to be complete. In the SAD document, you'll find some pages about the architecture we implemented, in particular MVC, n-Tier, SOA and ORM. Don't be afraid of the 1000 pages of PDF - you don't have to read it all: I just added some general diagrams which may help you modeling your solution.
In all cases, taking a breath before implementing a solution for your project is a very good decision. The time you'll spend now by looking at existing architectures will certainly save you pain in the future. Good start!
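Since PHP clients are mentioned above, here is a minimal sketch of such a client (the URL and the JSON field names are made up for illustration; any RESTful JSON endpoint would look similar):

```php
<?php
// Hypothetical RESTful JSON endpoint exposed by the Delphi server
$url = 'http://myserver:8080/root/Movie';

$response = file_get_contents($url);
if ($response === false) {
    die('Server unreachable');
}

// Decode the JSON payload into an associative array and use it
$movies = json_decode($response, true);
foreach ($movies as $movie) {
    echo $movie['Title'], PHP_EOL;
}
```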
