Here is my situation.
I want to duplicate an API on my server, each copy pointing to a different database, and then call some functions from each of these APIs. In PHP I am implementing a loop over all these API locations:
foreach ($apis as $api) {
    include($api->apiFiles);
    useAPIfunction();
}
The problem is that the APIs contain the same functions and classes, so the loop runs once, but on the second iteration the PHP fatal error "cannot redeclare function" occurs.
I don't want to change the APIs themselves (e.g. by renaming functions); everything has to be done from my script. Is there a way to "unregister" the API file after the first iteration completes, since that seems to be the only way to include the next API?
Thanks in advance for your help.
You don't need to include the same API multiple times; include it once (you can use include_once to avoid loading the same files twice). Ideally, the API should take parameters telling it which database to use. If it relies on global variables instead, set those variables before calling the API functions. Parameters are the better option, because with globals you can only use one database at a time.
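As a rough sketch of the global-variable approach (the property and variable names here are assumptions, since they depend on the actual API):

// Load the API code exactly once.
include_once 'api/apiFiles.php';

foreach ($apis as $api) {
    // Point the API at a different database before each call.
    // $apiDbConnection is a hypothetical global the API is assumed to read.
    $GLOBALS['apiDbConnection'] = mysqli_connect(
        $api->dbHost,
        $api->dbUser,
        $api->dbPass,
        $api->dbName
    );

    useAPIfunction(); // now runs against this iteration's database
}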
I have worked around it with a trick. I include the API files on the first iteration only; on each subsequent iteration I instead reconnect to the next database. So the API code is included once (using the first API's files), but the database connection changes on every new iteration. Also, I did not find a way to get the database info directly, so I retrieve it from the API's config file.
$count = 0;
foreach ($apis as $api) {
    if ($count == 0) {
        include($api->apiFiles);
    } else {
        // Read the DB credentials out of this API's config file.
        $theData = file_get_contents($api->path.'\includes\config.php', false, NULL, 10, 600);
        $theData = str_replace('<?php', '', $theData);
        $theData = explode("'", $theData);
        $apiHost   = $theData[19];
        $apiDB     = $theData[7];
        $apiDbUser = $theData[11];
        $apiDBPass = $theData[15];
        mysql_connect($apiHost, $apiDbUser, $apiDBPass);
        mysql_select_db($apiDB);
    }
    useAPIfunction();
    $count++;
}
I should say up front that I'm not familiar with PrestaShop; I'm using version 1.7.6.
I'm trying to understand how I could use the CSV file import function without going through the user interface.
I tried to look for documentation on a possible web API but found nothing.
What I'd like to accomplish is the following scenario:
I have two web applications on the same server
/my_webapp
/my_prestashop
By "my_webapp" I receive a csv file, process it and produce a new csv file.
Now, continuing in the process running in "my_webapp", I would like to bootstrap the PrestaShop environment and invoke the CSV import function, passing it the new file just created.
Searching the web I found some sample code but, when I tried to use and adapt it, I could not make it work.
For example, in the "my_webapp" folder I just created a "myimport.php" file and call it with two GET parameters.
The following is the call:
localhost/my_webapp/myimport.php?csv=prod.csv&limit=5
note: the file "prod.csv" is in
"path to admin folder"/import
Content of "myimport.php" file:
<?php
$rootPrestashop = '/var/www/html/my_prestashop';
define('_PS_ADMIN_DIR_', $rootPrestashop.'/admin_shop'); //not sure if this instruction is needed
$pathConfig = $rootPrestashop.'/config/config.inc.php';
$initConfig = $rootPrestashop.'/init.php';
require_once($pathConfig);
require_once($initConfig); // this line throws an error and then I can't test the others!
$importCtrl = new AdminImportControllerCore();
$crossSteps = array();
$limit = $_GET["limit"];
$importCtrl->productImport(false, $limit, $crossSteps, true, 0);
This is what I'm trying to do, but I failed to initialize the environment.
Maybe I'm on the wrong path and there's a better way.
Can anyone help me understand whether I can carry out this process and what the correct way would be? Thanks in advance.
if (!defined('_PS_ADMIN_DIR_')) {
    define('_PS_ADMIN_DIR_', __DIR__); // assumes this script sits inside the admin folder
}
include _PS_ADMIN_DIR_.'/../config/config.inc.php';

// config.inc.php boots the PrestaShop context; require a logged-in back-office employee
if (!Context::getContext()->employee->isLoggedBack()) {
    Tools::redirectAdmin(Context::getContext()->link->getAdminLink('AdminLogin'));
}
I have a PHP script that can take a few minutes to finish. It's a search engine that executes a bunch of regex commands and returns the results to the user.
I start by displaying a "loading page" which does an AJAX call to the big processing method in my controller (let's call it 'P'). This method then returns a partial view and I just replace my "loading page" content with that partial view. It works fine.
Now what I would like to do is give the user some information about the process (and later on, some control over it), like how many results the script has already found. To achieve that, I do another AJAX call every 5 seconds which is supposed to retrieve the current number of results and display it in a simple html element. This call uses a method 'R' in the same controller as method 'P'.
Now the problem I have is that I'm not able to retrieve the correct current number of results. I tried two things:
Session variable ('file' driver): in 'P' I first set a session variable 'v' to 0 and then update 'v' every time a new result is found. 'R' simply returns response()->json(session('v')).
Controller variable: same principle as above, but I use a variable declared at the top of my controller.
The AJAX call to 'P' works in both cases, but in both cases 'R' always returns 0. If I send back 'v' at the end of the 'P' script, it has the correct value.
So to me it looks like 'R' can't access the actual current value of 'v'; it only accesses some 'cached' version of it.
Does anyone have an idea about how I'm going to be able to achieve what I'd like to do? Is there another "cleaner" approach and/or what is wrong with mine?
Thank you, have a nice day!
Some pseudo-code to hopefully make it a bit more precise.
SearchController.php
function P() {
    $i = 0;
    session(['count' => $i]); // set session variable
    $results = sqlQuery();    // get rows from DB
    foreach ($results as $result) {
        if (regexFunction($result)) {
            $i++;
            session(['count' => $i]); // update session variable
        }
    }
    return response()->json('a bunch of stuff');
}

function R() {
    return response()->json(session('count')); // always returns 0
}
I would recommend a different approach here.
Read a bit more about flushing content here http://php.net/manual/en/ref.outcontrol.php and then use it.
Long story short: in order to display the number of rows processed with flushing, you can just loop over the result and flush from time to time (or every fixed number of rows); the need for the 5-second AJAX polling is gone. Small untested example:
$cnt = 0;
foreach ($result as $key => $val) {
    // do your processing here
    $cnt++;
    if ($cnt % 100 == 0) {
        // echo something to be flushed; you can echo some JavaScript, though it's not pretty
        echo "<script>showProcess({$cnt});</script>";
        ob_flush();
        flush();
    }
}
// now render the processed full result
And in the showProcess JavaScript function do whatever you want: a jQuery text replacement, some graphical progress indicator, and so on.
Hopefully you are not using FastCGI, because there you need to disable some important features in order to get the output actually flushed.
I believe you have hit a wall with PHP's limitations: PHP doesn't multithread well. To achieve the level of interaction you want, you are probably required to edit the session files directly. Their path can be found in the session.save_path setting shown by phpinfo(), and you can change that path with session_save_path(). This isn't recommended usage, though, so do it at your own risk.
Alternatively use a JSON TXT file stored somewhere on your computer/server, identifying them in a similar manner to the session files.
You should store the current progress of the query to a file, and also whether the transaction has been interrupted by the user. A check should be performed on the status of the interrupt bit/boolean before continuing to iterate over the result set.
The issue arises when you consider concurrency: what if the boolean is edited just slightly before, or at the same time as, the count? Perhaps you just keep updating the file with interrupts until the other script gets the message. This, however, is not an elegant solution.
Nor does this solution allow for concurrent queries being run by the same user. To counter this, an additional check should be performed on the session file to determine whether something is already running, and an error should be flagged to notify the user.
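A minimal sketch of the progress-file idea (the file name and $searchId are assumptions; $searchId would identify one particular search):

// In the long-running script: update the progress file as results are found.
$progressFile = sys_get_temp_dir()."/search_progress_{$searchId}.json";
file_put_contents($progressFile, json_encode([
    'count'       => $count,
    'interrupted' => false,
]), LOCK_EX);

// In the polling endpoint: read the current progress back.
$progress = json_decode(file_get_contents($progressFile), true);
echo json_encode($progress['count']);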
Given the option, I would personally rewrite the code in either JSP or ASP.NET.
All in all this is a lot of work for an unreliable feature.
I am trying to rewrite my code to support multithreading. It is simple code, but I can't figure out how to do it. Basically, what it does is:
request the first webpage with curl --> to get a unique id
use the unique id to request another page --> to get a session
use the session to request another page --> sleep(), then do it again
That is what a single thread does, but I want to run a lot of these threads at the same time.
What I did is create 3 separate files:
The first one creates 10 sessions and saves them in a txt file along with other parameters (session1|unique_id1|parameter1|anotherparameter1).
The second file contains this code:
$sessions = file('sessions.txt');
$WshShell = new COM("WScript.Shell");
foreach ($sessions as $kk => $session) {
    if (!empty($session)) {
        $oExec = $WshShell->Run("php requests.php $kk", 0, false);
    }
}
It opens the txt file, and for each line it launches the requests file with the line number in argv.
In the third file, it takes the line number, opens the sessions file, retrieves the parameters for that session and sends requests with that session.
So this is how I did my "multithreading", but I feel like I wrote PHP code with rocks.
Now I want to rewrite it without having to open 10 separate PHP processes.
There really isn't a native way to do threading in PHP. The approach you took works, but I would approach it differently. It's possible to fork processes in PHP; I've done this and it works well.
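A minimal sketch of the forking approach, assuming the pcntl extension is available (CLI only); doRequestsForSession() is a hypothetical function standing in for your existing curl logic:

// sessions.txt is the file from the question: one session per line.
$sessions = file('sessions.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$children = [];

foreach ($sessions as $kk => $session) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die('could not fork');
    } elseif ($pid == 0) {
        // Child: do the curl work for this one session, then exit.
        doRequestsForSession($session); // hypothetical: your request/sleep loop
        exit(0);
    }
    // Parent: remember the child PID and keep looping.
    $children[] = $pid;
}

// Parent: wait for all children to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}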
One approach is to use some messaging system like RabbitMQ and distribute the work that way. Basically an Actor or Pub-sub model.
Another approach that might work well for you would be "pthreads". http://php.net/manual/en/book.pthreads.php
I've not tried this method myself so I cannot give you details as to how well it does or doesn't work.
Hope this helps!
I'm currently coding one of my first PHP applications.
The application has to connect to an LDAP server and change some user attributes in the directory.
The application has to read some parameters from a MySQL database in order to run.
Some examples of these parameters could be:
-LDAP Address
-LDAP Service Account
-LDAP Password
There are many more parameters, which govern, for example, the way users authenticate to my application.
Currently, the database is read at each user session initialization, but that doesn't make sense, because the parameters do not vary from one session to another.
So I'm looking for a way to load these parameters from the database only once (for example, at PHP service initialization) and access them in the "normal" PHP code through variables.
What would be the best way to do this?
Thank you in advance.
You are looking for a persistent cross-request storage. There are many options for this.
The simplest is APCu (which can be used in conjunction with Zend OpCache, or for PHP < 5.5, APC).
Simply:
if (apc_exists('mykey')) {
    $data = apc_fetch('mykey');
} else {
    // create it from scratch
    apc_store('mykey', $data);
}
$data can be most any PHP type, arrays, objects, or scalars.
You can even put this code in the auto_prepend_file INI setting so it is run automatically on every request.
However: this is per server (and per SAPI, so mod_php/php-fpm/cli don't share the cache) so you will have to create it once per server.
Alternatively, for a multi-server setup you can use something like memcached or redis. These are stand-alone daemons that will let you store arbitrary key/value pairs of string data (so you may need to serialize()/unserialize() on the values).
I personally prefer memcache, which has two extensions for PHP, pecl/memcached and pecl/memcache (I prefer pecl/memcached, it has more features).
Both of them are pretty simple.
pecl/memcached:
$memcache = new Memcached();
$memcache->addServer('localhost', '11211');
$data = $memcache->get('mykey');
if (empty($data)) {
// Create data
$memcache->set('mykey', $data);
}
pecl/memcache:
$memcache = new Memcache();
$memcache->connect('localhost', 11211); // the default memcache host/port
$data = $memcache->get('mykey');
if (empty($data)) {
// Create data
$memcache->set('mykey', $data);
}
Both extensions support storage of arrays and objects without serialization.
You can of course store multiple keys with any of these solutions and just pull them all, instead of using a single key, or a single key holding an array/object; a sketch follows.
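For example, all of the LDAP parameters from the question could be cached as one array (a sketch using the same apc_* calls as above; the key name and the $row source are assumptions):

$config = apc_fetch('app_config');
if ($config === false) {
    // Cache miss: read the parameters from MySQL once, then store them.
    $config = [
        'ldap_address'  => $row['ldap_address'],   // $row comes from your DB query
        'ldap_account'  => $row['ldap_account'],
        'ldap_password' => $row['ldap_password'],
    ];
    apc_store('app_config', $config, 3600); // cache for an hour
}
echo $config['ldap_address'];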
You can use Memcache to cache database requests. See here how to use it.
Another way is using PHP sessions.
<?php
session_start(); // need to be before any html code
$_SESSION['something'] = 'Something here...';
echo $_SESSION['something']; // will show "Something here..."
And you can remove it using...
unset($_SESSION['something']);
You can also use cookies, using the setcookie function. See here.
And you can get cookies using...
echo $_COOKIE['something'];
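For example, setting a cookie and reading it back on a later request might look like this (a sketch; the cookie name and lifetime are arbitrary):

setcookie('something', 'Something here...', time() + 3600); // must be sent before any output; valid for one hour
// ...on a later request:
echo $_COOKIE['something'];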
Production mode
In production, this works like WordPress's set_transient: you do the first DB request to get the value and then cache that value using cookies, sessions or Memcache.
If you want to show these values inside your page, you can use a standard caching library.
My understanding of the question is that you have some SQL data that is more or less constant and you don't want to have to read that in from the SQL connection on every request.
If that is the case you can use memcache to store the data:
http://php.net/manual/en/book.memcache.php
The data will still be persistent and you will only need to go to the database if the cached data isn't there or needs to be refreshed.
If the data is specific to a particular user you can just use a session.
http://php.net/manual/en/book.session.php
http://php.net/manual/en/session.examples.basic.php
If this is only to be used when starting up your server (so once and done) and you don't want to bother with memcached/XCache (as they would be overkill), you can still use environment variables. See getenv.
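A small sketch, assuming the values have been exported into the environment (for example in the web server or FPM pool configuration) under hypothetical names:

// Hypothetical variable names; they must be set in the server/FPM environment.
$ldapAddress  = getenv('LDAP_ADDRESS');
$ldapAccount  = getenv('LDAP_SERVICE_ACCOUNT');
$ldapPassword = getenv('LDAP_PASSWORD');

if ($ldapAddress === false) {
    // Variable not set: fall back to reading the configuration from the database once.
}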
I'm trying to make an interface for updating all instances of a WordPress plugin on a server that hosts 20+ WordPress sites. I've got everything working except for the fact that I have a loop with:
require_once($path.'/wp-load.php');
require_once($path.'/wp-admin/includes/admin.php');
require_once($path.'/wp-admin/includes/class-wp-upgrader.php');
where $path is equal to a website directory ($path changes with every iteration of my loop).
The reason I need to require the files this way is that wp-load.php includes (among other things) a file called wp-config.php, which defines things like the SQL database, which differs between websites.
TL;DR, and stating my actual question:
Is there any way for me to do something like the following code?
require_once("dir1/a.php"); // define("VAR","dir1");
echo VAR; // displays "dir1"
unrequire_once("dir1/a.php");
require_once("dir2/a.php"); // define("VAR","dir2");
echo VAR; // displays "dir2"
Since you've mentioned this is an in-house tool, you could consider using runkit in order to remove the constant at runtime, allowing it to be redefined later: http://www.php.net/manual/en/function.runkit-constant-remove.php
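A rough sketch of what that could look like, assuming the runkit extension is installed (untested):

require_once("dir1/a.php"); // define("VAR","dir1");
echo VAR;                   // displays "dir1"

runkit_constant_remove('VAR'); // drop the constant so the next file can define it again

require_once("dir2/a.php"); // define("VAR","dir2");
echo VAR;                   // displays "dir2"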
Afraid not - even if you could, you wouldn't get the behaviour you're looking for because constants (define) can only be created once, and are like that for the rest of execution. They cannot be removed, or changed thereafter.
Unfortunately, you're going to have to re-architect your script (e.g. to use variables instead of constants).
Yes: fork the process. You can do something like:
foreach ($pathlist as $path) {
    $pid = pcntl_fork();
    if (!$pid) {
        // Child process: load this site's WordPress environment.
        require_once($path.'/wp-load.php');
        require_once($path.'/wp-admin/includes/admin.php');
        break; // leave the loop so the child only handles its own $path
    }
}
This will create a copy of your script for each $path in your list and gets around the require_once problem.
You need to "break" the loop if it's a child, as you only want the child process to run for its individual $path setting and not continue the loop itself!
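One possible extension of that sketch (my assumption, not the original answer's exact code): have each child do its work and exit inside the loop, while the parent waits for all children so the script doesn't return before the updates finish:

$children = [];
foreach ($pathlist as $path) {
    $pid = pcntl_fork();
    if (!$pid) {
        // Child: load this site's WordPress and run the plugin update.
        require_once($path.'/wp-load.php');
        require_once($path.'/wp-admin/includes/admin.php');
        require_once($path.'/wp-admin/includes/class-wp-upgrader.php');
        // ... run the upgrader here ...
        exit(0); // never let the child fall back into the loop
    }
    $children[] = $pid; // parent keeps looping over the remaining paths
}

// Parent: wait for every child to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}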