Magento Start Dataflow Profile with new CSV from Command Line - php

I am running Magento 1.6.2.
I built a Dataflow profile to import my customers/logins, and it works well. I need to be able to run this every day, and I have seen some code out there to trigger Dataflow profiles from the command line, but I can't figure out how to run the profile with a new CSV each time. Heck, I can't even find where it put the CSV that I uploaded! Any ideas where it might be? And does anybody know the best way to run a profile from the CLI?
Thanks

Well, I'll answer my own question:
The directory where it puts interactively uploaded files is /tmp/magento/var/import/
It turns out you don't need that information, because you don't want the interactive upload; you want the hard-coded path set in the profile's "remote/local server" setting.
I got the command-line import working by using Ho_ShellImport, found here: https://github.com/ho-nl/Ho_ShellImport. It's free and easy (no affiliation).
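For anyone who prefers to skip the extension, the snippet commonly circulated for running a Dataflow profile from the CLI looks roughly like this (a sketch, not tested here; the profile ID is a placeholder for the one shown under System > Import/Export > Dataflow - Profiles). Drop each day's CSV at the path configured in that "remote/local server" setting before the run:

<?php
// run_profile.php - place in the Magento root, run with: php run_profile.php
require_once 'app/Mage.php';

Mage::app('admin');                        // bootstrap Magento

$profileId = 8;                            // hypothetical profile ID
$profile = Mage::getModel('dataflow/profile')->load($profileId);
if (!$profile->getId()) {
    die("Profile not found\n");
}

// register the profile the same way the admin UI does before running it
Mage::register('current_convert_profile', $profile);
$profile->run();                           // reads the CSV from the profile's Local Server path

echo "Done\n";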

Related

Moodle reports from Database

I have a Moodle database which I exported a few months ago, before our server went down. Now I want to generate reports from my old database. I tried to import it into a new Moodle site, but the moodledata folder is missing. So now I'm looking for another way to generate reports from the database. I have tried writing MySQL queries, but I think that would take a lot of time for now. I need help: is there any tool or API around that I can use to generate reports from my database? I tried Seal Report to tackle this, but found that a lot of manual work is needed. I don't mean that the tool can't do it; I'm just looking for another tool which could simplify my task.
NB: I know some will say this is not a programming question. Please feel free to suggest the best way to query using any language.
You should be able to set up a local copy of a Moodle site with a copy of the database and with a blank Moodle data folder (I've done this regularly in order to investigate issues on a customer's site).
Once you've done that, you will have access to any reporting tools you would normally have inside Moodle.
You may find it easiest to set up a fresh install of Moodle pointed at a blank database and then, once the install is finished, edit the config.php file to point at the restored copy of the original site. You may have to purge caches (php admin/cli/purge_caches.php) and you may have to reset the admin password (php admin/cli/reset_password.php). It is also wise to turn off email (edit config.php and add $CFG->noemailever = true;).
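To make that concrete, here is a minimal sketch of what the edited config.php of the local copy might look like (database name, credentials, and paths are placeholders, not values from the original site):

<?php  // config.php of the fresh local install
unset($CFG);
global $CFG;
$CFG = new stdClass();

$CFG->dbtype    = 'mysqli';
$CFG->dblibrary = 'native';
$CFG->dbhost    = 'localhost';
$CFG->dbname    = 'moodle_restored';   // hypothetical: the restored copy of the old DB
$CFG->dbuser    = 'moodleuser';
$CFG->dbpass    = 'secret';
$CFG->prefix    = 'mdl_';              // must match the table prefix in the dump

$CFG->wwwroot     = 'http://localhost/moodle';
$CFG->dataroot    = '/var/moodledata'; // the blank data folder
$CFG->noemailever = true;              // never email real users from this copy

require_once dirname(__FILE__) . '/lib/setup.php';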

Install MediaWiki locally with a large DB: "LocalSettings.php" couldn't be generated

I'm trying to install MediaWiki (1.29.1 or 1.27.3) locally with a large Wiktionary dump (3GB).
After converting the XML dump into an SQL file and importing it into the DB I created with this script, I followed the MediaWiki installation instructions in the browser to generate my specific LocalSettings.php. I get the message:
"There are MediaWiki tables in this database. To upgrade them to MediaWiki 1.29.1, click Continue."
After clicking the Continue button, the browser stays in a loading state forever.
My understanding is that the DB containing the Wiktionary dump has some tables that are not compatible with the version of MediaWiki I'm using, so an update of the DB is required.
I tried running install.php from the command line to avoid a browser timeout. The command didn't return anything (even after waiting more than 2 hours).
I tried a workaround as well:
1) Create my DB with empty tables
2) Generate LocalSettings.php from the browser (that was fast, since the DB is small)
3) Import the wiki SQL dump into my DB
4) Refresh the index.php page
I then got a blank page with this message:
"Exception caught inside exception handler. Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information."
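For reference, enabling that debug output just means appending these two lines at the bottom of LocalSettings.php:

$wgShowExceptionDetails = true;
$wgShowDBErrorBacktrace = true;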
All the examples and tutorials I found online about this assume a small or newly created DB.
Any idea what's wrong? Has anyone actually tried taking an existing MediaWiki dump and running it locally? Why is there no such advanced example?
You wrote "I'm trying to install Wikimedia (1.29.1 or 1.27.3)". I suppose you are talking about MediaWiki, not Wikimedia. Am I right?
1) You can try the parsed version of Wiktionary. It is a little bit old (2014): http://whinger.krc.karelia.ru/soft/wikokit/index.html
2) You can try my tutorial about downloading the Wiktionary dump, uploading it to MySQL, and converting and parsing it into something more useful to work with: Getting started Wiktionary parser.
See: MySQL import
The issue at the first level originates from mwdumper, which seems to be outdated. An SQL DB I generated using mwdumper is missing some tables that should have been created by running update.php. It was not possible for me to run any PHP file, either from the shell or from the browser, and I suspect the size of the dump is the cause.
The workaround which, by some magic, helped to overcome this issue was:
1) Run update.php from the shell with missing DB credentials. This somehow enables logging and makes it possible to execute index.php through the browser.
2) Manually add the missing table columns reported in the error messages (the column types must be respected).
3) Place a LocalSettings.php file, easily generated from a Wiktionary DB with empty tables, in the right directory of the MediaWiki installation.
4) Run index.php from the browser.
Et voilà! The huge Wiktionary MySQL dump is now queryable through the MediaWiki interface. I'm not sure such a trick can be called a solution, but it solved the problem in my case. An explanation of what could have happened in the background would definitely be helpful.

Retrieving images in the uploaded pdf document in php

I am trying to display the images in a PDF document that I uploaded to the server as hyperlinks in PHP, so that when the user clicks one of them they get the corresponding document.
Please help me. Thanks in advance!
Use pdfimages, which comes with the open-source xpdf software package (for *nix operating systems). You'll have to call it through exec or the like, then work with the output from PHP. I am not aware of any PHP library that provides this functionality, so you're going to have to experiment.
EDIT
You mentioned that you aren't experienced with PHP... I thought I'd add that this isn't a quick-and-easy type of task, and you certainly aren't going to find a bunch of tutorials around the internet for this.
To get started, you'll have to install the xpdf package on your server. There's a lot of different ways to do this depending on which OS you've got.
After that is set up, you'll be using the command line to execute a program on your server, and you'll want to capture the output of that command in PHP and work with it further. So initially, work out exactly what your command line will look like, as well as what the output looks like and means. Do this from the command line; don't worry about the PHP part yet. In this case, your output is going to be a list of the image files extracted from a given PDF, and your command-line call will look something like "pdfimages mypdf.pdf". Play around, find out what happens.
After you work out exactly what command line you need to send and what the command does, you can focus on the PHP angle. In a nutshell, you want PHP to execute the exact command that you've already worked out. Look at the manual for exec for information on how to call a command line and get the output back. Write your script to make the correct call and show the call's output.
Next, move on to doing something with that output. I presume you'll want to somehow store the extracted images in a web-accessible place, put them in the database, show them to the user, etc. That is the very last stage, after you've worked out the initial steps; a rough sketch of the whole flow follows below.
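To tie the steps together, here is a minimal, untested sketch of that flow (the paths, the -j flag, and the output file naming are assumptions to verify against your pdfimages version):

<?php
$pdf    = '/var/www/uploads/mypdf.pdf';   // hypothetical uploaded document
$outDir = '/var/www/html/pdf-images';     // web-accessible and writable by PHP
$prefix = $outDir . '/img';

// -j writes JPEG files where the PDF allows it, instead of PPM/PBM
exec('pdfimages -j ' . escapeshellarg($pdf) . ' ' . escapeshellarg($prefix), $out, $status);
if ($status !== 0) {
    die('pdfimages failed with exit code ' . $status);
}

// pdfimages names its output img-000.jpg, img-001.ppm, ...
foreach (glob($prefix . '-*') as $file) {
    $url = '/pdf-images/' . basename($file);   // map the filesystem path to a URL
    echo '<a href="' . htmlspecialchars($url) . '">' . basename($file) . "</a><br>\n";
}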
Good luck!

xdebug, having problems with profiler output

Right, since watching Rasmus Lerdorf's talk on PHP performance, I've been wanting to profile the ERP/accounting application I am working on, not least because I know there are performance issues with it; profiling should highlight the major problems for me to investigate.
So I downloaded Xdebug and put the following few lines in my php.ini file:
zend_extension="/usr/lib/php5/20090626+lfs/xdebug.so"
xdebug.profiler_output_dir="/home/me/xdebug/profiles/"
xdebug.profiler_enable_trigger=On
With this, I simply aim my browser at my app with &XDEBUG_PROFILE in the query string and the profiling begins. The problem is that the output I am viewing with KCachegrind doesn't include any of the functions from within my application, and shows no flow between entities.
While the page was executing, I copied (in the terminal) the profile file to a separate file several times, to capture its state throughout the run. I loaded each of these copies separately into KCachegrind and they all show the full profile of the application, all but the last one.
Can anyone tell me why the full profile isn't being output? Looking at the file sizes of my copied files, it appears the first few are rather large, but the last one is significantly smaller. Is Xdebug messing with the file after the profile has been captured?
Many thanks :-)
EDIT
Just to help, this is what I see when I open up one of the copied profiles (before the profile has completed); I'm sure there is much more to this.
And this is what I get from the final profile: no relationships, just a bunch of PHP functions. I want to see the full profile.
EDIT 2
So here I am constantly running the ls -als command; the last listing is the cut-down version, and the one before it is the last ls where the file was at its full size.
I cannot upload the large file as it's over 3 million lines long; if it helps, here is the xdebug section of my phpinfo output.
Right, I've actually solved the problem myself. I added this option to my php.ini file:
xdebug.profiler_append=1
This will append the data to the same filename if it exists, so I'll need to make sure the filename option is set correctly, but I think that has solved my problem for now.
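For completeness, the relevant php.ini block then looks something like this; the last line is an assumption worth checking, since Xdebug's default output name of cachegrind.out.%p produces a new file per process, which works against appending:

zend_extension="/usr/lib/php5/20090626+lfs/xdebug.so"
xdebug.profiler_output_dir="/home/me/xdebug/profiles/"
xdebug.profiler_enable_trigger=On
xdebug.profiler_append=1
; use a stable name per script so appended chunks land in one file
xdebug.profiler_output_name="cachegrind.out.%s"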
Thanks to those that answered :-)

OpenCart mp3 preview

What is the best method and player for giving an audio preview on an OpenCart store? This would involve uploading the full track and then extracting a portion to be played.
mp3splt is by far your best bet.
It can sometimes be a little dicey to install (particularly on CentOS and other RH-based distros), but it's really the only solution I've found.
I usually run a script that analyzes the MP3 with getID3 to get the length, then I calculate the halfway point of the MP3 and pass that, plus thirty seconds, to mp3splt via exec (see the sketch below).
It works great when you can get it to install properly. If you're on Debian/Ubuntu, it's actually a cinch to install via aptitude.
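For illustration, a rough, untested sketch of that script (the getID3 include path, file locations, and exact mp3splt flags are assumptions to check against your install):

<?php
require_once 'getid3/getid3.php';            // hypothetical path to the getID3 library

$source     = '/var/www/uploads/track.mp3';  // the full uploaded track
$previewDir = '/var/www/previews';           // where preview clips are stored

$getID3 = new getID3();
$info   = $getID3->analyze($source);
$length = (int) $info['playtime_seconds'];   // track length in seconds

$start = (int) floor($length / 2);           // halfway point
$end   = $start + 30;                        // thirty-second preview

// mp3splt takes start/end as minutes.seconds; -d sets the output
// directory, -o the output filename pattern (@f = original filename)
$cmd = sprintf(
    'mp3splt -d %s -o preview_@f %s %d.%02d %d.%02d',
    escapeshellarg($previewDir),
    escapeshellarg($source),
    floor($start / 60), $start % 60,
    floor($end / 60),   $end % 60
);
exec($cmd, $out, $status);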
The only other thing I could think to do would be to wrap your command-line Unix audio-editing utilities in a PHP script to basically create a "grab the 2-minute head of an MP3" function, then run that on files when they are uploaded. Then, yes, save them in a "previews" area of the file system and store the filename in a DB table for later reference.
I've found a PHP script that could fit your needs (please note I didn't test it). You can find it here. The class interface seems simple and functional. Anyway, you will need to modify your OpenCart product template to expose the preview command.
