Import multiple files automatically into MySQL - PHP

I am trying to set up the following interface between websites:
1) Other websites will upload (XML) files via FTP into a specific folder on my web server
2) My website will scan the folder for new files and import them immediately after they have been uploaded
2a) Ideally the scan is triggered as soon as a new file is uploaded via FTP, but I do not know how to do this. Is this possible?
2b) Alternatively, I could use cron to scan the folder every second, but I think this would use a lot of resources.
3) After an (XML) file has been detected, it will be imported into the MySQL database automatically.
I just can't find any help on triggering a scan for new files. I am using PHP, phpMyAdmin and Drupal. Are these tools sufficient, or do I need something else?

Since you are on Drupal, you could instead consume the other sites' feeds (e.g. with the Feeds module); the import and update would then be done on every cron run!
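
If you do stick with the folder-polling approach from (2b), note that cron itself cannot run more often than once a minute, and a scan every minute or two is usually cheap. A minimal sketch of the scan-and-import step, assuming an incoming/ upload folder, a processed/ archive folder, and a hypothetical documents table:

<?php
// Sketch only: scan an FTP upload folder for new XML files and import them.
// Paths, table name and column layout are assumptions for illustration.
$incoming  = '/var/www/uploads/incoming';
$processed = '/var/www/uploads/processed';

$pdo  = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO documents (filename, payload) VALUES (?, ?)');

foreach (glob($incoming . '/*.xml') as $file) {
    // Skip files that are still being written: the size must stay stable.
    $size = filesize($file);
    sleep(1);
    clearstatcache(true, $file);
    if ($size !== filesize($file)) {
        continue; // upload still in progress, catch it on the next run
    }

    $xml = simplexml_load_file($file);
    if ($xml === false) {
        continue; // malformed XML, leave it for manual inspection
    }

    $stmt->execute(array(basename($file), $xml->asXML()));

    // Move the file out of the scan folder so it is never imported twice.
    rename($file, $processed . '/' . basename($file));
}

As for (2a), on Linux the inotify PECL extension can watch the folder and react as soon as a file appears, which avoids polling entirely.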

Related

PHP Distributed File System, File System for downloading from Network Storage

I have network storage which is available to me. My job is to create a script which will download the updated files from this storage daily (the files to download keep the same names; only the content differs).
And here is my question to you, because I'm not very experienced in writing scripts (only PHP and a little bit of sh).
I tried using Samba and the class which is available here, but that class is meant to create a UI for downloading files manually, which is not what I am looking for.
Can you please tell me any other way I can download files from network storage?
Network file system, using PHP and MongoDB
http://verens.com/2014/11/06/distributed-file-storage-using-php-and-mongodb/
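If the storage is exposed as an SMB share, one non-interactive option is to drive the smbclient command-line tool from PHP (or straight from cron). A rough sketch, where the share path, credentials, file list and local target folder are placeholders:

<?php
// Sketch: fetch a fixed list of files from an SMB share using smbclient.
// Share, credentials, filenames and the local target folder are placeholders.
$share = '//fileserver/export';
$auth  = 'username%password';
$files = array('daily-report.csv', 'updates.xml');

foreach ($files as $name) {
    $cmd = sprintf(
        'smbclient %s -U %s -c %s',
        escapeshellarg($share),
        escapeshellarg($auth),
        escapeshellarg("get $name /var/data/$name")
    );
    exec($cmd, $output, $status);
    if ($status !== 0) {
        error_log("Download of $name failed: " . implode("\n", $output));
    }
}

Run it once a day from cron; since the filenames never change, each run simply overwrites yesterday's copies.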

PHP: select the latest file added to an Amazon S3 folder

I'm working on an auto-update solution, and I'm using Amazon S3 for distribution.
I would like this to work as follows:
I upload a file to an S3 folder
An automatic PHP script detects that a new file has been added and notifies clients
To do this, I somehow need to list all files in an Amazon bucket's folder, and find the one which has been added last.
I've tried $s3->list_objects("mybucket");, but it returns the list of all objects inside the bucket, and I don't see an option to list only files inside the specified folder.
What is the best way to do this using the Amazon S3 PHP API?
To do this, I somehow need to list all files in an Amazon bucket's folder, and find the one which has been added last.
S3's API isn't really optimized for sort-by-modified-date, so you'd need to call list_objects() and check each object's timestamp, always keeping track of the newest one until you get to the end of the list.
An automatic PHP script detects that a new file has been added and notifies clients
You'd need to write a long-running PHP CLI script that starts with:
while (true) { /*...*/ }
Maybe throw an occasional sleep(1) in there so that your CPU doesn't spike so badly, but you essentially need to sleep-and-poll, looping over all of the timestamps each time.
I've tried $s3->list_objects("mybucket");, but it returns the list of all objects inside the bucket, and I don't see an option to list only files inside the specified folder.
You'll want to set the prefix parameter in your list_objects() call.
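Putting those two pieces together, here is a rough sketch of the sleep-and-poll loop, assuming the legacy AWS SDK for PHP 1.x that your $s3->list_objects() call suggests; the bucket name, prefix and notify_clients() hook are placeholders:

<?php
// Sketch: poll one "folder" (key prefix) and react to the newest object.
// Assumes the old AWS SDK for PHP 1.x; credentials come from its config file.
require_once 'sdk.class.php';

$s3       = new AmazonS3();
$lastSeen = 0;

while (true) {
    $response = $s3->list_objects('mybucket', array('prefix' => 'updates/'));

    foreach ($response->body->Contents as $object) {
        $modified = strtotime((string) $object->LastModified);
        if ($modified > $lastSeen) {
            $lastSeen = $modified;
            notify_clients((string) $object->Key); // placeholder notification hook
        }
    }

    sleep(5); // poll interval: trade latency against request cost
}

Note that list_objects() returns at most 1000 keys per call, so a folder with more objects than that needs to be paged through with the marker parameter.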
S3 has launched versioning functionality for the files in a bucket: http://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html
You could get the latest n files by calling s3client.listVersions(request) and specifying n if you want. See http://docs.aws.amazon.com/AmazonS3/latest/dev/list-obj-version-enabled-bucket.html
That example is in Java; I'm not sure whether versioning support has been added to the PHP API.

How to import a MaxMind database using PHP

I know there are similar questions to this, but I never found a solution that matches my case.
My problem is that I have a large file, about 130 MB, with a .txt extension.
I want to import this file into a MySQL database.
The problem is that the import times out when I use phpMyAdmin.
Is there a good way to import this file using PHP?
Or is there any other way besides those?
Access your server via the console (SSH, Telnet, etc.) and import the file using the native CLI client's LOAD DATA syntax:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Edit: updating answer based on comments.
Since you can't access MySQL via a CLI, I would suggest uploading the text file via FTP, then writing a quick PHP script to import the file via a simple DB connection plus an insert statement.
Also use set_time_limit(0) to ensure the script doesn't time out while executing the query.
You'll also need to make sure you have enough RAM available to load the file.
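A minimal sketch of that script, assuming the file is tab-delimited and a matching geoip table already exists; LOAD DATA LOCAL INFILE lets the MySQL server do the heavy lifting, which is far faster than looping over 130 MB of single-row INSERTs:

<?php
// Sketch: import a large delimited text file that was uploaded via FTP.
// File path, table name and delimiters are assumptions.
set_time_limit(0); // the import can take a while; don't let PHP kill it

$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'pass',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true) // needed for LOCAL INFILE
);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/var/ftp/uploads/maxmind.txt'
    INTO TABLE geoip
    FIELDS TERMINATED BY '\\t'
    LINES TERMINATED BY '\\n'
");

If LOCAL INFILE is disabled on the server, the fallback is to read the file in chunks with fgets() and batch the rows into multi-row INSERT statements.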

Best practices for automating creation of XML files for front-end access?

I am working on a custom PHP/MySQL CMS. The data managed in the CMS is exported to XML files via a PHP script that must be run manually. A Flash/AS3-based front-end loads the XML files and displays the specified data.
Is it advisable to setup some sort of automated process for creating the XML files?
What are some "best-practices" or related advice?
Thanks!
If you have OS access, then of course you can create these files via cron on Unix/Linux, or the Task Scheduler on Windows. If the process only needs to regenerate files when things have changed, your data-change process can additionally put a "refresh" request into a queue (either a flat file or a database table) that is then picked up by the script that generates the XML files, so it knows which ones to rerun. If all of the files should just be generated on a periodic basis, then code the script to process them all and set it to run from whatever scheduler you are using.
If you don't have OS access, you can code it in a PHP page that requires credentials, and set up a scheduled script on your desktop that simply calls out to that page, with the proper credentials, whenever it should refresh.
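A rough sketch of the queue-pickup half of that, assuming a hypothetical xml_refresh_queue table that the CMS writes a row into whenever content changes, and an existing export routine:

<?php
// Sketch: cron-invoked worker that regenerates only the XML files
// flagged as stale. Table, columns and regenerate_xml() are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=cms;charset=utf8mb4', 'user', 'pass');

$rows   = $pdo->query('SELECT id, export_name FROM xml_refresh_queue')->fetchAll();
$delete = $pdo->prepare('DELETE FROM xml_refresh_queue WHERE id = ?');

foreach ($rows as $row) {
    regenerate_xml($row['export_name']); // placeholder for the existing export script
    $delete->execute(array($row['id'])); // dequeue only after a successful run
}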
I suggest:
Point the Flash script at /location/file.php instead of a static XML file
That file.php can generate the XML and output it to Flash
You can add a caching mechanism to that file.php
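
A minimal sketch of such a file.php, with a simple file-based cache so the XML is only rebuilt when the cached copy is older than the TTL; the cache path and build_xml_from_db() are placeholders:

<?php
// Sketch: serve XML to the Flash front-end, regenerating it on demand
// at most once per $ttl seconds. build_xml_from_db() stands in for the
// existing export logic that queries MySQL and assembles the XML string.
$cache = __DIR__ . '/cache/export.xml';
$ttl   = 300; // seconds

if (!is_file($cache) || filemtime($cache) < time() - $ttl) {
    file_put_contents($cache, build_xml_from_db(), LOCK_EX);
}

header('Content-Type: text/xml; charset=utf-8');
readfile($cache);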

File upload and storage handling in a web application

I am currently using PHP and an AJAX file upload to develop a web application.
The application involves receiving files uploaded by the user, e.g. an email client or a photo gallery. This is the scenario where I got stuck:
When a user uploads some files but closes the browser without submitting, I want to delete those files and only move the relevant ones.
I have tried leaving the files in the /tmp/ folder, where they are given a temp name by PHP, but I have to move each file immediately after the upload, otherwise it cannot be found at a later stage by referencing the temp filename.
The reason I would leave them in /tmp/ is that I want to set up a cron job to delete the files in that folder to free up server space.
Am I doing the right thing? Or is there a standard industry approach used by Hotmail, Google, etc.?
You will need another temporary folder which you can manage yourself.
You upload into this folder, which you created yourself and called temp: when the upload is complete, move the temporary file from PHP's tmp folder into your temp folder.
Then, when the submission is done, move the file away into its respective folder.
Have a cron job running in the background that removes old files from that folder.
Remember to give PHP, Apache and the cron job permission to access the folder.
Don't rely on industry standards; besides, Microsoft and Google don't use PHP (well, maybe Google, but definitely not Microsoft).
Why not just move it from the tmp/ folder to your own temporary staging folder immediately, and then keep a reference to it in the DB, and have a cron job that periodically scans the DB for 'staging' files with a timestamp more than X hours in the past and removes them?
I don't know about the big boys, but I guess you can create a database table that holds the temporary file names. The pro of this approach is that you can delete the entry from the temporary-file table even if the browser is closed midway, and additionally set up a cron job to delete the files listed in that table.
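Putting those suggestions together, a sketch of the upload-handler side, assuming a staging/ folder, a hypothetical staged_files table, and an upload field named attachment; a cron script would then delete rows (and files) older than X hours that were never promoted on submit:

<?php
// Sketch: accept an upload, move it into a staging area immediately,
// and record it so a cron job can expire abandoned files later.
// Folder, table and field names are assumptions.
$staging = '/var/www/data/staging';

if (is_uploaded_file($_FILES['attachment']['tmp_name'])) {
    // Use a generated name so concurrent uploads cannot collide.
    $stored = uniqid('up_', true);
    move_uploaded_file($_FILES['attachment']['tmp_name'], "$staging/$stored");

    $pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $pdo->prepare(
        'INSERT INTO staged_files (stored_name, original_name, created_at)
         VALUES (?, ?, NOW())'
    )->execute(array($stored, $_FILES['attachment']['name']));
}
// On final submit: move the file out of staging and delete its staging row.
// The cron job removes staging rows (and files) with created_at older than X hours.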
