Removing draft file after saving permanently in Moodle File API - PHP

I'm using Moodle's filemanager element to get a file from the user and save it permanently, like this:
$fs = get_file_storage();
$pluginname = 'profile_field_fileupload';
$pluginfolder = 'profile_field_profileimage';
$draftitemid = file_get_submitted_draft_itemid($this->inputname);

if (empty($entry->id)) {
    $entry = new stdClass;
    $entry->id = $this->userid;
}

$context = context_user::instance($this->userid);

// Delete any previously saved files in this area.
$files = $fs->get_area_files($context->id, $pluginname, $pluginfolder, false, '', false);
foreach ($files as $file) {
    $file->delete();
}

file_save_draft_area_files($draftitemid, $context->id, $pluginname, $pluginfolder,
    $entry->id, array('subdirs' => false, 'maxfiles' => 1));
But the draft still exists.
How should I remove the draft after saving it?

Wait a few days - Moodle's cron process automatically cleans up old draft files (the delay is there to make sure you don't still have a copy of the form open and in use).
Remember that the draft area files are taking up no extra storage space on your server, as all files with identical content are stored only once, with multiple entries in the 'mdl_files' table all pointing to the same physical location on the server's hard disk.
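If you don't want to wait for cron, you can clear the draft area yourself once the files have been saved. A minimal sketch, assuming the standard file_storage::delete_area_files() API (draft files always live in the current user's context, under the 'user' component and 'draft' file area):
global $USER;
// Draft files are stored in the current user's context,
// component 'user', file area 'draft', keyed by the draft item id.
$usercontext = context_user::instance($USER->id);
$fs->delete_area_files($usercontext->id, 'user', 'draft', $draftitemid);
Call this after file_save_draft_area_files() has copied the files into their permanent area.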

Related

ZipArchive not saving file on live server in Laravel

I have some encrypted responses that I convert to a Zip file in my Laravel application. The function below downloads the API response, saves it as a Zip file, and then extracts it so that I can read the folder's contents. In my local environment it works well. However, the Zip file is not getting saved to the storage folder on the live server. No error is being shown, only an empty JSON response. Please, what could be the cause?
public function downloadZipAndExtract($publication_id, $client_id)
{
    /* We need to make the API call first */
    $url = $this->lp_store."clients/$client_id/publications/$publication_id/file";
    $file = makeSecureAPICall($url, 'raw');

    // Get file path. If the file already exists, just return.
    $path = public_path('storage/'.$publication_id);
    if (!File::isDirectory($path)) {
        Storage::put($publication_id.'.zip', $file);

        // Open the zip archive. Note: ZipArchive::open() returns true on
        // success or an integer error code on failure, so compare against
        // true rather than relying on a truthiness check.
        $localArchivePath = storage_path('app/'.$publication_id.'.zip');
        $zip = new ZipArchive();
        if ($zip->open($localArchivePath) !== true) {
            abort(500, 'Problems experienced while reading file.');
        }

        // Make a directory named after the publication_id,
        // then extract everything into it.
        Storage::makeDirectory($publication_id);
        $zip->extractTo(storage_path('app/public/'.$publication_id));
        $zip->close();

        // Delete the zip file after extracting.
        Storage::delete($publication_id.'.zip');
    }

    return;
}
First thing I'd check is whether the storage directory is created, and if it isn't, create it. Then I'd look at your file permissions and make sure the group and user permissions are correct and that you aren't persisting file permissions on creation. I've had many instances where the process that's creating files (or trying to) is not in the proper group and there is a sticky permission on the directory structure.
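To make a silent failure visible, a hedged debugging sketch you could drop in temporarily (the log message and checks are illustrative, not part of the original code):
// Storage::put() returns a boolean, so a failed write shows up here
// instead of passing silently.
$written = Storage::put($publication_id.'.zip', $file);
Log::info('zip write result', [
    'written'  => $written,
    'path'     => storage_path('app/'.$publication_id.'.zip'),
    'writable' => is_writable(storage_path('app')),
]);
If 'written' comes back false or 'writable' is false, the problem is the write itself (usually permissions or ownership on storage/app), not the ZipArchive step.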

PHP - Modification time of extracted files is set to server's time zone - How to change?

So our code creates a zip file and inside it are PDF's from a specific folder. Everything works fine, except that when this zip file is downloaded to a computer and the files are extracted - the modification time for each PDF file follows the server's time zone. (Our site is on a GoDaddy shared server, and they're located in Arizona. We are located in Singapore.)
Is there a way to modify this? I have tried to set the default time zone or use touch() but no luck.
// Initialize archive object
$zip = new ZipArchive();
$res = $zip->open(plugin_dir_path( __FILE__ ).'MortgageWise-All-Reports.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

if ($res === TRUE) {
    // Create recursive directory iterator
    /** @var SplFileInfo[] $files */
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($rootPath)
    );

    foreach ($files as $name => $file) {
        // Skip directories (they would be added automatically)
        if (!$file->isDir()) {
            // Get real and relative path for current file
            $filePath = $file->getRealPath();
            $relativePath = substr($filePath, strlen($rootPath));

            date_default_timezone_set("Asia/Singapore");
            //touch($filePath, );

            // Add current file to archive
            $zip->addFile($filePath, $relativePath);
        }
    }

    // Zip archive will be created only after closing object
    $zip->close();
} else {
    echo "File not created.";
}
I would start by trying to set the timezone at the start of the script, before you create the ZIP, rather than inside the foreach loop.
That said, I don't believe it will make much difference, since the PDF files already exist and will already have those timestamps assigned to them.
Do you have access to Web Host Manager or similar? If you have a VPS or dedicated server, you can change the time zone of the server yourself.
EDIT: Sorry, you already stated you are on a shared server. I would highly recommend a VPS to give you full control over these settings; they can be had for around $50 a month.
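One more option, if the host can run PHP 8.0 or later: ZipArchive can set each entry's modification time explicitly, so the stored timestamps no longer depend on the server's clock settings. A minimal sketch, assuming PHP 8.0+ (ZipArchive::setMtimeName() does not exist in earlier versions):
// Inside the loop, after adding the file:
$zip->addFile($filePath, $relativePath);
// Attach an explicit modification time to the entry (PHP >= 8.0);
// adjust the timestamp here if the extracted files should reflect
// a different zone than the server's.
$zip->setMtimeName($relativePath, filemtime($filePath));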

File rename no longer working

I have an update controller function in a Laravel resource controller that's pretty basic. It updates the title, body etc. in the DB as normal (no issues with data updates).
The issue is with images. My image directory for each project follows a slug-like naming convention based on the title of the project. I built the update function so that the directory and images would be renamed if the title was changed. This was working fine when I first built it, but it has suddenly stopped: the directory renames, but the files do not. Here is the relevant snippet from the controller:
$count = 0;
$slug = Input::get('title');
$prefix = Str::slug($slug);
$oldPrefix = $project->imagesprefix;
$path = public_path().'/img/projects/';
$directory = $path . $prefix;

rename($path.$oldPrefix, $directory);
// dumped $directory here and it's definitely set

$files = preg_grep('~\.(jpg)$~', scandir($directory));
// dumped $files here and the array is definitely set

foreach ($files as $file) {
    rename($directory.'/'.$file, $directory.'/'.$prefix.'-'.$count.'.jpg');
    ++$count;
}

rename($directory.'/thumbnails/'.$oldPrefix.'-thumb.jpg',
    $directory.'/thumbnails/'.$prefix.'-thumb.jpg');
All of this was working fine when tested this afternoon. Now it updates the DB and renames the directory folder, but fails to rename the files. I've dumped both $files and $directory before the loop and they are both set.
If file rename fails, it is quite often related to permissions.
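To narrow it down, a hedged diagnostic sketch (it mirrors the loop above; the error_get_last() reporting is an assumption about what you want to see, not part of the original code):
// Check each rename() result instead of assuming success;
// rename() returns false and raises a warning on failure.
foreach ($files as $file) {
    $from = $directory.'/'.$file;
    $to   = $directory.'/'.$prefix.'-'.$count.'.jpg';
    if (!@rename($from, $to)) {
        $err = error_get_last();
        dump("rename failed: $from -> $to", $err['message'] ?? 'unknown', fileperms($from));
    }
    ++$count;
}
If the dump shows a "Permission denied" message, check which user the web server runs as and whether it owns the files inside the renamed directory.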

Discover new files

I have a network storage device that contains a few hundred thousand mp3 files, organized by [artist]/[album] hierarchy. I need to identify newly added artist folders and/or newly added album folders programmatically on demand (not monitoring, but by request).
Our dev server is Windows-based, the production server will be FreeBSD. A cross-platform solution is optimal because the production server may not always be *nix, and I'd like to spend as little time on reconciling the (unavoidable) differences between the dev and production server as possible.
I have a working proof-of-concept that is Windows platform-dependent: using a Scripting.FileSystemObject COM object, I iterate through all top-level (artist) directories and check the size of each directory. If there is a change, the directory is explored further to find new album folders. As the directories are iterated, the path and file size are collected into an array, which I serialize and write to a file for next time. On a subsequent call, this array is used both to identify changed artist directories (new album added) and to identify completely new artist directories.
This feels convoluted, and as I mentioned it is platform-dependent. To boil it down, my goals are:
Identify new top-tier directories
Identify new second-tier directories
Identify new loose files within the top-tier directories
Execution time is not a concern here, and security is not an obstacle: this is an internal-only project using only intranet assets, so we can do whatever has to be done to facilitate the desired end result.
Here's my working proof-of-concept:
// read the cached list of artist folders
$folder_list_cache_file = 'seartistfolderlist.pctf';
$folder_list_cache = '';
// guard against the first run, when the cache file does not exist yet
if (file_exists($folder_list_cache_file)) {
    $fh = fopen($folder_list_cache_file, 'r');
    $folder_list_cache = fread($fh, filesize($folder_list_cache_file));
    fclose($fh);
}

$folder_list_cache = unserialize($folder_list_cache);
if (!is_array($folder_list_cache))
    $folder_list_cache = array();

// container arrays
$found_artist_folders = array();
$newly_found_artist_folders = array();
$changed_artist_folders = array();

$filesystem = new COM('Scripting.FileSystemObject');

$dir = "//network_path_to_folders/";
if ($handle = opendir($dir)) {
    // loop the directories
    while (false !== ($file = readdir($handle))) {
        // skip non-entities
        if ($file == '.' || $file == '..')
            continue;

        // make a key-friendly version of the artist name, skip invalids
        // ie 10000-maniacs
        $file_t = trim(post_slug($file));
        if (strlen($file_t) < 1)
            continue;

        // build the full path
        $pth = $dir.$file;

        // skip loose top-level files
        if (!is_dir($pth))
            continue;

        // attempt to get the size of the directory
        $size = 'ERR';
        try {
            $f = $filesystem->getfolder($pth);
            $size = $f->Size();
        } catch (Exception $e) {
            /* failed to get size */
        }

        // if the artist is not known, they are newly added
        if (!array_key_exists($file_t, $folder_list_cache)) {
            $newly_found_artist_folders[$file_t] = $file;
        } elseif (array_key_exists($file_t, $folder_list_cache) && $size != $folder_list_cache[$file_t]['size']) {
            // if the artist is known but the size is different, a new album was added
            $changed_artist_folders[] = $file;
        }

        // build a list of everything, along with file size, to write into the cache file
        $found_artist_folders[$file_t] = array(
            'path' => $file,
            'size' => $size
        );
    }
    closedir($handle);
}

// write the list to a file for next time
$fh = fopen($folder_list_cache_file, 'w') or die("can't open file");
fwrite($fh, serialize($found_artist_folders));
fclose($fh);

// deal with discovered additions and changes....
Another thing to mention: because these are MP3s, the sizes I'm dealing with are big. So big, in fact, that I have to watch out for PHP's integer size limits (directory sizes can overflow a 32-bit integer). The drive is currently at 90% utilization of 1.7TB (yes, SATA in RAID), and a new set of multi-TB drives will be added soon, only to be filled up in short order.
EDIT
I did not mention the database because I thought it would be a needless detail, but there IS a database. This script is seeking new additions to the digital portion of our music library; at the end of the code where it says "deal with discovered additions and changes", it is reading ID3 tags and doing Amazon lookups, then adding the new stuff to a database table. Someone will come along and review the new additions and screen the data, then it will be added to the "official" database of albums available for play. Many of the songs we're dealing with are by local artists, so the ID3 and Amazon lookups don't give the track titles, album name, etc. In that case, the human intervention is critical to fill in the missing data.
Simplest thing for the BSD side is a find script that simply looks for inodes with a ctime greater than the last time it ran.
Leave a sentinel file somewhere to 'store' the last run time, which you can do with a simple
touch /tmp/find_sentinel
and then
find /top/of/mp3/tree -cnewer /tmp/find_sentinel
which will produce a list of files/directories which have been changed since the find_sentinel file was touched. Running this via cron will get you regular updates, and the script doing the find can then digest the returned file data into your database for processing.
You could accomplish something similar on the Windows side with Cygwin, which would provide an identical 'find' app.
DirectoryIterator will help you walk the filesystem. You should consider putting the information in a database though.
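For a cross-platform take on that suggestion, a minimal sketch using RecursiveDirectoryIterator to flag anything modified since the last run (the sentinel file name and network path are placeholders):
// Compare each entry's mtime against a stored "last run" timestamp.
$sentinel = 'last_scan_time.txt';
$lastRun = file_exists($sentinel) ? (int) file_get_contents($sentinel) : 0;

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('//network_path_to_folders/', FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

$new_entries = array();
foreach ($iterator as $entry) {
    // A directory's mtime changes when entries are added or removed,
    // so this catches new artist folders, album folders and loose files.
    if ($entry->getMTime() > $lastRun) {
        $new_entries[] = $entry->getPathname();
    }
}

file_put_contents($sentinel, time());
// $new_entries now holds paths changed since the previous scan.
This sidesteps the COM dependency and the directory-size bookkeeping entirely, at the cost of a full tree walk per run (which you said is acceptable).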
I'd go with a solution that enumerates the contents of each folder in a MySQL database; your scan can quickly check against the contents listed in the database, and add entries that aren't already there. This gives you nice enumeration and searchability of the contents, and should be plenty fast for your needs.

PHP script that sends an email listing file changes that have happened in a directory/subdirectories

I have a directory with a number of subdirectories that users add files to via FTP. I'm trying to develop a PHP script (which I will run as a cron job) that will check the directory and its subdirectories for any changes in the files, file sizes, or dates modified. I've searched long and hard and have so far only found one script that works, which I've tried to modify - original located here - however, it only seems to send the first email notification showing me what is listed in the directories. It also creates a text file of the directory and subdirectory contents, but when the script runs a second time it seems to fall over, and I get an email with no contents.
Anyone out there know a simple way of doing this in PHP? The script I found is pretty complex and I've tried for hours to debug it with no success.
Thanks in advance!
Here you go:
$log = '/path/to/your/log.js';
$path = '/path/to/your/dir/with/files/';

$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);

$result = array();
foreach ($files as $file) {
    if (is_file($file = strval($file)) === true) {
        $result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
    }
}

// first run: seed the log file with the current snapshot
if (is_file($log) !== true) {
    file_put_contents($log, json_encode($result), LOCK_EX);
}

// are there any differences?
if (count($diff = array_diff($result, json_decode(file_get_contents($log), true))) > 0) {
    // send email with mail(), SwiftMailer, PHPMailer, ...
    $email = 'The following files have changed:' . "\n" . implode("\n", array_keys($diff));

    // update the log file with the new file info
    file_put_contents($log, json_encode($result), LOCK_EX);
}
I am assuming you know how to send an e-mail. Also, please keep in mind that the $log file should be kept outside the $path you want to monitor, for obvious reasons of course.
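In case it helps, a minimal sketch of that email step using PHP's built-in mail() (the recipient address is a placeholder):
// Send the change report; mail() returns false if the message
// was not accepted for delivery.
$to      = 'you@example.com';
$subject = 'File changes detected';
if (!mail($to, $subject, $email)) {
    error_log('Change-report email could not be sent.');
}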
After reading your question a second time, I noticed that you mentioned you want to check whether the files change. I'm only doing this check with the size and date of modification; if you really want to check whether the file contents are different, I suggest you use a hash of the file, so this:
$result[$file] = sprintf('%u|%u', filesize($file), filemtime($file));
Becomes this:
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), md5_file($file));
// or
$result[$file] = sprintf('%u|%u|%s', filesize($file), filemtime($file), sha1_file($file));
But bear in mind that this will be much more expensive, since the hash functions have to open and read all the contents of your 1-5 MB CSV files.
I like sfFinder so much that I wrote my own adaptation:
http://www.symfony-project.org/cookbook/1_0/en/finder
https://github.com/homer6/altumo/blob/master/source/php/Utils/Finder.php
Simple to use, works well.
However, for your use case, depending on the size of the files, I'd put everything in a git repository. It's easy to track changes then.
HTH