GitHub CI - move a file using PHP copy() in a PHPUnit test

I have an application written in PHP 8. I've added some Unit Tests for it using PHPUnit.
In one of the tests I am using PHP's copy() function to move a file from one location to another. This is done to test a download endpoint, by moving a "dummy" file into the "real" location that the file would occupy in the production application.
My test looks like this:
// tests/TestCase/Controller/DocsControllerTest.php
public function testDownload()
{
    $testFile = '75e57e4a-2149-4270-9d76-c7c8f0298c2c.pdf';
    // The destination must include the filename: copy() cannot write to a bare directory path
    copy('/full/path/to/testFiles/' . $testFile, '/webroot/docs/' . $testFile);

    // Download the file from the endpoint
    $id = 9; // File ID to download
    $this->get('/download/' . $id);

    // This should return an HTTP 200 response containing the PDF
    $this->assertResponseCode(200, 'Downloading a valid PDF should produce a 200 response.');
}
To explain the function above:
We have a test file called 75e57e4a-2149-4270-9d76-c7c8f0298c2c.pdf. This is a real PDF file with appropriate encoding.
We move the file, using copy(), from a directory where we hold some test files, into the full directory path where the production web application will really store the files (/webroot/docs/).
The remainder of the logic deals with downloading the file from the endpoint. $this->get() makes an HTTP GET request to an endpoint (/download/), passing in the appropriate file ID. The location of the file is looked up in a MySQL database and the file is then streamed to the browser, generating an HTTP 200 response containing the PDF.
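For context, a minimal sketch of what such an endpoint might look like ($this->get() and assertResponseCode() suggest CakePHP's integration test trait, but the controller, table, and field names below are assumptions, not the asker's actual code):
// Hypothetical sketch of the download action; names and lookup are illustrative.
public function download($id)
{
    $doc = $this->Docs->get($id);                    // look up the file record in MySQL
    $path = WWW_ROOT . 'docs' . DS . $doc->filename; // e.g. webroot/docs/<uuid>.pdf
    return $this->response->withFile($path);         // stream the PDF with an HTTP 200
}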
This works both when I run phpunit locally by executing vendor/bin/phpunit --filter testDownload:
PHPUnit 9.5.10 by Sebastian Bergmann and contributors.
Time: 00:05.053, Memory: 20.00 MB
OK (1 test, 16 assertions)
It also works in a browser, i.e. if I make a request to /download/9 I am served the appropriate PDF.
The problem I'm having is on GitHub. When I run the unit test there it fails the CI with this error:
Warning Error: copy(/home/runner/work/my-app/webroot/docs/75e57e4a-2149-4270-9d76-c7c8f0298c2c.pdf): Failed to open stream: No such file or directory
In [/home/runner/work/my-app/tests/TestCase/Controller/DocsControllerTest.php, line 745]
Given that this works locally I can't understand why this error is occurring. Is there some restriction with using copy() in GitHub's CI?
The directory and files at /full/path/to/testFiles/ are not .gitignore'd so they are committed with the rest of the repo code. So the test file, 75e57e4a-2149-4270-9d76-c7c8f0298c2c.pdf, exists within the codebase on GitHub.
I am using PHPUnit 9.5.10, PHP 8.0 on a Mac running macOS Monterey (12.2).

It clearly reads:
Failed to open stream: No such file or directory
Maybe add a .gitkeep file to the target directory docs, so that the empty directory is committed? It may also be that the source file is not there; you can usually take this error message literally. It's also unclear what $this->get() even is, or why your Mac would have anything to do with running a GitHub Action.
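A minimal sketch of a guard along those lines, assuming the failure is simply that the empty webroot/docs directory never makes it into the CI checkout (the paths are illustrative, not the asker's actual configuration):
// Hedged sketch: create the target directory if the checkout doesn't include it.
$targetDir = '/full/path/to/webroot/docs'; // illustrative path
if (!is_dir($targetDir)) {
    mkdir($targetDir, 0775, true); // recursive, in case parents are missing too
}
copy('/full/path/to/testFiles/' . $testFile, $targetDir . '/' . $testFile);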

Related

Where are these files? Twitter userstream via Phirehose

I have run the example scripts from Phirehose through the terminal and seen the live/active print_r of my test tweets. I've loaded up the ghetto queue files and they execute their logging with resounding success. But I can't seem to find where the data went. What file is it actually writing to? The two example files, ghetto-queue-collect.php and ghetto-queue-consume.php, lead me to believe it is the /tmp directory relative to where the scripts are executing, but I see nothing. Any suggestions?
Here are some of the lines from the logs:
[03-Mar-2014 02:44:43 America/New_York] Phirehose: Opening new active status stream: /tmp/.phirehose-ghettoqueue.current
[03-Mar-2014 02:45:12 America/New_York] Phirehose: Successfully rotated active stream to queue file: /tmp/phirehose-ghettoqueue.20140303-024512.queue
----- and ------
[03-Mar-2014 03:41:58 America/New_York] Processing file: /tmp/phirehose-ghettoqueue.20140303-024102.queue
[03-Mar-2014 03:41:59 America/New_York] Successfully processed 1 tweets from /tmp/phirehose-ghettoqueue.20140303-024102.queue - deleting.
-- The bits of code in question, I think --
/**
 * Subclass specific constants
 */
const QUEUE_FILE_PREFIX = 'phirehose-ghettoqueue';
const QUEUE_FILE_ACTIVE = '.phirehose-ghettoqueue.current';

public function __construct($username, $password, $queueDir = '/tmp', $rotateInterval = 10)
{
    // Set subclass parameters
    $this->queueDir = $queueDir;

    // Construct stream file name, log and open
    $this->streamFile = $this->queueDir . '/' . self::QUEUE_FILE_ACTIVE;
    $this->log('Opening new active status stream: ' . $this->streamFile);
    $this->statusStream = fopen($this->streamFile, 'a'); // Append if present (crash recovery)
}
My post to the author's GitHub suggested that the files are written to the actual /tmp directory of the machine running the script (I'm not sure exactly). In addition, the active file is hidden on Unix because of the leading '.'.
The data is literally being written to /tmp. If you're on a Mac or Linux machine, you can see these files by opening a terminal and running:
ls -la /tmp/
Alternatively, you can set $queueDir to whatever you want in the script.
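For instance, a hypothetical sketch of pointing the queue somewhere easy to find (GhettoQueueCollector is an illustrative subclass name, not Phirehose's actual class; the constructor signature comes from the excerpt above):
// Hypothetical sketch: pass a custom, non-hidden $queueDir to the subclass.
$queueDir = __DIR__ . '/queue'; // any writable directory you can watch
$collector = new GhettoQueueCollector($username, $password, $queueDir);
$collector->consume(); // Phirehose's main loop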
I was able to find them by using PuTTY.

What to check if PharData::buildFromDirectory fails to write contents of a file to a tar?

I have a background script which generates html files (each 100-500 KB in size) as a by-product, and when it has accumulated 500 of them, it packs them up in a .tar.gz and archives them. It ran non-stop for several weeks and generated 131 .tar.gz files until this morning, when it threw the following exception:
Uncaught exception 'PharException' with message 'tar-based phar
"E:/xampp/.../archive/1394109645.tar" cannot be created, contents of file
"58836.html" could not be written' in E:/xampp/.../background.php:68
The code responsible for archiving
$name = $path_archive . $set . '.tar';
$archive = new PharData($name);
$archive->buildFromDirectory($path_input); // <--- line 68
$archive->compress(Phar::GZ); // writes $name . '.gz'
unset($archive);
unlink($name); // remove the uncompressed .tar, keep the .tar.gz
array_map('unlink', glob($path_input . '*'));
What I've checked and made sure of so far:
I couldn't find anything irregular in the html file itself,
nothing else was touching this file during the process,
the script's timeout and memory limits were unlimited,
and there was enough spare memory and disk space.
What could be causing the exception and/or is there a way to get a more detailed message back from PharData::buildFromDirectory?
Env: Virtual XP (in VirtualBox) running portable XAMPP (1.8.2, PHP 5.4.25) in a shared folder of a Win7 host
I solved a similar problem after hours of bug-hunting today. It was caused by too little space on one partition of the disk. I had enough space on the partition where the tar.gz archive was created, but after removing some log files from another partition everything worked again.
I think it's possible that PharData stores some temporary data somewhere, which is why this happens even when there is enough space on the disk where you create the tar.gz archive.
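A minimal pre-flight check along these lines, assuming Phar may stage temporary data in the system temp directory (the 100 MB threshold is an assumption, not a measured figure):
// Hedged sketch: verify free space on the archive partition and the system
// temp directory before building; the threshold is illustrative.
$needed = 100 * 1024 * 1024;
foreach (array(dirname($name), sys_get_temp_dir()) as $dir) {
    if (disk_free_space($dir) < $needed) {
        throw new RuntimeException('Low disk space in ' . $dir);
    }
}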

Can't upload a file with Sahi / Mink / Behat in a Symfony2 application

I am using Mink and Sahi for my user interface tests in a Symfony2 application, but I can't manage to upload a file with Sahi.
My Sahi server is up and running:
[09:51:33] coil#ubuntu:~/Webdev/sahi/bin$ ./sahi.sh
--------
SAHI_HOME: ..
SAHI_USERDATA_DIR: ../userdata
SAHI_EXT_CLASS_PATH:
--------
Sahi properties file = /home/coil/Webdev/sahi/config/sahi.properties
Sahi user properties file = /home/coil/Webdev/sahi/userdata/config/userdata.properties
Added shutdown hook.
>>>> Sahi started. Listening on port: 9999
>>>> Configure your browser to use this server and port as its proxy
>>>> Browse any page and CTRL-ALT-DblClick on the page to bring up the Sahi Controller
-----
Reading browser types from: /home/coil/Webdev/sahi/userdata/config/browser_types.xml
-----
My step implementation:
// $element->getXpath() --> (//html/descendant-or-self::*[@id = 'attachment'])[1]
$element->attachFile($file);
Note here that if I use a file that is not in the /home/coil/Webdev/sahi/userdata directory, I get the following error:
$element->attachFile('toto');
error:_setFile2(_byXPath("(//html/descendant-or-self::*[@id = 'attachment'])[1]"), "toto")
Error: File not found: toto; Base directory is userdata directory: /home/coil/Webdev/sahi/userdata
Error: File not found: toto; Base directory is userdata directory: /home/coil/Webdev/sahi/userdata
at Sahi._setFile (http://dev.project.com/_s_/spr/concat.js:1398:12)
at Sahi._setFile2 (http://dev.project.com/_s_/spr/concat.js:1367:7)
at eval (eval at <anonymous> (http://dev.project.com/_s_/spr/concat.js:3480:14), <anonymous>:1:7)
at Sahi.ex (http://dev.project.com/_s_/spr/concat.js:3480:9)
at <anonymous>:1:11
<a href='/_s_/dyn/Log_getBrowserScript?href=null&n=-1'><b>Click for browser script</b></a>
So Sahi can "find" the file, since it doesn't raise any error with a valid, existing file. But when the form is submitted, the file is never uploaded by the Sahi proxy.
Other checks:
I removed the client-side HTML5 and JavaScript validation to be sure there is no side effect.
All my other Sahi tests are OK; only the 3 with an upload don't pass.
The proxy is set in my testing browser.
I can open the Sahi controller in the browser without problem.
Same problem on Mac OS X and Ubuntu.
Each time I run an upload test, I get a new entry in /userdata/temp/download named like sahi_11a83f8806be8046fc0aaa80eac076110b95__fr-fr-2-0.bdic
What is really weird is that I am sure those tests passed some time ago; something must have changed in my application or configuration that breaks the Sahi file upload, but I can't find what.
Also, the Sahi console used to show logs about the files it was uploading; now there is no log at all.
Use an absolute system path to which the Sahi server has access.
Is the URL to which the form is posted different from the URL of the web page? _setFile and _setFile2 take a third parameter which can be configured to point to the action URL (the URL to which the file should be attached by the Sahi proxy): http://sahi.co.in/w/_setFile
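A minimal Mink-side sketch combining both points; the file location is an assumption, and the 'attachment' field id comes from the XPath quoted above:
// Hedged sketch: attach the file with an absolute path readable by the Sahi server.
$file = '/home/coil/Webdev/sahi/userdata/files/test.pdf'; // assumed location
$element = $this->getSession()->getPage()->find('css', '#attachment');
$element->attachFile($file);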

PHP Copy File Intermittently Fails (Debian Squeeze, PHP 5.4, NGINX 1.2.6)

I have some code that copies a file to a temporary location where it is later included in a zip file.
I already have the source files stored in a local cache directory, and have also stored the SHA1 hash of the original files. The files in question are .png images, ranging from a few kb to around 500kb.
My problem is that at high server loads, the copy intermittently fails. Upon examining my logs, I see that even though a healthy file exists in the source location, the destination contains a file with zero bytes.
So, to try and figure out what was going on and to increase reliability, I implemented a SHA1 check of the destination file, so that if it fails, I can retry the copy using the shell.
99.9% of the time, the files copy with no issue. Occasionally, the first copy fails but the second attempt succeeds. In a small number of cases (around 1 in 2,500, and always at high server load), both copies fail. In nearly all of these cases the SHA1 of the destination file is da39a3ee5e6b4b0d3255bfef95601890afd80709, which is the hash of an empty file.
In every case, the script continues, and the created zip includes an empty image file. There is nothing in the Nginx, PHP or PHP-FPM error logs that indicates any problem. The script will copy the same file successfully when retried.
My stack is Debian Squeeze with the .deb PHP 5.4/PHP 5.4 FPM packages and Nginx 1.2.6 on an Amazon EBS backed AMI. The file system is XFS and I am not using APC or other caching. The problem is consistent and replicable at server loads >500 hits per second.
I cannot find any documentation of known issues that would explain this behaviour. Can anyone provide any insight into what may be causing this issue, or provide suggestions on how I can more reliably copy an image file to a temporary location for zipping?
For reference, here is an extract of the code used to copy / recopy the files (a retry-wrapper sketch follows it).
$copy = copy($cacheFile, $outputFile);
// Retry when copy() reports failure OR the destination hash doesn't match
if (!$copy || !file_exists($outputFile) || sha1_file($outputFile) !== $storedHash) {
    // Custom function to log debug messages
    dbug(array($cacheFile, sha1_file($cacheFile),
               $storedHash, $outputFile,
               file_exists($outputFile), filesize($outputFile)),
         'Corrupt Image File Copy from Cache 1 (native)');
    // Try again with the shell; escape the paths to be safe
    exec('cp ' . escapeshellarg($cacheFile) . ' ' . escapeshellarg($outputFile));
    if (!file_exists($outputFile) || sha1_file($outputFile) !== $storedHash) {
        dbug(array($cacheFile, sha1_file($cacheFile),
                   $storedHash, $outputFile,
                   file_exists($outputFile), filesize($outputFile)),
             'Corrupt Image File Copy from Cache 2 (shell exec)');
    }
}
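A hedged sketch of a more defensive wrapper around the same verify-and-retry idea; the attempt count and backoff delay are assumptions, not measured values:
// Hedged sketch: retry the copy with verification; 3 attempts and a
// 100 ms backoff are illustrative choices.
function copyVerified($src, $dst, $expectedSha1, $attempts = 3)
{
    for ($i = 0; $i < $attempts; $i++) {
        if (copy($src, $dst) && sha1_file($dst) === $expectedSha1) {
            return true;
        }
        usleep(100000); // back off briefly under load
    }
    return false;
}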

passing parameters to a bat (or vbs) in php

Objective: Use PHP to call a vbs that converts an xls/xlsx file to a csv.
Question: How can I pass a source file path and a destination file path to a vbs that converts xls/xlsx to csv and run that vbs in a PHP web application?
Details: I have a working VBS that takes a source file path and a destination file path and converts the xls/xlsx at the source path into a csv. I can execute it from the Windows command line and it does exactly what I want. I can also put the execution command into a bat file and run the bat file to achieve the same results. However, when I use exec()/shell_exec()/system() in PHP to execute the same command, no csv is created. (If I try to run the bat from PHP using system(), the contents of the bat file show up on the page; in fact, echo Conversion complete! prints "echo Conversion complete! Conversion complete.") I haven't seen any errors yet.
Note: I know about PHPExcel, I'd prefer not to use it.
excelToCsv.vbs
On Error Resume Next
If WScript.Arguments.Count < 2 Then
    WScript.Echo "Please specify the source and the destination files. Usage: ExcelToCsv <xls/xlsx source file> <csv destination file>"
    WScript.Quit
End If
csv_format = 6
Set objFSO = CreateObject("Scripting.FileSystemObject")
src_file = objFSO.GetAbsolutePathName(Wscript.Arguments.Item(0))
dest_file = objFSO.GetAbsolutePathName(WScript.Arguments.Item(1))
Dim oExcel
Set oExcel = CreateObject("Excel.Application")
Dim oBook
Set oBook = oExcel.Workbooks.Open(src_file)
oBook.SaveAs dest_file, csv_format
oBook.Close False
oExcel.Quit
batConverter.bat
excelToCsv.vbs conversionTestSourceMS2003.xls batTest.csv
echo Conversion Complete!
index.phtml
<?php
system("cmd /c batConverter.bat");
?>
Note: All of the above files (along with conversionTestSourceMS2003.xls) are in the same directory. I have not implemented any way to pass the parameters, since I can't get it to work even when everything is hard-coded... (a parameter-passing sketch follows the set-up note below).
Set Up: PHP5, Zend Framework, WAMP, Windows 7 (localhost).
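As a hedged sketch of the parameter passing the question asks about, one could call the VBS directly with cscript and quote both paths; the paths are placeholders, and it assumes cscript.exe is on the PATH:
// Hedged sketch: invoke the VBS directly and pass both paths as arguments.
$src = 'C:\full\path\to\conversionTestSourceMS2003.xls'; // placeholder
$dst = 'C:\full\path\to\batTest.csv';                    // placeholder
exec('cscript //nologo excelToCsv.vbs ' . escapeshellarg($src) . ' ' . escapeshellarg($dst), $output, $status);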
For the sake of simplicity, I merged everything into a single ASP page. This should let me reproduce a similar problem in IIS, and since it is a single ASP script, I can see the error more directly. My test machine is running Windows Vista SP2 on IIS7 with Excel 2007 SP3.
excelToCsv.asp
<%
Option Explicit
Dim csv_format, src_file, dest_file, strPath, objFSO
csv_format = 6
src_file = "conversionTestSourceMS2003.xls"
dest_file = "testbat.csv"
strPath = "[HARDCODED PATH HERE]\"
src_file = strPath & src_file
dest_file = strPath & dest_file
Dim objExcel, objBook
Set objExcel = CreateObject("Excel.Application")
Set objBook = objExcel.Workbooks.Open(src_file)
objBook.SaveAs dest_file, csv_format
objBook.Close False
Response.Write "Conversion Complete!"
objExcel.Quit
%>
When running this code, I got a generic ASP error. So I enabled detailed error messages in ASP, and got the following error...
Microsoft Office Excel error '800a03ec'
Microsoft Office Excel cannot access the file '[HARDCODED PATH
HERE]\conversionTestSourceMS2003.xls'. There are several possible
reasons: • The file name or path does not exist. • The file is being
used by another program. • The workbook you are trying to save has the
same name as a currently open workbook.
/temp/ExcelToCsv.asp, line 18
Now, this is not Apache, but I do believe the problem is related to yours. This error implies there is a security/permission problem where Excel cannot do what it needs to do to access or read the file. In fact, I encountered similar errors when I was executing the VBScript (and passing the error up the chain) from PHP (in IIS).
I believe it can be resolved by changing the Windows User being used to create the process. This can be configured in Services.msc by editing the Apache service and changing the Log On tab to an actual Windows user instead of a Service Account. I have not tested it yet, though, since setting up Apache is not something I can do right now.
