I have a script where a user can attach a file to a record. The file is stored separately (not in the database). When the user does not attach a file and clicks "Save record", it works fine. When the user attaches a file, the script performs these steps:
The file uploads correctly.
The script fetches the record's details from the database using Idiorm.
The script updates the "filesize" field on the record (I tested the script without this step - same result).
The script tries to save the record, and I get this:
PDOException
Code: HY000
Message: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away
File: /home/../includes/idiorm.php
Line: 1675
How can it be solved?
Update:
I found out a few things:
The script returns "General error: 2006 MySQL server has gone away" only when the uploaded file is bigger than 20 MB and I try to update the database with Idiorm_record->save().
The script does not return "General error: 2006 MySQL server has gone away" when the uploaded file is bigger than 20 MB and I do not try to update the database.
I can upload a file bigger than 20 MB and run the query generated by Idiorm_record->save() with Idiorm::raw_exec() and catch no error.
Does that mean the problem is connected with Idiorm?
The problem was the wait_timeout setting in MySQL. But it is still interesting why plain SQL works fine while an update via the save method on an Idiorm object causes a delay long enough that MySQL goes away.
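A minimal sketch of that fix, assuming Idiorm's standard configuration API; the DSN, credentials, table name, and timeout value here are illustrative:

<?php
require_once 'idiorm.php';

// Illustrative connection settings - replace with your own.
ORM::configure('mysql:host=localhost;dbname=mydb');
ORM::configure('username', 'user');
ORM::configure('password', 'secret');

// Raise the session wait_timeout right after connecting, so the
// connection opened before the slow upload handling is still alive
// when save() runs. Pick a value longer than your largest upload takes.
ORM::raw_execute('SET SESSION wait_timeout = 600');

// ... handle the file upload (slow for files over 20 MB) ...

// The update no longer hits a timed-out connection.
$record = ORM::for_table('records')->find_one($record_id);
$record->filesize = filesize($uploaded_path); // hypothetical path variable
$record->save();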
Related
I just made a script in CodeIgniter which fetches emails and displays them with PHP IMAP.
It fetches 500 mails at once with no problem.
If I try to fetch about 2000 mails at once, it throws an error.
A Database Error Occurred
Error Number: 2013
Lost connection to MySQL server during query
SELECT GET_LOCK('77dfd862ae2b7bedaec521bc4c3651952b56e6c9', 300) AS ci_session_lock
Filename: libraries/Session/drivers/Session_database_driver.php
Line Number: 358
This project is live on HostGator. I am not sure whether it has MySQL Workbench or not.
I have no idea where it comes from or why.
I have done nothing with MySQL in CodeIgniter before.
I do not know which file to edit or where to start. Please help.
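That query comes from CodeIgniter 3's database session driver, which takes a MySQL lock for each request; a long IMAP fetch can outlive the connection holding that lock. One hedged workaround is to switch sessions to the files driver in application/config/config.php, so no database lock is involved at all (the save path below is illustrative and must be writable):

<?php
// application/config/config.php
// Store sessions on disk instead of in MySQL, so a long-running
// IMAP fetch no longer depends on a database session lock.
$config['sess_driver']     = 'files';
$config['sess_save_path']  = APPPATH . 'cache/sessions/'; // illustrative path
$config['sess_expiration'] = 7200;
$config['sess_match_ip']   = FALSE;

On shared hosting like HostGator you typically cannot raise MySQL's own timeouts, which makes the files driver the simpler change to try first.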
My PHP site is not working on an Apache server. It works fine on localhost, but after uploading it to the server it shows me blank pages. Here is my error log file:
[28-Feb-2018 04:43:16 America/New_York] PHP Fatal error: Uncaught exception 'mysqli_sql_exception' with message 'Table 'combejcj_waqar-accounts.addsupplier' doesn't exist' in /home/combejcj/waqar.combitpos.com/waqar/public/home.php:24
Stack trace:
#0 /home/combejcj/waqar.combitpos.com/waqar/public/home.php(24): mysqli_query(Object(mysqli), 'SELECT fullname...')
#1 {main}
thrown in /home/combejcj/waqar.combitpos.com/waqar/public/home.php on line 24
Considering the error message, it says that there is no table with the name combejcj_waqar-accounts.addsupplier.
So, there can be 2 problems:
1) You forgot to upload your DB table.
2) You are on a Linux environment, which means you are in a case-sensitive environment, and in your code you are doing e.g. an insert on combejcj_waqar-accounts.addsupplier but the table name has capital first letters, for example combejcj_waqar-Accounts.Addsupplier.
So, make sure that the table name on the server is exactly the same as the one you are using in your code; you can list the actual names with the snippet below.
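A minimal check, assuming you already have a working mysqli connection (the $conn variable is hypothetical):

<?php
// Print every table the server actually has in the current database,
// so you can compare the exact casing against the name in your query.
$result = mysqli_query($conn, 'SHOW TABLES');
while ($row = mysqli_fetch_row($result)) {
    echo $row[0], PHP_EOL;
}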
The error is telling you what the problem is. It's saying that your MySQL table combejcj_waqar-accounts.addsupplier doesn't exist. Did you migrate the MySQL database when you changed over to the server?
The table does not exist in the database.
You have 2 possible solutions:
- Export the local database and import it on the live server (see the commands sketched below).
- Create the table on the live server.
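For the first option, the usual command-line route is a dump and an import; the user and database names here are placeholders:

# On your local machine: export the database to a dump file.
mysqldump -u local_user -p local_db > dump.sql

# On the live server: import the dump into the live database.
mysql -u live_user -p live_db < dump.sql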
blueimp and PHP
I am using the blueimp upload library to upload my CSV files, which contain a lot of data; after the upload, the data is inserted into the database. I am trying to upload about 10 files at a time, and this gives me an error.
XAMPP is killed and I get the error:
"mysqli_connect(): (HY000/2002): An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full."
I observed that blueimp is trying to upload 6 files at a time. How can I make it process 1 file at a time?
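blueimp's jQuery File Upload has documented options for exactly this; a minimal sketch of the client-side initialization, where the selector and upload URL are illustrative:

// Initialize blueimp jQuery File Upload to send files one at a time.
// sequentialUploads queues the XHR uploads instead of firing them in
// parallel; limitConcurrentUploads is the more general knob.
$('#fileupload').fileupload({
    url: '/server/php/',         // illustrative upload endpoint
    sequentialUploads: true      // one file at a time
    // limitConcurrentUploads: 1 // equivalent alternative
});

Uploading one file at a time also spaces out the database inserts, which should relieve the socket buffer errors caused by many simultaneous mysqli connections.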
Has anyone encountered a problem with excel_reader2 where the script gets aborted by a large number of rows, for example over 60k rows in the Excel file? I just get an error message in the error log: "Aborted". That's all. I have more files on my server and the script takes them one by one, but when I get to the second one the message comes and the script stops. It's PHP 5.4.7, by the way.
I am not sure about the problem, but PHPExcel does not really like large Excel files; that may be the cause. Please try to limit the data by reading only chunks; it may help (a sketch follows below).
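A minimal sketch of chunked reading with PHPExcel's read-filter mechanism, assuming an .xlsx file; the file name, row count, and chunk size are illustrative:

<?php
require_once 'PHPExcel.php';

// Read filter that only admits rows inside the current chunk, so the
// whole 60k-row sheet never has to sit in memory at once.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always keep the heading row, plus the rows in the chunk.
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$chunkSize = 2000; // illustrative chunk size
$filter    = new ChunkReadFilter();
$reader    = PHPExcel_IOFactory::createReader('Excel2007');
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= 60000; $startRow += $chunkSize) {
    $filter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load('bigfile.xlsx'); // hypothetical file name
    // ... process this chunk of rows ...
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel); // free the memory before the next chunk
}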
Okay, so I'm trying to retrieve 130 XML files from a feed and then insert values into a database. The problem is, it crashes at around 40-60 entries and doesn't give an error. I timed it, and the script runs for around 13 seconds each time.
I checked my php.ini settings and they are...
Memory limit = 128M
Time limit = 30 seconds
So what is causing this error? When I run the script in Firefox it just displays a white screen.
EDIT - The error I'm getting in Chrome is:
"Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data"
Have you checked the memory consumption? Also, could you output something to the screen to see that it's reading in the files? Add error handling around the read statement to see if it's failing to parse the XML; a sketch is below.
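A hedged sketch of that instrumentation, assuming the feeds are fetched over HTTP and parsed with SimpleXML; the feed URL pattern is illustrative:

<?php
// Surface fetch/parse failures and watch memory while looping the
// feeds, instead of dying silently with an empty response.
libxml_use_internal_errors(true);
set_time_limit(0); // rule out the 30-second limit while testing

for ($i = 1; $i <= 130; $i++) {
    $url = "http://example.com/feed/{$i}.xml"; // illustrative feed URL
    $raw = file_get_contents($url);
    if ($raw === false) {
        error_log("Fetch failed for {$url}");
        continue;
    }

    $xml = simplexml_load_string($raw);
    if ($xml === false) {
        foreach (libxml_get_errors() as $err) {
            error_log("XML parse error in {$url}: " . trim($err->message));
        }
        libxml_clear_errors();
        continue;
    }

    // ... insert the values into the database here ...

    error_log("Feed {$i} done, memory: " . memory_get_usage(true) . " bytes");
}

If the logged memory climbs steadily, the parsed documents are being held onto somewhere; if the log simply stops, the web server rather than PHP is likely cutting the connection.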