I'm using Laravel on top of SQL Server and I'm trying to store PHP sessions in the database. I created the sessions table as described at http://laravel.com/docs/session#database-sessions, but I am getting the following error when loading a page:
PDOException was thrown when trying to read the session data: SQLSTATE[22001]: [Microsoft][SQL Server Native Client 11.0][SQL Server]String or binary data would be truncated.
Update:
I fixed this by creating the table manually:
CREATE TABLE portal_sessions (
id VARCHAR(255) PRIMARY KEY NOT NULL,
last_activity INT NOT NULL,
payload TEXT NOT NULL);
Make sure that your payload column is large enough to hold all of your session data; on SQL Server I'd suggest making it VARCHAR(MAX).
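If you already created the table from the documentation, you can widen the existing column instead of recreating the table. A minimal sketch, assuming the table is named portal_sessions as above:

```sql
-- Widen the payload column so large serialized sessions fit
ALTER TABLE portal_sessions ALTER COLUMN payload VARCHAR(MAX) NOT NULL;
```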
Related
I have a fairly elaborate multi-page query form. Actually, my site has several, for querying different data sets. Because the query parameters span multiple page requests, I rely on sessions to store the accumulated parameters. I'm concerned that the session data, when serialized, might exceed the 65,535-byte capacity of the MySQL BLOB data column specified by the CodeIgniter session documentation:
CREATE TABLE IF NOT EXISTS `ci_sessions` (
`id` varchar(128) NOT NULL,
`ip_address` varchar(45) NOT NULL,
`timestamp` int(10) unsigned DEFAULT 0 NOT NULL,
`data` blob NOT NULL,
KEY `ci_sessions_timestamp` (`timestamp`)
);
How can I store my user-entered query parameters and be sure that they will be preserved for a given user?
I considered using file-based-caching to cache this data with a key generated from the session ID:
// controller method
public function my_page() {
    // check POST for incoming query params and validate them
    $validated_query_params = $this->input->post();

    // session library is auto-loaded,
    // but apparently a new session id is generated every five minutes by default?
    $cache_key = "query_params_for_sess_id" . $this->session->session_id;

    $this->load->driver('cache');

    // cache for an hour
    $this->cache->file->save($cache_key, $validated_query_params, 3600);
}
However, I worry that the session ID might change when a new session ID gets generated for a given user. Apparently this happens by default every five minutes as CodeIgniter generates new session IDs to enhance security.
Can anyone suggest a tried-and-true (and efficient!) means of storing session data that exceeds the 64K BLOB size?
You could use MEDIUMBLOB, which supports up to 16MB, or LONGBLOB which supports up to 4GB.
See https://dev.mysql.com/doc/refman/8.0/en/string-type-overview.html
Also, if you declare your BLOB with a length, like BLOB(2000000) (or whatever length you need), MySQL will automatically promote it to a data type that can hold that much data. For example, BLOB(2000000) implicitly becomes MEDIUMBLOB:
mysql> create table t ( b blob(2000000) );
mysql> show create table t\G
*************************** 1. row ***************************
Table: t
Create Table: CREATE TABLE `t` (
`b` mediumblob
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4
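Applied to the CodeIgniter table above, the fix is a one-line ALTER. A sketch, assuming the ci_sessions table name from the documentation:

```sql
-- Promote the 64K BLOB to a 16MB MEDIUMBLOB
ALTER TABLE `ci_sessions` MODIFY `data` MEDIUMBLOB NOT NULL;
```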
I'm running my web application on a GoDaddy shared hosting server. The PHP version is 7.1 and the MySQL version is 5.6.
When I run the following query:
ALTER TABLE aos_products_quotes
modify COLUMN discount varchar(255) DEFAULT 'Percentage' NULL,
modify COLUMN parent_type varchar(255) NULL,
modify COLUMN parent_id char(255) NULL;
the server shows:
MySQL #1071 error: Specified key was too long
I think I need MySQL 5.7, or else I need to set the global innodb_large_prefix variable to 1, but changing global settings requires SUPER privileges, and GoDaddy says that is impossible on shared hosting.
I've tried my best to resolve this, so any guidance would be appreciated.
Thanks in advance...
There are several options to fix this:
- Change the character set of the columns so that they use fewer bytes of storage (with utf8mb4, a VARCHAR(255) index entry needs up to 1,020 bytes, which is over the 767-byte InnoDB index prefix limit in MySQL 5.6).
- Change or remove the table's indexes; you may have a composite index whose limit is exceeded after this alteration.
- Use shorter lengths for the VARCHAR() columns.
Sharing the complete table structure would help us help you better.
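One concrete fix along those lines, as a sketch: shorten the column that pushes the index past the 767-byte limit, or index only a prefix of it. The index name idx_parent_id is illustrative, and CHAR(36) assumes the IDs are 36-character GUIDs (as they typically are in SuiteCRM), so verify that against your data first:

```sql
-- Option 1: shorten the column so a utf8mb4 index entry fits in 767 bytes
ALTER TABLE aos_products_quotes
  MODIFY COLUMN parent_id CHAR(36) NULL;

-- Option 2: keep the width and index only the first 191 characters
-- (191 * 4 bytes = 764 bytes < 767)
ALTER TABLE aos_products_quotes
  ADD INDEX idx_parent_id (parent_id(191));
```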
I'm developing a project with a small client area using the Silex framework. I want to store sessions in the database using the SessionServiceProvider and PdoSessionHandler. When I try to log in with a test account, the session is written to the database, but the login does not complete and I get the login page in a loop. I'm also getting the following error in the error log:
Uncaught exception 'PDOException' with message 'SQLSTATE[22021]: Character not in repertoire: 7 ERROR: invalid byte sequence for encoding "UTF8"
The sessions table looks like this (got it from the Silex documentation page):
CREATE TABLE sessions (
sess_id VARCHAR(255) NOT NULL,
sess_value TEXT NOT NULL,
sess_time INTEGER NOT NULL,
PRIMARY KEY(sess_id)
);
Can anyone help?
I ran into the same error some time ago; it looks like the Silex documentation is incorrect for PostgreSQL. The sess_value field should be of type BYTEA, because session data can contain bytes that are not valid UTF-8, such as the NULL byte 0x00.
Also, since Silex uses the Symfony2 components for storing sessions in the database, another field should be added: sess_lifetime, which holds the lifetime of the database session.
So the definition should be:
CREATE TABLE sessions (
sess_id VARCHAR(255) NOT NULL,
sess_value BYTEA NOT NULL,
sess_time INTEGER NOT NULL,
sess_lifetime INTEGER NOT NULL,
PRIMARY KEY(sess_id)
);
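If the table already exists with the TEXT column from the docs, it can be converted in place. A PostgreSQL sketch, assuming the table and column names above:

```sql
-- Convert the session payload column to BYTEA so raw bytes are accepted
ALTER TABLE sessions
  ALTER COLUMN sess_value TYPE BYTEA USING convert_to(sess_value, 'UTF8');

-- Add the lifetime column expected by the Symfony2 session handler
ALTER TABLE sessions
  ADD COLUMN sess_lifetime INTEGER NOT NULL DEFAULT 0;
```

The DEFAULT 0 is only there so existing rows satisfy NOT NULL; it may be simplest to clear out old sessions before migrating anyway.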
I have a Joomla (PHP) website with an existing hosted MySQL database.
I have a Google Cloud SQL Instance with some statistical data in.
I need to query the data across both databases and would like the query to run on the Google Cloud SQL instance.
My research so far has led me to believe that the best way to do this is to create a FEDERATED table inside the Google Cloud SQL database, but when I attempt this I don't get the results I expect (nor do I get an error?!).
Joomla MySQL table:
CREATE TABLE test_table (
id INT(20) NOT NULL AUTO_INCREMENT,
name VARCHAR(32) NOT NULL DEFAULT '',
other INT(20) NOT NULL DEFAULT '0',
PRIMARY KEY (id),
INDEX name (name),
INDEX other_key (other)
)
ENGINE=MyISAM
DEFAULT CHARSET=latin1;
Google Cloud SQL:
CREATE TABLE federated_table (
id INT(20) NOT NULL AUTO_INCREMENT,
name VARCHAR(32) NOT NULL DEFAULT '',
other INT(20) NOT NULL DEFAULT '0',
PRIMARY KEY (id),
INDEX name (name),
INDEX other_key (other)
)
ENGINE=FEDERATED
DEFAULT CHARSET=latin1
CONNECTION='mysql://*uid*:*pwd*@*joomla_server_ip*:3306/*database_name*/test_table';
where *uid*, *pwd*, *joomla_server_ip* and *database_name* are all valid values.
Both statements execute fine with no errors, but after inserting data to *test_table* on Joomla I am unable to see any data in *federated_table* on Google Cloud SQL.
I have tried the federated table creation using both the command line tool (Windows) and using the SQuirrel SQL JDBC client.
Because I am seeing no errors whatsoever, I'm not sure whether the problem is at the Joomla database end or the Google Cloud SQL end, so any help will be greatly appreciated. I am assuming the problem is with the connection between the two databases, but I'm open to any other theories you may throw at me.
EDIT:
I'm now using a different client to connect (MySQL Workbench) and this reports an error when trying to do the same thing
1286 Unknown storage engine 'FEDERATED' 1266 Using storage engine InnoDB for table 'federated_table'
Shortly after asking this question Google added the MySQL Wire Protocol to Google Cloud SQL.
http://googlecloudplatform.blogspot.co.uk/2013/10/google-cloud-sql-now-accessible-from-any-application-anywhere.html
It is now possible to create Federated tables in the normal way.
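If you hit the "Unknown storage engine 'FEDERATED'" error above on another MySQL server, you can check whether the FEDERATED engine is available before creating the table. A sketch:

```sql
-- Look for the row with Engine = 'FEDERATED'; Support must be YES
SHOW ENGINES;
```

The FEDERATED engine is disabled by default; if it shows as DISABLED, it has to be enabled at server startup (e.g. by adding federated under [mysqld] in my.cnf), as it cannot be turned on at runtime.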
I'm making a web application that is 100% JavaScript and canvas, no forms at all. I have a JS array of data and I'm wondering how I could pass it to a PHP script so that it gets loaded into the database. Suggestions?
My thoughts are to keep it simple. If you are looking to store arrays by key/value pair, i.e. a flat store with no relationships between tables, then I would do the following:
Create a MySQL database with one table holding a key column and a value column:
CREATE TABLE `data` (
`data_id` INT(10) NOT NULL AUTO_INCREMENT,
`data_key` CHAR(50) NULL,
`data_value` TEXT NULL,
`datemodified` TIMESTAMP NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
`datecreated` DATETIME NULL,
PRIMARY KEY (`data_id`),
UNIQUE INDEX `data_key` (`data_key`)
)
COLLATE='latin1_swedish_ci'
ENGINE=MyISAM
ROW_FORMAT=DEFAULT
Then create a PHP script that accepts a POST of two variables: a key, and a value (the value would be an object in JavaScript).
If you post only a key, the script should return the value (in JSON format so JavaScript can interpret it into an object).
If you post a key and a value, the script should do an INSERT IGNORE and return the data_id. The value will already be JSON-encoded if posted from JavaScript (otherwise run it through json_encode()), and it gets stored under the key.
You could also make an optional third way of accessing the data using the data_id value.
Just a thought... let us know how much PHP experience you have and whether you need specific details on which functions to use.
Security would also be a factor to consider. In this case, you might want JavaScript to generate a unique session id and then add that session_id to the table, so that users can only access their own data. I'm not really sure how your app works at this stage, though, so sorry I can't suggest something more secure.
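The write path above can be sketched in SQL against the data table defined earlier. The key 'player_state' and the JSON payload are illustrative; this sketch uses INSERT ... ON DUPLICATE KEY UPDATE rather than INSERT IGNORE so that re-posting an existing key updates its value instead of being silently dropped:

```sql
-- The UNIQUE index on data_key makes the ON DUPLICATE KEY clause
-- fire when the key already exists
INSERT INTO `data` (data_key, data_value, datecreated)
VALUES ('player_state', '{"score":42}', NOW())
ON DUPLICATE KEY UPDATE data_value = VALUES(data_value);

-- Read path: post a key, get back the JSON value
SELECT data_value FROM `data` WHERE data_key = 'player_state';
```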
Create a web service in PHP that connects to the MySQL database; your JS code would then make calls to this web service to save to the database.
You can send it as a JSON string to a PHP script using AJAX, then decode the JSON and insert the data into the database. Have the PHP script return something so you can confirm it's done.