I connect the normal way:
$dbh = ibase_connect($host, $username, $password) OR die("could not connect");
Then I run a query:
ibase_query($dbh, 'ALTER TABLE USERS ADD OLDUSERPASS VARCHAR(32) COLLATE NONE') or die(ibase_errmsg());
Directly after this I run:
ibase_query($dbh, 'UPDATE USERS SET OLDUSERPASS = USERPASS') or die(ibase_errmsg());
It complains:
Column unknown OLDUSERPASS At line 1
But when I look in the DB, the column has been created. So for some reason, in the split second after running the ALTER, the change is apparently not yet committed on the server.
Any ideas why?
Try calling ibase_commit($dbh) after the ALTER statement.
In Firebird, DDL is under transaction control, and you are not allowed to use newly created objects (tables, columns, etc.) within the same transaction. So you will first need to commit the transaction before executing a query that uses that object.
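Putting the pieces from the question together, a minimal sketch of the fix (connection parameters are placeholders from the question):

```php
<?php
// Hypothetical connection parameters -- adjust to your setup.
$dbh = ibase_connect($host, $username, $password) or die('could not connect');

// The DDL runs inside the current transaction.
ibase_query($dbh, 'ALTER TABLE USERS ADD OLDUSERPASS VARCHAR(32) COLLATE NONE')
    or die(ibase_errmsg());

// Commit so the new column becomes visible to subsequent statements.
ibase_commit($dbh) or die(ibase_errmsg());

// Now the new column can be referenced.
ibase_query($dbh, 'UPDATE USERS SET OLDUSERPASS = USERPASS')
    or die(ibase_errmsg());
```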
The structure of the table I'm trying to reach is as such:
Database: INTERNAL_STUFF
Schema: INTERNAL_TEST
Table: TEST_TABLE_SIMPLE
I create a PDO as such:
$dbh = new PDO("snowflake:account=$this->account", $this->user, $this->password);
$dbh->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );
If I use this query:
$result = $dbh->query("SELECT * FROM INTERNAL_STUFF.INTERNAL_TEST.TEST_TABLE_SIMPLE");
I end up getting this response: 'Schema 'INTERNAL_STUFF.INTERNAL_TEST' does not exist or not authorized.'. So it appears to be treating the combined database and schema as just a schema name.
If I use the same query but drop the database from the front:
$result = $dbh->query("SELECT * FROM INTERNAL_TEST.TEST_TABLE_SIMPLE");
I end up getting this response - 'SQLSTATE[22000]: Data exception: 90105 Cannot perform SELECT. This session does not have a current database. Call 'USE DATABASE', or use a qualified name.'
What am I doing wrong here? My user has access to the correct role to view the table, and that exact query (the longer of the two) works just fine in a Snowflake Worksheet.
You may also set the default namespace (database and schema) for your user using the ALTER USER statement.
Details: https://docs.snowflake.com/en/sql-reference/sql/alter-user.html#usage-notes
Can you execute the query USE DATABASE INTERNAL_STUFF and then USE SCHEMA INTERNAL_TEST before you execute your main SQL query?
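A sketch of that suggestion using the DSN style from the question (account and credentials are placeholders):

```php
<?php
// Assumed connection from the question; account/user/password are placeholders.
$dbh = new PDO("snowflake:account=$account", $user, $password);
$dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Establish the session's current database and schema first...
$dbh->exec('USE DATABASE INTERNAL_STUFF');
$dbh->exec('USE SCHEMA INTERNAL_TEST');

// ...then an unqualified query should resolve against that namespace.
$result = $dbh->query('SELECT * FROM TEST_TABLE_SIMPLE');
```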
How to create a table using createCommand in Yii?
I've followed the instructions at http://www.yiiframework.com/doc/api/1.1/CDbCommand but I am still confused. Can you give an example of creating a table using createCommand in Yii?
Thank you.
Assuming you've named your database connection string db (in protected/config/main.php) as follows,
'components'=>array(
    'db'=>array(
        // your connection settings go here (connectionString, username, password, ...)
    ),
),
you should be able to create a table using the following command:
$sqlQuery = "CREATE TABLE IF NOT EXISTS pet (name VARCHAR(20), owner VARCHAR(20))";
$sqlCommand = Yii::app()->db->createCommand($sqlQuery);
$sqlCommand->execute();
You should also be able to replace the first line with any SQL statement and execute it successfully. If you want to query for data, you will need to use queryRow, queryColumn, or queryScalar (as defined in the documentation).
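For instance, a small hypothetical read against the pet table created above (the queryScalar/queryRow calls follow the CDbCommand API linked in the question):

```php
// Count rows with queryScalar (returns the first column of the first row).
$count = Yii::app()->db
    ->createCommand("SELECT COUNT(*) FROM pet")
    ->queryScalar();

// Fetch a single row as an associative array, with a bound parameter.
$row = Yii::app()->db
    ->createCommand("SELECT name, owner FROM pet WHERE name = :name")
    ->queryRow(true, array(':name' => 'Rex'));
```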
Hope this helps!
I used this code with success:
Yii::app()->db->createCommand("CREATE TABLE {$nama_data} (
    id serial,
    {$list_field},
    x text,
    y text,
    wkt text,
    the_geom geometry,
    PRIMARY KEY (id)
);")->query();
I am using PHP PDO to access a PostgreSQL database with various schemas, so first I create a connection and then I set the correct schema, like below:
$Conn = new PDO('pgsql:host=localhost;port=5432;dbname=db', 'user', 'pass');
$result = $Conn->exec('SET search_path TO accountschema');
if ($result === false) {
    // Note: PDO::exec() returns 0 (which is falsy) for a successful SET,
    // so compare strictly against false. Also, the method is errorInfo(), not errorMsg().
    die('Failed to set schema: ' . implode(' ', $Conn->errorInfo()));
}
Is this a good practice? Is there a better way to do this?
In order to specify the default schema, setting the search_path is the way to go.
$Conn->exec('SET search_path TO accountschema');
You can also set the default search_path per database user and in that case the above statement becomes redundant.
ALTER USER user SET search_path TO accountschema;
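If you would rather not issue an extra statement on every connection, libpq also accepts the search_path through the options keyword of the DSN; a hedged sketch (credentials are placeholders, and this relies on libpq's options parameter being passed through):

```php
// Assumed alternative: set search_path at connect time via libpq's options.
$Conn = new PDO(
    "pgsql:host=localhost;port=5432;dbname=db;options='--search_path=accountschema'",
    'user',
    'pass'
);
```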
I've been searching Google a lot for this issue and really found nothing. People just keep copying the MySQL documentation on last_insert_id and nothing else.
I've got an issue with last_insert_id, because in both situations (PHP & SQL) it returns 0.
YES: I've set a PRIMARY & UNIQUE field with AUTO_INCREMENT
YES: I've done some inserting before
NO: Making a double query with INSERT and SELECT LAST_INSERT_ID() doesn't work either.
I've created a class Db for maintaining connection & query:
class Db
{
    private function connect()
    {
        $db = new mysqli($this->db_host, $this->db_user, $this->db_pass, $this->db_name, $this->db_port);
        if (mysqli_errno($db)) {
            file_put_contents(date('Y-m-d').'mysql_error.txt', mysqli_error($db), FILE_APPEND | LOCK_EX);
            echo "Connection error";
            exit();
        } else {
            return $db;
        }
    }

    public function insert($i_what, $i_columns, $i_values, $i_duplicate = '')
    {
        $insert = $this->connect()->query("INSERT INTO ".$i_what.$i_columns.$i_values.$i_duplicate);
        $last_id = $this->connect()->insert_id;
        $this->connect()->close();
        return $last_id;
    }
}
id int(11) AUTO_INCREMENT PRIMARY UNIQUE
name varchar(32) utf8_general_ci
firstname varchar(64) utf8_general_ci
lastname varchar(64) utf8_general_ci
And it doesn't work.
All you need is common sense.
The issue you described is an improbable one, so you have to check again.
There are hundreds of questions already where the original posters were 100% sure they did everything right. At first.
And in every one of them it turned out to be a silly mistake, like inserting into one database and selecting from another.
So, as I said: you open a completely NEW connection for every database call, so insert_id is read from a different connection than the one that ran the INSERT. No wonder a brand new, clean connection returns 0.
Instead of writing a class of your own, better use a ready-made one, like https://github.com/colshrapnel/safemysql
It would be WAY safer and more convenient. And it returns the insert id.
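A minimal sketch of the fix, keeping the same class shape as in the question (the db_host etc. properties are assumed to be set elsewhere): open the connection once, reuse it, and read insert_id from the connection that ran the INSERT.

```php
class Db
{
    private $db; // single shared connection

    private function connect()
    {
        // Reuse one connection instead of opening a new one per call.
        if ($this->db === null) {
            $this->db = new mysqli($this->db_host, $this->db_user, $this->db_pass, $this->db_name, $this->db_port);
        }
        return $this->db;
    }

    public function insert($i_what, $i_columns, $i_values, $i_duplicate = '')
    {
        $db = $this->connect();
        $db->query("INSERT INTO ".$i_what.$i_columns.$i_values.$i_duplicate);
        // insert_id now comes from the same connection that ran the INSERT.
        return $db->insert_id;
    }
}
```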
I have a sqlite3 database on my harddrive (file.db) with 5 tables.
I'd like to copy 3 of these tables to an in-memory database (:memory:).
Is there a simple way to do so using PHP5's PDO format?
A solution that is not PDO-specific, but may or may not be sufficient in your case:
create a :memory: database
Attach the existing database file
CREATE TABLE ... AS SELECT * FROM ...
Detach the database file
edit: an example
First an example database stored in mydb.sq3
<?php
$pdo = new PDO('sqlite:mydb.sq3');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE foo(x INTEGER PRIMARY KEY ASC, y, z)');
$stmt = $pdo->prepare("INSERT INTO foo (x,y,z) VALUES (:x,:y,:z)");
$stmt->bindParam(':x', $x);
$stmt->bindParam(':y', $y);
$stmt->bindParam(':z', $z);
for($x=0; $x<100; $x++) {
$y = $x*2;
$z = $x*2+1;
$stmt->execute();
}
Now we have a :memory: database and want to transfer the table foo
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('ATTACH "mydb.sq3" as filedb');
$pdo->exec('CREATE TABLE bar AS SELECT * FROM filedb.foo');
$pdo->exec('DETACH filedb');
Done. But let's take a look at the sqlite_master table
foreach($pdo->query('SELECT sql FROM sqlite_master') as $row) {
echo $row['sql'];
}
this prints
CREATE TABLE bar(x INT,y,z)
The INTEGER PRIMARY KEY ASC declaration is lost. Might be sufficient though....
If that's what you need to do, then VolkerK's answer is the one I'd give as well, but I feel I have to point out that you're going to read the contents of those tables into memory each time you run that code (every time that page loads?), so it might be better just to query the data files from disk.
Note that one could always use some kind of shared memory mechanism (e.g. APC, memcache, etc..) to keep sqlite's in-memory databases persistent across connections.
You can dump the database at the end of the connection, save it as an APC variable, and then load and run it again from APC at the beginning of the next execution.
Using the method outlined by VolkerK roughly doubled the performance of my code when using a ~150Mb sqlite database.
Loading the database into an in-memory sqlite db didn't require any other changes to my existing code.
My use case was batch processing data so I didn't have to deal with the problems Wez Furlong highlights.
Reading the data into memory was surprisingly fast. The whole 150Mb was loaded into memory from SSD in less than two seconds. It seemed too good to be true so I checked and rechecked the data.
Many thanks to VolkerK for a magic solution!
I also found that when I indexed the in-memory tables, my queries executed three times faster.
// adapting VolkerK's example...
// loop through tables from the local sqlite db
$tables = array('companies', 'directors', 'previous_names');
foreach ($tables as $table) {
    // load each table into memory
    $pdo->exec("CREATE TABLE $table AS SELECT * FROM filedb.$table");
    // index each table on the relevant columns
    $pdo->exec("CREATE INDEX IF NOT EXISTS `{$table}_company_number`
                ON $table (`company_number`);");
}
If you need a small database for tests, you may export your database to an SQL file and then execute it as a single query in PHP:
class EphemeralPDO extends \PDO {
    public function __construct() {
        parent::__construct('sqlite::memory:', null, null, [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
        $queries = file_get_contents(__DIR__ . '/database.export.sql');
        $this->exec($queries);
    }
}