Hi Cipherinfo,
When working with legacy databases it is fairly common to have to make changes before they will work in a new system. You cannot expect the new system to handle your database as-is; every database has its own quirks.
The idea of a procedure to add or modify the necessary fields is a good one - though this would not require any change to nuBuilder itself, just an additional procedure, and it is a task that is often required in some form when moving a database to a new platform. However, as
jlcsusara has pointed out, it is possible to change your field types so that they are compatible with nuBuilder while retaining the existing numeric values; any records subsequently created by nuBuilder would use its standard uniqid fields. As this is an open source system, users are encouraged to develop solutions and contribute them back to the community to be included in modules or future releases.
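To illustrate the field-type change, something like the following might do it (UNTESTED sketch - I am assuming the conservazione table and conservazione_id column from this thread, and a VARCHAR(25) length; adjust both to your actual schema):

```sql
-- UNTESTED: widen the key column so it can hold both the existing
-- numeric ids (now stored as text) and 13-character uniqid() values
ALTER TABLE conservazione
    MODIFY conservazione_id VARCHAR(25) NOT NULL;

-- Existing numeric values are kept, simply stored as strings;
-- new records created by nuBuilder will get uniqid-style values
```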
Regarding the code you provided, another option is to do a mass update in batches by building one large INSERT ... ON DUPLICATE KEY UPDATE statement (UNTESTED - don't run this on your live database!), which may (or may not) be quicker:
Code: Select all
//This assumes conservazione_id is indexed (e.g. a UNIQUE key) but is not yet your primary key
//Set up the fixed parts of the query
$insertSQL = "INSERT INTO conservazione (old_id, conservazione_id) VALUES ";
$duplicateSQL = " ON DUPLICATE KEY UPDATE conservazione_id = VALUES(conservazione_id)";
//If you are not familiar with ON DUPLICATE KEY UPDATE, it performs an update when a key duplicate is found (i.e. every row in this case)

$existingResultset = nuRunQuery("SELECT old_id FROM conservazione ORDER BY old_id");
$valuesSQL = '';
$i = 0;

while ($result = mysql_fetch_object($existingResultset)) {
    $newId = uniqid();
    if ($valuesSQL != '') {
        $valuesSQL .= ",";
    }
    $valuesSQL .= "('{$result->old_id}','$newId')"; //Append one row to the batch
    $i++;
    if ($i >= 1000) { //Batch size - may be more efficient higher or lower
        nuRunQuery($insertSQL . $valuesSQL . $duplicateSQL); //Run the accumulated batch
        $valuesSQL = ''; //Reset, so the next batch does not resend these rows
        $i = 0;
    }
}

//Finish off the last (partial) batch
if ($i != 0) {
    nuRunQuery($insertSQL . $valuesSQL . $duplicateSQL);
}
Another option is to create another table with two columns - old_id (autonumber) and new_id (VARCHAR(15)) - and populate it with mass inserts:
"INSERT INTO id_temp_table (new_id) VALUES ('uniqid1'),('uniqid2'),('uniqid3'),('uniqid4')...('uniqid1000')", run enough times to match the number of records in your table.
Then run "UPDATE conservazione INNER JOIN id_temp_table USING (old_id) SET conservazione_id = new_id;" (To do this in batches instead of all at once, you could add WHERE conservazione_id IS NULL and restrict the range of old_id - note that MySQL does not allow LIMIT on a multi-table UPDATE.)
Once again, this assumes you have appropriate indexes on your tables, and I have not tested it, so it may or may not be the most efficient.
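Putting that second approach together, the setup might look something like this (again UNTESTED - id_temp_table and the sample uniqid-style values are just illustrations, and it assumes old_id in conservazione starts at 1 and is contiguous so the autonumbers line up):

```sql
-- Temporary mapping table; old_id autonumbers from 1 to line up
-- with the existing old_id values in conservazione
CREATE TABLE id_temp_table (
    old_id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    new_id VARCHAR(15) NOT NULL
);

-- Repeat inserts like this (generated with uniqid() or similar)
-- until the row count matches conservazione
INSERT INTO id_temp_table (new_id) VALUES ('55f1a2b3c4d5e'),('55f1a2b3c4d60');

-- Copy the new ids across in one pass
UPDATE conservazione
    INNER JOIN id_temp_table USING (old_id)
    SET conservazione_id = new_id;

-- Clean up afterwards
DROP TABLE id_temp_table;
```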
Bear in mind, though, that when you are dealing with updates of millions of records, any system will take a while - that is why many reports and procedures are run in overnight batch jobs. But in a one-off procedure like this, the time is less critical.
Or, you can just do what jlcsusara suggested.
Cheers,
Christopher