[Maria-developers] Feature request: Add support for a syntax check for SQL scripts
Hi,

we have a lot of developers writing a lot of SQL scripts against a lot of different databases. These scripts usually consist of mixed DDL (e.g. changing a column definition) and DML (e.g. adding new master data). Since our applications are constantly under development, we'd like to add these scripts to our continuous integration environment.

For that it would be great to have a feature that checks a script against a given database without actually executing it. It should not only check the SQL syntax but also whether all mentioned tables, columns etc. exist in the given database. It is not possible to use transactions for that, since DDL is (currently) not transactional.

I'm thinking of something like this:

SET DISABLE_EXECUTION="TRUE";
-- here come the SQL statements
SET DISABLE_EXECUTION="FALSE";

I've tried to write such a check with an external tool, but as it turned out this is virtually impossible and has a lot of loopholes. Do you think it would be possible to implement something like this in the server?

Thanks,
Raphael
This is similar to a request we have had from our users from time to time: an
option to *parse* SQL statement(s) *server side* without actually executing them.
(EXPLAIN and EXPLAIN EXTENDED are incomplete solutions.)

I think it is difficult. MySQL was never designed for it. Basically I think
that all SQL would need to go into a 'virtual blackhole', with errors and
warnings returned as if physical storage had been used.

Some clients do some degree of *client side* parsing (or at least
validation), most notably Workbench. Obviously Oracle has no license issues
with porting the YACC/bison parser code from the MySQL server to a client of
their own. But there are also issues with WB and subtle differences in how
SQL is handled between different server versions. A client-side solution
will always suffer from this, I think. The solution is ideally *server side*,
IMHO.

A solution that allows filtering SQL through the parser, executing it in a
'blackhole environment' and returning errors and warnings exactly as a
'non-blackhole environment' would, would be extremely nice. But I also
think it is extremely difficult and would require a completely revamped
server architecture (which may also break compatibility with vanilla
MySQL).

Peter
(Webyog)
Raphael:

You are looking at validating your scripts for your nightly build (continuous integration). I believe you can get the QA you need by executing the scripts on a test database instead of working on your live data.

If you have a lot of stored procedures or scripts, just point them at a test database for continuous integration and then, when you feel ready for the nightly release, point them at the production DB. If you need to, copy your production data over to your test DB so that you can run the test script.

I actually dump all my system DDL/DML into a text file, refactor the SQL using text find/replace, and then reload the SQL back into the server. Then I run my test scripts against a test database before sending them to the beta server, so that users can look at the result before release. This happens almost every night.

Is that helpful?

Marco, in Perez Zeledon
On Fri, Oct 14, 2011 at 00:25, md@rpzdesign.com wrote:
I believe you can get the QA you need by executing the scripts on a test database instead of working on your live data.
Not quite, IMHO, because an erroneous script may destroy the usability of the test database. It would then have to be reloaded, which could take hours (if it is a copy of the live database). On the other hand, a 'virtualization of SQL effects' would require huge amounts of memory and/or huge numbers of temporary tables (which may also take hours to build!), plus lots of changes in the server.

The current and practical solution is to use a staging server, transactions (with or without savepoints), synchronization systems, file system snapshots (for a fast return to the last non-error state), etc. But a *visionary approach* should be welcomed! And even with a 'virtualization of SQL effects', a staging server should still be used.
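For the DML parts, the transaction/savepoint approach mentioned above can be sketched roughly like this on a staging copy; the savepoint name, table and values are invented for illustration, and any DDL statement in the script would still commit implicitly, which is exactly the limitation Raphael describes:

START TRANSACTION;
SAVEPOINT before_master_data;  -- hypothetical savepoint name
INSERT INTO countries (code, name) VALUES ('XX', 'Testland');  -- made-up master data table
-- inspect the result here, then undo everything:
ROLLBACK TO SAVEPOINT before_master_data;
ROLLBACK;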
On Thu, 13 Oct 2011 23:20:24 +0200, Raphael Vullriede wrote:
It should not only check the SQL syntax but also whether all mentioned tables, columns etc. exist in the given database.
This (of course) won't always work: think of an ALTER TABLE ... ADD COLUMN followed by a statement that mentions that column. As an alternative, maybe start up a local server and load data into it to test the scripts on.

-- Stewart Smith
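To make Stewart's caveat concrete, here is a hypothetical two-statement script (the table and column names are invented): a dry-run validator that only checks names against the current catalogue would reject the second statement, because the column it references only exists once the first statement has really been executed.

ALTER TABLE customers ADD COLUMN loyalty_points INT NOT NULL DEFAULT 0;  -- made-up table/column
UPDATE customers SET loyalty_points = 100 WHERE vip = 1;  -- only valid after the ALTER has actually run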
Hi,
Quoting Stewart Smith:
This (of course) won't always work: think of an ALTER TABLE ... ADD COLUMN followed by a statement that mentions that column.

Maybe I was not clear enough here. It should perform all the checks that would also be done if you actually tried to execute the script. For an ADD COLUMN this check would of course be that the column does _not_ exist yet.

As an alternative, maybe start up a local server and load data into it to test the scripts on.

That's another approach we're working on. But for some databases the check has to be done against a live DB (which is not easily clonable due to its size).

Cheers,
Raphael
Hi, Raphael!

On Oct 13, Raphael Vullriede wrote:
I'm thinking of something like this:
SET DISABLE_EXECUTION="TRUE";
-- here come the SQL statements
SET DISABLE_EXECUTION="FALSE";
You will get a similar effect (not exactly the same, but close) by wrapping your SQL statements in

CREATE PROCEDURE ... BEGIN
END
DROP PROCEDURE

Your SQL will be parsed and the server will check all mentioned tables, columns, etc., but nothing will be executed.
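A minimal sketch of that trick, assuming the script is fed through the mysql command-line client (hence the DELIMITER lines); the procedure name and the statements inside it are invented purely for illustration:

DELIMITER //
CREATE PROCEDURE ci_dry_run_check()  -- hypothetical name, used only for the check
BEGIN
  -- the script to validate goes here; these statements are made up:
  ALTER TABLE orders ADD COLUMN delivery_note VARCHAR(255);
  INSERT INTO order_status (id, label) VALUES (42, 'shipped');
END//
DELIMITER ;
-- the procedure is never CALLed, so the body is only parsed, not executed; clean up:
DROP PROCEDURE ci_dry_run_check;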
I've tried to write such a check with an external tool but as it turned out this is virtually impossible and has a lot of loopholes.
An external tool is possible too. Look at the Perl modules DBIx::MyParse and DBIx::MyParsePP.

Regards,
Sergei
Hi, Sergei,
Thanks for the hint, I didn't know that! I'll try it, sorry for the noise.

Thanks,
Raphael
participants (5)
- md@rpzdesign.com
- Peter Laursen
- Raphael Vullriede
- Sergei Golubchik
- Stewart Smith