SQL with Groovy

Groovy provides a much handier way to work with JDBC than plain Java, and playing with a database becomes very easy. To script SQL with Groovy we use the groovy.sql.Sql class from the groovy.sql package.

import groovy.sql.Sql

def sql = Sql.newInstance( 'jdbc:jtds:sqlserver://serveripName/databaseName-CLASS;domain=yourdomainname', 'dbusername', 'dbpassword', 'driverclass' )

sql.eachRow( 'select * from tableName' ) {
    println "${it.slno} -- ${it.userName} --"
}

The Sql.newInstance method creates the database connection. It takes the JDBC URL, the database username and password, and the driver class name as parameters.
The sql.eachRow method fires the SQL query and iterates over the result set as well.
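eachRow also accepts positional parameters, which keeps literal values out of the SQL string. A minimal sketch, assuming the same sql instance and a tableName table with slno and userName columns (placeholder names):

sql.eachRow( 'select * from tableName where slno > ?', [100] ) { row ->
    // row behaves like the implicit "it" above: columns are reachable by name
    println "${row.slno} -- ${row.userName}"
}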

Groovy also has a provision for improving the performance of transaction-based queries. We can use the withBatch method to batch statements and address the performance issue.

sql.eachRow("SELECT * FROM employee WHERE filter_criteria ", { employee ->
// Process employee and add update statement to batch
def updateStmts = processEmployee(employee)
stmt.addBatch(updateStmts)
})
})


For the sake of integrity we can wrap the withBatch call in a withTransaction block:

sql.withTransaction {
    def result = sql.withBatch { ... }
    ...
}
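A slightly fuller sketch of the combination, assuming an employee table and an employees collection already loaded in memory (both placeholder names); if any statement fails, withTransaction rolls the whole batch back:

sql.withTransaction {
    sql.withBatch( 20, 'insert into employee (employeeName, empAddress) values (?, ?)' ) { ps ->
        employees.each { e ->
            // each addBatch call queues one parameter set for the prepared statement
            ps.addBatch( e.name, e.address )
        }
    }
}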


When we know that the result set contains only one row, the firstRow method helps us retrieve just that row.

def row = sql.firstRow('select columnA, columnB from tableName')

def ans = sql.firstRow("select * from PERSON where location_id < $location")
println ans.firstname
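firstRow is also convenient for scalar results such as counts; a minimal sketch, again using the placeholder tableName table:

def total = sql.firstRow( 'select count(*) as cnt from tableName' ).cnt
println "number of rows: ${total}"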

Groovy insert statement with GString parameters (the interpolated values are turned into a parameterized statement):

sql.execute("insert into employee (employeName, empAddress) values (${employeeName}, ${empAddress})")

sql.close()

This method closes the underlying database connection and releases its resources.

These are only a few of the available methods. To know more, explore the groovy.sql package.

Batch-Insertion in Hibernate



Hibernate's batch processing can be enabled to upload a large number of records into your database efficiently.

If you are undertaking batch processing, you will need to enable the use of JDBC batching. This is absolutely essential if you want to achieve optimal performance. Set the JDBC batch size to a reasonable number (10-50, for example):

hibernate.jdbc.batch_size 20
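The property can be supplied in hibernate.properties as above, or set programmatically while building the configuration. A minimal Groovy sketch, assuming a hibernate.cfg.xml is on the classpath for configure() to pick up:

import org.hibernate.cfg.Configuration

def configuration = new Configuration().configure()
// same setting as the hibernate.properties line above
configuration.setProperty( 'hibernate.jdbc.batch_size', '20' )
def sessionFactory = configuration.buildSessionFactory()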

The code snippet looks like this:

Session session = sessionFactory.openSession();
Transaction txInstance = session.beginTransaction();

for ( int i = 0; i < 100000; i++ ) {
    Student student = new Student(.....);
    session.save(student);
    if ( i % 20 == 0 ) {
        // 20, same as the JDBC batch size: flush a batch of inserts and release memory
        session.flush();
        session.clear();
    }
}

txInstance.commit();
session.close();

When making new objects persistent, flush() and then clear() the session regularly in order to control the size of the first-level cache. By default, Hibernate caches all persisted objects in this session-level cache, so without the regular flush/clear your application would ultimately fall over with an OutOfMemoryException.

A JDBC batch can target one table only, so every new DML statement that targets a different table ends the current batch and starts a new one. Mixing statements for different tables is therefore undesirable when using SQL batch processing.
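If a unit of work does have to touch several tables, Hibernate can be asked to reorder its inserts and updates so that statements for the same table are grouped together and batching still applies. These are standard Hibernate settings, shown in the same properties format as the batch size above:

hibernate.order_inserts true
hibernate.order_updates true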

PUSHPRAJ KUMAR (BI Developer)