HI (Helical Insight) is a Business Intelligence (BI) reporting tool that supports data visualizations of various kinds,
such as tables, bar charts, and line charts. It is simple and easy to use.

HI has a feature called “Adhoc data visualizer”, or simply “ADHOC”.
Adhoc makes data visualization remarkably easy: users need no knowledge of
SQL queries, JavaScript, or HTML, yet they can get the desired report in the desired format.
Users who are well versed in these technologies can, however, tweak the reports further
to make them more polished and attractive.

Most other reporting tools in use today require the user to be a trained report developer, or to have in-depth knowledge of the application, the data, and the different visualization methods.
To overcome this problem, ADHOC combines the robustness of existing tools with an easy-to-use layer on top.
This ease of use does not make the tool bulky or slow. On the contrary, ADHOC is fast at processing large amounts of data, robust, and highly secure.

Using ADHOC's built-in functionality, the data can be tested and potential problems identified easily and early on: the module does the work, and the user only needs to check the end results.
Its ability to present visualizations that match user requirements with very little input makes it a very easy tool to use.

The Adhoc module is completely UI based: the entire data visualization process can be done by drag and drop and by renaming columns (setting aliases).

The major benefit of using ADHOC is that users need not spend resources mastering the finer points of report development; with a basic understanding of the data and of SQL, they can create equally good, if not better, data visualizations.

Step one is specifying the data source: the user points the application at the location of the data on the server, which is used to create metadata for the report. The application then connects to the data source and pulls in all the tables automatically for metadata generation.

Step two is defining the metadata. Here the user defines joins between tables by simple drag and drop, specifying the joining columns. The application creates the joins from this information, so the user need not worry about them during report creation. Views can also be created here if the report requires them. Once all the joins are specified, the metadata is ready for use in reports.

Step three, the final step, is data visualization. Once the metadata is defined, report creation begins: the user searches for the field to add in the metadata toggle bar and drags it into the column area on screen.
The application builds the SQL query on its own and executes it to show the desired report.
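As an illustration, dragging in fields from a hypothetical sales table (the table and column names below are assumptions for this sketch, not taken from HI itself) might produce a query along these lines:

```sql
-- Hypothetical query ADHOC could generate for a drag-and-drop selection;
-- table and column names are illustrative only.
SELECT s.region      AS "Region",       -- alias set by renaming the column in the UI
       SUM(s.amount) AS "Total Sales"
FROM   sales s
GROUP BY s.region
ORDER BY s.region;
```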

A few more intrinsic properties and features of this module, along with an in-depth guide to using ADHOC efficiently, will be explained in the next blog post, “Guide to Adhoc”.

Jaspersoft Ad hoc reports: Quick performance measurement & optimization tips

Measuring Ad-hoc performance

1. Log on JasperReports server as system admin (superuser by default).

2. Navigate to Manage > Server Settings > Ad Hoc Cache.

The Ad Hoc Cache page appears, displaying all the datasets that are in the cache, sorted by age.


3. Ad Hoc performance can be quickly gauged by checking the Query and Fetch values:

  • Query (msec): the time from when the query was sent to the database until the first row was received. If this is slow, proper indexes should be applied in the database.
  • Fetch (msec): the time from when the first row was received until the last row was received. If this is slow, there may be a network bottleneck.

Performance tuning in Ad Hoc

Setting query limit

  1. Log on JasperReports server as system admin (superuser by default).
  2. Navigate to Manage > Server settings > Ad Hoc Settings.
  3. There are two parameters there that are worth testing for possible performance gains:

a)      Ad Hoc Filter List of Values Row Limit

  • The maximum number of items displayed in the Condition Editor when a user defines filters for an Ad Hoc report based on a Domain. If this limit is exceeded when users define filters, JasperReports Server displays a message to that effect.
  • Setting this to a lower value can improve performance.

b)      Ad Hoc Dataset Row Limit

  • The maximum number of rows that an Ad Hoc report can return. JasperServer truncates the data when the limit is reached.
  • Setting this to a lower number may improve performance, but your reports may not reflect the full data set.




(Hopefully this blog helps you resolve Ad Hoc performance issues more quickly.)


Date Calculation using Joda Time Library In iReport

Dear readers, this blog talks about date calculation using the Joda-Time library in iReport.

By using the Joda-Time library, we can give a default value expression for any date-type parameter.

Steps for using the Joda-Time library in iReport:

1. Download the Joda-Time library.

2. Put its jar file “joda-time.jar” on the iReport classpath: Tools > Options > Classpath.


3. In the main report section properties, set the import property to org.joda.time.DateTime (without double quotes).


4. With the above configuration, we can now do any type of date calculation.


5. Examples:

a. Start day of the current month:

Take two parameters, ‘now’ and ‘startday’.

For the ‘now’ parameter, set its class name to ‘java.util.Date’ and its default value expression to DateTime.now().dayOfMonth().withMinimumValue().withTimeAtStartOfDay().toDate()


b. Last day of the previous month:

Take two parameters, ‘now’ and ‘endday’.

For the ‘now’ parameter, set its class name to ‘java.util.Date’ and its default value expression to DateTime.now().minusMonths(1).dayOfMonth().withMaximumValue().toDate()

In a similar way, we can use any date calculation as a default value expression.
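The two expressions above can be verified outside iReport with a small standalone Java program, assuming joda-time.jar is on the classpath (the class name DateCalcDemo is made up for this sketch):

```java
import org.joda.time.DateTime;
import java.util.Date;

public class DateCalcDemo {
    public static void main(String[] args) {
        // a. First day of the current month, at 00:00
        DateTime startOfMonth = DateTime.now()
                .dayOfMonth().withMinimumValue()   // day of month set to 1
                .withTimeAtStartOfDay();           // time set to 00:00

        // b. Last day of the previous month
        DateTime endOfPrevMonth = DateTime.now()
                .minusMonths(1)                    // step back one month
                .dayOfMonth().withMaximumValue();  // day set to 28/29/30/31

        // iReport parameters of class java.util.Date need toDate()
        Date start = startOfMonth.toDate();
        Date end = endOfPrevMonth.toDate();

        System.out.println("Start of current month: " + start);
        System.out.println("End of previous month:  " + end);
    }
}
```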

Rupam Bhardwaj

Helical IT Solutions

Debug Mondrian Cubes

This blog will show the reader how to debug Mondrian cubes.

Question: how can we look at the queries that Mondrian generates while the user is navigating the OLAP cube? It is really useful to look at the Mondrian log files because they give us a lot of information about how the system is behaving. We can:
  • look at SQL statements and MDX queries,
  • get profiling information on the queries that are executed,
  • get other useful debugging information.

The following steps illustrate how to enable Mondrian debug logging by adding some properties to the Mondrian configuration file. After that, we will configure two new log4j appenders so that the desired log files are written properly to the file system.

Step 1: Enable the Mondrian debug log. Mondrian has a large set of configuration settings that can be modified. In our case, to enable Mondrian debug information, open the mondrian.properties file located in <bi-server_home>/pentaho-solutions/system/mondrian and add the line that enables debug output.

Example mondrian.properties file location:
D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\pentaho-solutions\system\mondrian
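The original post does not reproduce the line itself. One property commonly added here to make Mondrian print the SQL it generates in readable form is the following (an assumption; verify the name against the mondrian.properties template shipped with your Mondrian version):

```
# Assumed setting: emit the generated SQL in a formatted, readable form
mondrian.rolap.generate.formatted.sql=true
```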


Step 2: Update log4j configuration
At this point we modify the log4j configuration file, adding the required appenders and categories so that the logging information is written properly. Open the log4j.xml file located in <bi-server_home>/tomcat/webapps/pentaho/WEB-INF/classes. Based on what you want to log, add one or both of the following snippets to the file. They create two new RollingFileAppenders; you are free to use whatever kind of appender you prefer.

Example: Location of log4j.xml file

D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\tomcat\webapps\pentaho\WEB-INF\classes

Add the following code.

NOTE: The code is already present in the file, commented out; we just need to un-comment it.

<appender name="MDXLOG" class="org.apache.log4j.RollingFileAppender">
  <param name="File" value="../logs/mondrian_mdx.log"/>
  <param name="Append" value="false"/>
  <param name="MaxFileSize" value="500KB"/>
  <param name="MaxBackupIndex" value="1"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
  </layout>
</appender>

<category name="mondrian.mdx">
  <priority value="DEBUG"/>
  <appender-ref ref="MDXLOG"/>
</category>

<!-- ========================================================= -->
<!-- Special Log File specifically for Mondrian SQL Statements -->
<!-- ========================================================= -->

<appender name="SQLLOG" class="org.apache.log4j.RollingFileAppender">
  <param name="File" value="../logs/mondrian_sql.log"/>
  <param name="Append" value="false"/>
  <param name="MaxFileSize" value="500KB"/>
  <param name="MaxBackupIndex" value="1"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
  </layout>
</appender>

<category name="mondrian.sql">
  <priority value="DEBUG"/>
  <appender-ref ref="SQLLOG"/>
</category>


Step 3: Enable the new log settings. To enable the new settings, restart the Pentaho BI server instance.

Log files location:
D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\tomcat\logs

After restarting the server and running an OLAP cube, you will find the following two files in the above location:

i) mondrian_mdx.log
ii) mondrian_sql.log


Now, enjoy analysing the SQL queries generated while performing run actions in various tools such as Pentaho Analyzer and Saiku Analytics.

Get in touch with us at Helical IT Solutions