Date Field Format Customisation in Adhoc View on Jasper UI


Often while working with Ad Hoc reports on Jasper we get only a limited set of available date formats. There are a number of ways to fix this; the one I am going to discuss here is adding date format options in the backend of JasperReports Server.

In this blog I will describe how to add different date formats to the Jasper UI itself. Here we are going to add a date format for “06/24/2016 11:15:25 a.m”.

1. First navigate to the path:
where JasperReports Server is installed.
2. Open the file
3. Edit the file content, which appears somewhat like this:


4. Add “ADH_100_MASK_timestamp_5 = MM/dd/yyyy hh:mm:ss a” at the required location. (Note: “hh” is the 12-hour form, which matches the “a” am/pm marker; “HH” would give a 24-hour clock.)
5. In a similar fashion we can customise the UI for any required date formats.
6. After making the changes, save the file and restart the server, as new changes are reflected only after a restart.
7. Once the server has restarted, click on a date field in the Ad Hoc view and you will notice the new date format option we just added in the backend file.
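For illustration, the edited block of mask entries might look like the sketch below. Only the `ADH_100_MASK_timestamp_5` line comes from this post; the neighbouring keys and their patterns are assumptions about how the file numbers its entries, so verify them against your own file:

```properties
# Existing entries (assumed naming pattern; check your actual file)
ADH_100_MASK_timestamp_3 = MMMM d, yyyy HH:mm
ADH_100_MASK_timestamp_4 = yyyy-MM-dd HH:mm:ss
# New entry for "06/24/2016 11:15:25 a.m" (12-hour clock plus am/pm marker)
ADH_100_MASK_timestamp_5 = MM/dd/yyyy hh:mm:ss a
```

The patterns follow Java's SimpleDateFormat letters, which is why `hh` (1–12) rather than `HH` (0–23) pairs with the `a` marker.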


Similarly, we can edit the formatting of integer and decimal fields. Adding format options this way gives a more generalised approach to handling data-formatting tasks in Jasper Ad Hoc.

Nisha Sahu

Pentaho Analysis: Workflow

OLAP Analysis: Workflow


To prepare data for use with the Pentaho/Jaspersoft OLAP Analysis client tools (and Reporting, to a certain extent), you should follow this basic workflow:

Design a Star or Snowflake Schema

The entire process starts with a data warehouse. The end result should be a data model in the star or snowflake schema pattern. You don’t have to worry too much about getting the model exactly right on your first try; just cover all of your anticipated business needs. Part of the process is coming back to the data warehouse design step and changing your initial data model after you’ve discovered what your operational needs are.

Populate the Star/Snowflake Schema

Once your data model is designed, the next step is to populate it with actual data, thereby creating your data warehouse. We can use Pentaho Data Integration or Talend for the ETL.

Build a Mondrian Schema

Now that your initial data warehouse project is complete, you must build a Mondrian schema to organize and describe it in terms that Pentaho/Jasper Analysis can understand. This is also accomplished through Pentaho Data Integration by using the Agile BI plugin. Just connect to your data warehouse and auto-populate your schema with the Modeler graphical interface. Alternatively, you can use Schema Workbench to create an analysis schema through a manual process.
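As a rough illustration of what Schema Workbench (or the Modeler) produces, a minimal Mondrian schema might look like the sketch below. The schema, table and column names (`SalesWarehouse`, `fact_sales`, `dim_date`, and so on) are assumptions for illustration, not part of this post:

```xml
<Schema name="SalesWarehouse">
  <Cube name="Sales">
    <!-- Fact table at the centre of the star schema -->
    <Table name="fact_sales"/>
    <Dimension name="Date" foreignKey="date_id">
      <Hierarchy hasAll="true" primaryKey="date_id">
        <!-- Dimension table joined to the fact table -->
        <Table name="dim_date"/>
        <Level name="Year" column="year" type="Numeric"/>
        <Level name="Quarter" column="quarter"/>
        <Level name="Month" column="month" type="Numeric"/>
      </Hierarchy>
    </Dimension>
    <Measure name="Sales Amount" column="amount" aggregator="sum"
             formatString="#,###.00"/>
  </Cube>
</Schema>
```

The point of the file is exactly what the paragraph above says: it maps the relational star schema onto the cubes, dimensions, hierarchies and measures that the analysis tools understand.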

Initial Testing

At this point you should have a multi-dimensional data structure with an appropriate metadata layer. You can now start using Pentaho Analyzer, Jasper Ad-hoc and JPivot to drill down into your data and see if your first attempt at data modeling was successful. In all likelihood, it will need some adjustment, so take note of all of the schema limitations that you’re unhappy with during this initial testing phase.

Do not be concerned with performance issues at this time – just concentrate on the completeness and comprehensiveness of the data model.

Adjust and Repeat Until Satisfied

Use the notes you took during the testing phase to redesign your data warehouse and Mondrian schema appropriately. Adjust hierarchies and relational measure aggregation methods. Create virtual cubes for analyzing multiple fact tables by conforming dimensions. Re-test the new implementation and continue to refine the data model until it matches your business needs perfectly.

Test for Performance

Once you’re satisfied with the design and implementation of your data model, you should try to find performance problems and address them by tuning your data warehouse database, and by creating aggregation tables. The testing can only be reasonably done by hand, using Pentaho Analyzer, Jaspersoft Ad-hoc or JPivot. Take note of all of the measures that take an unreasonably long time to calculate. Also, enable SQL logging and locate slow-performing queries, and build indexes for optimizing query performance.
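For the SQL-logging step, one common route is the server's log4j configuration. This is a hedged sketch assuming a log4j.properties-style file and the `mondrian.sql` logger name that Mondrian conventionally uses for its generated SQL; check your server's actual logging setup before relying on it:

```properties
# Log every SQL statement Mondrian generates, so slow queries can be spotted
log4j.logger.mondrian.sql=DEBUG
```

With this enabled, the generated queries (and their timings) appear in the server log, which is where you would look for candidates to index or aggregate.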

Create Aggregation Tables

Using your notes as a guide, create aggregation tables with Pentaho Aggregation Designer or Talend to store frequently computed analysis views. Re-test and create new aggregation tables as necessary.

If you are working with a relatively small data warehouse or a limited number of dimensions, you may not have a real need for aggregation tables. However, be aware of the possibility that performance issues may come up in the future.

Check in with your users occasionally to see if they have any particular concerns about the speed of their BI content.

Deploy to Production

Your data warehouse and Mondrian schema have been created, tested, and refined. You’re now ready to put it all into production. OLAP reports help users do multi-dimensional and cross-functional analysis: slice and dice, drill up and drill through the data. This puts power in the hands of business users to generate ad hoc reports, get better insight into their business, and make better decisions based on OLAP analysis.

Differences between Reporting and Analysis – Concepts



1. Reporting is the process of organizing data into informational summaries in order to monitor how different areas of a business are performing. Analysis is the process of exploring data and reports in order to extract meaningful insights, which can be used to better understand and improve business performance.

2. Reporting translates raw data into information. Analysis transforms data and information into insights.

3. Good reporting should raise questions about the business from its end users. The goal of analysis is to answer questions by interpreting the data at a deeper level and providing actionable recommendations.

4. Reporting shows you what is happening. Analysis focuses on explaining why it is happening and what you can do about it.

5. Reporting involves building, configuring, consolidating, organizing, formatting, and summarizing. Analysis focuses on different tasks such as questioning, examining, interpreting, comparing, and confirming.

6. Reporting follows a push approach: reports are pushed to users, who are then expected to extract meaningful insights and take appropriate actions for themselves (i.e., self-serve). A typical reporting deliverable is the dashboard: custom-made reports that combine different KPIs and reports to provide a comprehensive, high-level view of business performance for specific audiences. Dashboards may include data from various data sources and are usually fairly static.

7. Analysis follows a pull approach: particular data is pulled by an analyst in order to answer specific business questions. Typical analysis deliverables are ad hoc responses and analysis presentations.

(Figure: Approach of Reporting and Analysis)


Extract year, quarter, month & day from a date input control in Pentaho CDE using JavaScript – MDX Query Scenario

Hello guys…!!

Sometimes you need to extract the parts (year, month, day) of a date for some specific use.
For example:
Assume you are creating a report with an MDX query which has a dimension called “Date” with the levels “Year”, “Quarter”, “Month” & “Day”.

(Note: Assume your schema has
Year: yyyy; Quarter: 1, 2, 3 or 4; Month: 1, 2, 3, … 12; Day: 1, 2, 3, … 31)

Also assume you do not have a direct date dimension in your schema (i.e., you do not have a dimension that takes a ‘yyyy-MMM-dd’ column).

But you need to display a date (yyyy-QQ-MMM-dd) or (yyyy-MMM-dd) on the X-axis of a chart. Remember, you do not have any direct date in your schema, but you do have “Date” with year, quarter, month & day as levels.

From your start_date (or end_date) input control you can extract the individual parts using the following JavaScript in CDE, and then use them in the date-range part of your MDX query.

This should be done in the “Pre Execution” section of the chart component:

function extract_function() {

    var quarter = ['Q1', 'Q2', 'Q3', 'Q4'];
    var month = ['JAN', 'FEB', 'MAR', 'APR', 'MAY', 'JUN', 'JUL', 'AUG', 'SEP', 'OCT', 'NOV', 'DEC'];

    // Start date: split the dashboard parameter into its parts
    var tmp_date = new Date(param_start_date);
    param_start_year = tmp_date.getFullYear();
    param_start_quarter = quarter[Math.floor((tmp_date.getMonth() + 3) / 3) - 1];
    param_start_month = month[tmp_date.getMonth()];
    param_start_day = tmp_date.getDate();

    // End date: same extraction
    tmp_date = new Date(param_end_date);
    param_end_year = tmp_date.getFullYear();
    param_end_quarter = quarter[Math.floor((tmp_date.getMonth() + 3) / 3) - 1];
    param_end_month = month[tmp_date.getMonth()];
    param_end_day = tmp_date.getDate();
}


* The quarter and month variables are arrays holding the display values.
* You calculate the month (or quarter) number and use it as an index into the array:
when you calculate months,
1 becomes JAN, 2 becomes FEB, and so on;
when you calculate quarters,
1 becomes Q1, 2 becomes Q2, 3 becomes Q3 & 4 becomes Q4.
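Outside CDE, the same extraction logic can be sketched as a self-contained function; the name `extractDateParts` is illustrative and not part of the dashboard code:

```javascript
// Standalone sketch of the extraction above: date string in, named parts out.
function extractDateParts(dateString) {
  var quarters = ['Q1', 'Q2', 'Q3', 'Q4'];
  var months = ['JAN', 'FEB', 'MAR', 'APR', 'MAY', 'JUN',
                'JUL', 'AUG', 'SEP', 'OCT', 'NOV', 'DEC'];
  var d = new Date(dateString);
  return {
    year: d.getFullYear(),
    quarter: quarters[Math.floor(d.getMonth() / 3)], // months 0-2 -> Q1, 3-5 -> Q2, ...
    month: months[d.getMonth()],                     // getMonth() is zero-based
    day: d.getDate()
  };
}
```

`Math.floor(d.getMonth() / 3)` is equivalent to the `Math.floor((getMonth() + 3) / 3) - 1` form used in the dashboard function; both map a zero-based month index to its quarter.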

Forget about your problems …!!!! and Meet us @


BI developer


Pentaho BI Server Community 4.8 installation in an existing Tomcat with PostgreSQL on Linux/Ubuntu

This post teaches you how to install Pentaho BI Server Community Edition 4.8 in an existing Tomcat with PostgreSQL on Linux/Ubuntu.

I’ve gone through many posts but could not find all the stuff in a single place, so I worked it out and am sharing the experience here.
If you find any difficulty in the steps below, feel free to drop a mail @ for help.

Prerequisites:
1. Pentaho BI Server CE 4.8.0 stable
2. Tomcat 6 server
3. PostgreSQL

1) Download biserver-stable-4.8.0 using the following command in some folder.
Syntax:
wget URLOfTheDownloadLocation
wget  /

2) After the download completes, unzip it using the unzip command.
Syntax:
unzip zipFileName
After unzipping you will find two folders: i) administration-console & ii) biserver-ce

3) Install the Tomcat server externally (archive-based installation) in your favourite location.
I’m using the JasperServer Tomcat to install the Pentaho server.

4) Executing the .sql files in PostgreSQL
* You need to create two databases: i) hibernate & ii) quartz.
* Reason: you are going to install the Pentaho BI server with PostgreSQL (not with the HSQL database that ships with the download), hence you need to create these two databases for the Pentaho server to work properly.
* Where can you find the .sql script files?
Check in the location:

(biserver-ce is the folder you unzipped in step 2)

Script file names:

Command to run the .sql files from PuTTY:

root@sadakar-server:/opt/jasperreports-server-cp-5.0.0/postgresql/bin# ./psql -U postgres -p 5432 -a -f /home/sadakar/softwares/pentaho/biserver-ce/data/postgresql/

In a similar way execute the remaining script files; you just need to change the file name in the above command.

Important points to NOTE when you run the script files:
* You need to go to the “bin” folder of your PostgreSQL installation and run the above command.
* In my case I’m using the PostgreSQL that was installed with JasperServer.
* In the above command, -U is the user name and -p is the port number of PostgreSQL.
* You must specify -a -f in the command, otherwise the script will not run.
* When you run the script it will ask you for the PostgreSQL password: give the password as “password”.
If you use some other password for postgres, give that password.
* When you run the script it will ask for the database user names:
open the script files in your favourite editor and find this line:

CREATE USER pentaho_user PASSWORD ‘password’;

This means that for the quartz database the password is “password”; the same applies to the other script files while executing them.

* Once you execute all the script files, check in PostgreSQL whether the “hibernate” and “quartz” databases were created or not.
* If you do not find the databases, you might have done something wrong somewhere; cross-check the steps again.
* You should find 12 tables in the “quartz” database and 1 table in the “hibernate” database.

Hmmm… you are not actually done with the databases, because you do not have all the tables in the “hibernate” database yet; the script files do not create all the tables & data.

I’ll give you the links here to run the scripting files to get the tables.
At present do not think of it. Find this in following steps…!!

5. Changes in the context.xml file of the Tomcat server
* You need to add the following code to the context.xml file.
* Location of the file: tomcat/conf/context.xml
* In my case the location is:

<Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="hibuser" password="password"
driverClassName="org.postgresql.Driver" url="jdbc:postgresql://localhost:5432/hibernate"
validationQuery="select 1" />

<Resource name="jdbc/Quartz" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="pentaho_user" password="password"
driverClassName="org.postgresql.Driver" url="jdbc:postgresql://localhost:5432/quartz"
validationQuery="select 1"/>
</Context>

(Both Resource elements go inside the file’s existing <Context> element; the closing </Context> shown above is the tag that already ends the file.)
6. Adding the PostgreSQL driver to the lib folder of Tomcat
* You need to copy the PostgreSQL JDBC driver into the lib folder of Tomcat.
* The location of the lib folder of Tomcat is: tomcat/lib
* In my example it is at
* You can directly download the PostgreSQL driver using the following command, or copy and paste it into the lib folder if you are already using it somewhere else on your machine.
* The command is:
7. Changes needed inside the pentaho-solutions folder

* This is quite an interesting part to work on.
* Before you modify the pentaho-solutions folder, you need to copy it next to the Tomcat installation location (you can keep this folder anywhere you want).
* For example: I have copied this folder from the bi-server folder to the same location where Tomcat is installed (from step 2 of this article),
i.e., at root@sadakar-server:/opt/jasperreports-server-cp-5.0.0# ls

apache-ant     common  installation.log   license.txt       properties.ini           scripts    uninstall.dat
apache-tomcat  java    pentaho-solutions  releaseNotes.txt  Third-Party-Notices.pdf
buildomatic    docs    licenses           postgresql        samples                  uninstall

* You need to configure the settings for PostgreSQL in the applicationContext-spring-security-jdbc.xml file.
* The location of this file is: pentaho-solutions/system/applicationContext-spring-security-jdbc.xml

<bean id="dataSource"
class="org.apache.commons.dbcp.BasicDataSource">
<property name="driverClassName" value="org.postgresql.Driver" />
<property name="url"
value="jdbc:postgresql://localhost:5432/hibernate" />
<property name="username" value="hibuser" />
<property name="password" value="password" />
</bean>

* Next, you need to configure the settings in:
* The location of this file is: pentaho-solutions/system/


8. Changes needed in the hibernate folder
Navigate to the “hibernate” folder from the “system” folder of the same “pentaho-solutions” folder.
* You’ll find different .xml files for different databases.
* You need to touch
i) hibernate-settings.xml and
ii) postgresql.hibernate.cfg.xml files, i.e., you need to make some modifications in these two files.
Changes in:
i) hibernate-settings.xml file
Comment out the config-file line that points to the HSQL configuration

and enable the line that points to the PostgreSQL configuration.

ii) postgresql.hibernate.cfg.xml

* You need not make any modifications to this file, but you do need to keep an eye on it:
if your PostgreSQL port number is different from 5432, give yours, and give the appropriate host if you use a host other than localhost.

9. Changes in the context.xml file of the META-INF folder of Tomcat
* You need to modify the “context.xml” file located in the tomcat/webapps/pentaho/META-INF folder.
* In my example it is located at


<Context path="/pentaho" docBase="webapps/pentaho/">
<Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="hibuser" password="password"
driverClassName="org.postgresql.Driver" url="jdbc:postgresql://localhost:5432/hibernate"
validationQuery="select 1" />

<Resource name="jdbc/Quartz" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="pentaho_user" password="password"
driverClassName="org.postgresql.Driver" url="jdbc:postgresql://localhost:5432/quartz"
validationQuery="select 1"/>
</Context>

(The file ships with an HSQL-specific validationQuery against INFORMATION_SCHEMA.SYSTEM_SEQUENCES; since the data sources now point at PostgreSQL, use "select 1" instead. Also note the full "jdbc:postgresql://" URL prefix and the docBase attribute spelling.)

NOTE: We deployed the “pentaho” and “pentaho-style” folders in the webapps folder of the Tomcat server.

10. Changes in the web.xml file of the WEB-INF folder of Tomcat
You need to modify web.xml in the WEB-INF folder of the Tomcat server, i.e., tomcat/webapps/pentaho/WEB-INF/web.xml
* In my example the location of the file is:


NOTE: give the path of the “pentaho-solutions” folder between the <param-value> and </param-value> tags.

* You also need to check the port number & URL for the Pentaho server in the same web.xml file.

NOTE: if you use some port number for Tomcat other than 8080, you must specify that port number there.

11. Tomcat server shutdown & startup
* Go to the bin folder of the Tomcat server and shut the server down if it is already running.
* Start the Tomcat server.
* Commands:
Shutdown: ./
Startup:   ./

12. Open the Pentaho server URL in a browser
* Go to the URL in any browser (Mozilla Firefox is preferable as it has the Firebug facility to track errors, if you get any).

Meet us if you have a business @

Sadakar(BI developer)