Tag Results

Use case: “Real Time Bancassurance data exchange program” – To incorporate the ACORD standard to exchange policy-, commission- and premium-related information with its partner broker bank and insurance firm in near real time via XML SOAP messages, and to roll out a similar data exchange program across different operating countries using a sustainable solution that can be deployed in any environment with ease. Goal Statement: To implement an industry-standard, robust, scalable, automated and near real-time data exchange solution for commission reconciliation and payouts. Challenge: The […]
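As a rough illustration of the kind of exchange described above, the sketch below posts an XML SOAP message to a partner endpoint. The endpoint URL, SOAPAction value and payload are placeholders for illustration only – a real implementation would carry the agreed ACORD message types rather than this made-up body.

```python
# Minimal sketch of posting an XML SOAP message to a partner endpoint.
# The URL, SOAPAction value and the payload are placeholders for illustration;
# a real ACORD exchange would use the agreed ACORD message schema instead.
import requests

ENDPOINT = "https://partner-bank.example.com/policy-exchange"   # hypothetical endpoint

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <!-- Placeholder body; a real payload follows the agreed ACORD message type -->
    <PolicyUpdate policyId="P-12345" premium="1000.00" commission="75.00"/>
  </soapenv:Body>
</soapenv:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "urn:policy-update"},   # placeholder action name
    timeout=30,
)
response.raise_for_status()
print(response.status_code)
```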

Use case: Mobile Application with an offline-first approach – One of Ashnik’s customers, a top insurance company in Asia, was looking to develop a mobile application with online and offline capabilities. The needs were: to enable field agents to access their customers’ insurance-related information for read and offline operations; and, subsequently, to sync all the insurance-related customer data captured by the field agents while offline back to the data centre when the device goes online later. The Goal of the […]
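As a rough illustration of the offline-first pattern described above, this sketch queues records captured in a local store while the device is offline and replays them to the data centre once connectivity returns. The SQLite queue, endpoint URL and record fields are hypothetical; in practice a mobile database with built-in replication usually handles retries and conflict resolution far better than a hand-rolled queue.

```python
# Minimal sketch of the offline-first capture-and-sync pattern.
# The SQLite outbox, endpoint URL and record fields are hypothetical; production
# apps usually rely on a mobile database with built-in sync/replication instead.
import json
import sqlite3
import requests

SYNC_ENDPOINT = "https://datacentre.example.com/api/customer-updates"  # hypothetical

db = sqlite3.connect("field_agent.db")
db.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

def capture_offline(record: dict) -> None:
    """Store a record locally while the device has no connectivity."""
    db.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(record),))
    db.commit()

def sync_when_online() -> None:
    """Replay every queued record to the data centre, then clear the queue."""
    rows = db.execute("SELECT id, payload FROM outbox").fetchall()
    for row_id, payload in rows:
        resp = requests.post(SYNC_ENDPOINT, json=json.loads(payload), timeout=10)
        resp.raise_for_status()                      # keep the record if the push fails
        db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
    db.commit()

capture_offline({"customer_id": "C-001", "note": "policy renewal discussed"})
# later, when the device detects connectivity:
# sync_when_online()
```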

Open source is the future, they say. In more ways than one, that holds absolutely true. It has been well accepted across industries for ages, but a lot of enterprises took time to open up to open source. Today, however, there is an overwhelming response from banking, insurance and retail enterprises embracing open source tools and technologies to solve their business problems. Very recently, we delivered two projects for one of Singapore’s top insurance companies. We were able to provide the […]

A blueprint for big data success – What is the “Filling the Data Lake” blueprint? The blueprint for filling the data lake refers to a modern data onboarding process for ingesting big data into Hadoop data lakes that is flexible, scalable, and repeatable. It streamlines data ingestion from a wide variety of data sources and business users, reduces dependence on hard-coded data movement procedures, and simplifies regular data movement at scale into the data lake. The “Filling the Data Lake” blueprint provides developers […]
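To make the idea of a repeatable, metadata-driven onboarding process concrete, here is a minimal Python sketch that drives ingestion from a source catalog instead of one hard-coded job per source. The catalog entries, paths and partitioning scheme are hypothetical, and in the blueprint itself this role is typically played by Pentaho Data Integration rather than a script.

```python
# Minimal sketch of a metadata-driven "fill the data lake" ingestion loop.
# Source names, directories and the landing-zone layout are hypothetical.
import subprocess
from datetime import date

# One catalog entry per source system instead of one hard-coded job per source.
SOURCE_CATALOG = [
    {"name": "policy_extracts",  "local_dir": "/staging/policy"},
    {"name": "premium_extracts", "local_dir": "/staging/premium"},
]

def ingest(entry: dict) -> None:
    """Land one source's files in a date-partitioned area of the Hadoop data lake."""
    target = f"/datalake/raw/{entry['name']}/ingest_date={date.today():%Y-%m-%d}"
    subprocess.run(["hdfs", "dfs", "-mkdir", "-p", target], check=True)
    subprocess.run(["hdfs", "dfs", "-put", "-f", entry["local_dir"], target], check=True)

if __name__ == "__main__":
    for entry in SOURCE_CATALOG:
        ingest(entry)   # the same repeatable procedure applies to every new source
```

Adding a new source then means adding one catalog entry, not writing another bespoke data movement job.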

Maybe it’s time to look at creating a Pentaho Data Service. In recent years, many enterprise customers have been inclined to build self-service analytics, where specific business users have on-demand access to query the data. This not only helps enhance IT productivity, but also empowers the business users to perform quick analysis. Many organizations that need to blend and visualize large data sets find it challenging to build a data warehouse. This is where building Pentaho Data […]
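For a sense of what on-demand access to the data can look like, the sketch below queries a Pentaho Data Service from Python through the PDI thin JDBC driver via jaydebeapi. The driver class name, JDBC URL format, port, credentials, jar path and the virtual table name blended_sales are assumptions based on typical PDI setups – check the documentation for your Pentaho version before relying on them.

```python
# Minimal sketch of querying a Pentaho Data Service over its thin JDBC interface.
# Driver class, URL format, port, credentials, jar path and table name are
# assumptions; verify them against your PDI version's documentation.
import jaydebeapi

conn = jaydebeapi.connect(
    "org.pentaho.di.trans.dataservice.jdbc.ThinDriver",            # assumed class name
    "jdbc:pdi://di-server.example.com:9080/kettle?webappname=pentaho-di",  # assumed URL
    ["admin", "password"],                                          # placeholder credentials
    "/opt/pentaho/pdi-dataservice-client.jar",                      # assumed jar location
)
try:
    cur = conn.cursor()
    # The data service exposes a transformation as a virtual table that plain SQL can query.
    cur.execute("SELECT region, SUM(premium) FROM blended_sales GROUP BY region")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```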

Date: 12th April 2016 | 2:00 – 2:30 PM IST | 4:30 – 5:00 PM SGT. Register for the Hangout here and add it to your calendar. Catch our Google Hangout to see and learn how to build Data Integration and transformations using Pentaho. It helps you transform your data and allows you to easily access, prepare, analyze and immediately derive value from both traditional and big data sources. This hangout session will showcase the Data Integration capabilities of Pentaho, which help in building data transformations, through two demonstrations: How to […]
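If you want to experiment with a transformation outside the hangout demos, a saved PDI transformation (.ktr) can also be run from the command line with the Pan tool; the sketch below simply wraps that call from Python. The installation path and transformation file are placeholders for your own environment.

```python
# Minimal sketch of running a saved PDI transformation (.ktr) with the Pan tool.
# The install path and .ktr file are placeholders for your own environment.
import subprocess

PAN = "/opt/pentaho/data-integration/pan.sh"                  # placeholder install path
TRANSFORMATION = "/home/etl/transformations/load_sales.ktr"   # placeholder .ktr file

result = subprocess.run(
    [PAN, f"-file={TRANSFORMATION}", "-level=Basic"],  # -level controls log verbosity
    capture_output=True,
    text=True,
)
print(result.stdout)
result.check_returncode()   # a non-zero exit code means the transformation failed
```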

It’s been over five years since Pentaho’s CTO, James Dixon, coined the now-ubiquitous term “data lake” in his blog. His metaphor contrasted bottled water – cleansed and packaged for easy consumption – with the natural state of the water source: unstructured, uncleansed, and unadulterated. The data lake represents the entire universe of available data before any transformation has been applied to it. Data isn’t given undue context in order to fit it into existing structures, which could potentially compromise its utility […]

Last year our team was quite busy assisting our customers with varied technical assignments. Engagements ranged from consulting on highly scalable web-centric applications, to designing Big Data solutions, to implementing highly available database and web infrastructure. During these engagements we saw that some database trends were quite dominant, viz. the adoption of Big Data, the need to store unstructured data, scalable web-facing applications, etc. The engagements and discussions we had last year left us pondering about where it goes […]