Tag Results

Use case: “Real-Time Bancassurance Data Exchange Program” – To adopt the ACORD standard to exchange policy-, commission- and premium-related information with its partner broker bank and insurance firm in near real time via XML SOAP messages, and to roll out a similar data exchange program across its other operating countries using a sustainable solution that can be deployed in any environment with ease. Goal Statement: To implement an industry-standard, robust, scalable, automated and near-real-time data exchange solution for commission reconciliation and payouts. Challenge: The […]

Teletrac Navman won the 2017 Pentaho Excellence Award for Internet of Things (IoT), a category that recognizes organizations for leveraging data from devices to improve customer engagement, enhance operations and drive new revenue opportunities through machine learning or predictive analytics. Fleet Management Platform Tracks Enormous Volume and Variety of Data With 25 years of telematics experience, Teletrac Navman provides advanced data tools for location tracking, fuel monitoring, reporting, safety and compliance. Its powerful fleet management platform is used by 40,000 organizations on six continents, […]

Use case: Mobile application with an offline-first approach. One of Ashnik’s customers – a top insurance company in Asia – was looking to develop a mobile application with online and offline capabilities. The needs: to enable field agents to access their customers’ insurance-related information for read and offline operations, and subsequently to sync all the insurance-related customer data captured by the field agents while offline back to the data centre when the device comes back online. The Goal of the […]
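The offline-first pattern the excerpt describes – buffer writes locally while disconnected, then replay them to the data centre on reconnect – can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the customer’s actual application; `OfflineQueue`, `record` and `sync` are hypothetical names.

```python
from dataclasses import dataclass, field


@dataclass
class OfflineQueue:
    """Minimal sketch of an offline-first write queue: operations made
    while disconnected are journaled locally and replayed on reconnect."""

    pending: list = field(default_factory=list)

    def record(self, op: dict) -> None:
        # While offline, every write is appended to the local journal
        # instead of being sent to the server.
        self.pending.append(op)

    def sync(self, send) -> bool:
        # On reconnect, replay the journal in order; any operation the
        # server rejects is kept for retry on the next sync.
        failed = [op for op in self.pending if not send(op)]
        self.pending = failed
        return not failed
```

A real implementation would persist the journal (e.g. in an on-device database) and handle conflict resolution, but the replay-on-reconnect loop is the core of the pattern.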

Today, our parent company Hitachi, a global leader across industries, infrastructure and technology, announced the formation of Hitachi Vantara, a company whose aim is to help organizations thrive in today’s uncertain and turbulent times and prepare for the future. This new company unifies the mission and operations of Pentaho, Hitachi Data Systems and Hitachi Insight Group into a single business as Hitachi Vantara. Together, we give business leaders an advantage to find and use the value in their data, innovate intelligently and […]

Computing is evolving. It’s likely that the waning of the forces described by Moore’s Law and Dennard scaling will mean that many of the ideas we’ve associated with technological advancement will change. It won’t be about faster, cheaper, and smaller computers in the year 2025; prediction, and where we apply it, will be the big story. Most readers will be old enough to remember what photography was like in 1995. We bought film, took photos and hoped for the best when we dropped […]

After living and breathing the world of analytics and data for 20 years, I can let you in on a little secret of how successful customers achieve business value: it does not come from the tool, but from the data. It’s about the data. It always has been. Maybe the reason many organizations don’t start with the data is, frankly, that data is the hardest part. The data is distributed, on-premises, in the cloud, messy, unwieldy, and it never stops pouring in. […]

The consensus among industry experts is that the 2016 shopping season in the run-up to Christmas will be another record-breaking period for cybersecurity breaches. For companies of any size, across all industries, cybercrime IS a problem – one that costs businesses billions of dollars each year. Additionally, governments are holding company executives accountable, with the potential to someday hold these individuals personally liable. Indeed, there is a lot at stake. Until recently, technology has struggled to stay one step ahead of the criminals, in […]

Pentaho’s strategy of BI and DI “better together” has been consistent since our early days, but this value proposition has never been stronger than it is today with the announcement of Pentaho 7.0. The world of data is too vast and fast to try to deliver analytic insights without solving data challenges first. Growing amounts of data – coming from both traditional and big data sources – have created even more complexity. Plus, increasing demand for self-service analytics from the business […]

A blueprint for big data success – What is the “Filling the Data Lake” blueprint? The blueprint for filling the data lake refers to a modern data onboarding process for ingesting big data into Hadoop data lakes that is flexible, scalable, and repeatable. It streamlines data ingestion from a wide variety of data sources and business users, reduces dependence on hard-coded data movement procedures, and simplifies regular data movement at scale into the data lake. The “Filling the Data Lake” blueprint provides developers […]
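The key idea in the excerpt – one repeatable, metadata-driven onboarding routine instead of a hard-coded job per source – can be sketched as below. This is an illustrative assumption of how such a pattern might look, not Pentaho’s actual blueprint; `ingest`, the config keys, and the landing-zone layout are all hypothetical, and a real lake would be HDFS or object storage rather than a local directory.

```python
import csv
import json
import os
from datetime import date


def ingest(source_cfg: dict, lake_root: str) -> str:
    """Land one delimited source file in the lake as newline-delimited
    JSON, partitioned by source name and ingestion date. The per-source
    details (path, delimiter) come from metadata, so the same routine
    serves every feed instead of one hand-coded job per source."""
    target_dir = os.path.join(
        lake_root, source_cfg["name"], date.today().isoformat()
    )
    os.makedirs(target_dir, exist_ok=True)
    out_path = os.path.join(target_dir, "part-0000.json")
    with open(source_cfg["path"], newline="") as src, open(out_path, "w") as out:
        reader = csv.DictReader(src, delimiter=source_cfg.get("delimiter", ","))
        for row in reader:
            # Raw values are kept as-is: the lake stores untransformed data.
            out.write(json.dumps(row) + "\n")
    return out_path
```

Onboarding a new feed then means adding one metadata entry (name, path, delimiter) rather than writing new movement code, which is what makes the process repeatable and scalable.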

It’s been over five years since Pentaho’s CTO, James Dixon, coined the now-ubiquitous term “data lake” in his blog. His metaphor contrasted bottled water – cleansed and packaged for easy consumption – with the natural state of the water source: unstructured, uncleansed, and unadulterated. The data lake represents the entire universe of available data before any transformation has been applied to it. Data isn’t given undue context in order to fit it into existing structures, which could potentially compromise its utility […]