An analytical system must evolve continuously, delivering ever more value to its owners and users. There is plenty of evidence that smart decisions can only be achieved with the help of smart analytics. We are convinced that every minute a company delays its next data-driven step is a missed opportunity in the modern world!

Below is a concise overview of our vision for developing Enterprise Data Warehouse systems, together with a short list of successfully completed projects and the tools we have at our disposal.

Enterprise-level reporting

The first level, basic but evolving throughout the lifetime of the analytical system, is access to corporate information.

At this stage, users of the system gain the benefit of the basic properties of an Enterprise Data Warehouse:

  • Data is combined from various source systems;
  • Data has passed verification and cleansing phases;
  • Data remains accessible over a long period of time;
  • Resource-intensive analytical queries do not burden the enterprise Online Transaction Processing (OLTP) systems.

Here, getting an answer to the question “What’s going on?” becomes a routine process of creating new types of reports.

Level 1. Reporting

Business users access data through a wide range of interfaces:

  • Specialized Business Intelligence (BI) tools;
  • Scheduled reports in the form of extracts to CSV / Excel files, shared via mail, SharePoint services or FTP/SFTP;
  • An established process for requesting ad-hoc reports from the Enterprise Data Warehouse development team (Report Request Service);
  • Direct SQL access to the warehouse data model for power users.

Five main components are sufficient at this stage:

  • Relational Database Management System (RDBMS). The core component of the Enterprise Data Warehouse solution: it stores the data and provides the means of access to it;
  • Extract, Transform and Load (ETL) tool. A self-descriptive collection of software components without which hardly any successful solution can exist;
  • Workload Automation tool. A highly desirable component of a solid Data Warehouse solution that provides a reliable and convenient way to monitor and control the execution of the data-loading steps (a minimal sketch follows this list);
  • Business Intelligence (BI) tool. A tool designed to retrieve, analyze, transform and report on data;
  • Data Model. There are several well-established paradigms for building data warehouses: a thorough and comprehensive analysis should determine whether the approach of Inmon or Kimball, Data Vault or Anchor Modeling fits each particular case. A great advantage here is the presence of a core data layer in which the information is stored in normalized form, allowing any new projection of the data to be obtained flexibly and quickly.
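To make the Workload Automation component concrete, here is a minimal sketch of an Apache Airflow DAG (Airflow appears in the tool list below). The DAG name, schedule and script paths are hypothetical placeholders, not a definitive implementation:

```python
# Minimal Apache Airflow DAG sketch: extract -> load -> transform,
# scheduled nightly, with retries on failure. All paths are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "dwh_team",                     # hypothetical team name
    "retries": 2,                            # rerun a failed step twice
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_core_load",                # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",           # every night at 02:00
    catchup=False,
    default_args=default_args,
) as dag:
    extract = BashOperator(
        task_id="extract_sources",
        bash_command="python /opt/etl/extract_sources.py",  # placeholder
    )
    load = BashOperator(
        task_id="load_staging",
        bash_command="python /opt/etl/load_staging.py",     # placeholder
    )
    transform = BashOperator(
        task_id="transform_core",
        bash_command="python /opt/etl/transform_core.py",   # placeholder
    )

    extract >> load >> transform  # enforce the loading order
```

Such a DAG gives exactly the monitoring and control over loading steps described above: every run, task state and retry is visible in the scheduler’s interface.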

 

We provide a list of tools with which our team has extensive experience:

Relational Database Management System
  • Free (or conditionally free): MySQL, PostgreSQL, MariaDB, Greenplum
  • Paid: Teradata, Oracle, Vertica, Microsoft SQL Server, Netezza

Extract, Transform and Load (ETL) Tools
  • Free (or conditionally free): Pentaho Data Integration, Talend
  • Paid: Informatica PowerCenter, SSIS, Oracle Data Integrator, IBM InfoSphere DataStage

Workload Automation Tools
  • Free (or conditionally free): Apache Airflow, Teradata SLJM, JobScheduler
  • Paid: Automic Workload Automation (UC4), Control-M, Tidal Workload Automation, ActiveBatch

Business Intelligence (BI) Tools
  • Free (or conditionally free): BIRT, Knowage, RapidMiner, Microsoft Power BI Desktop, Tableau Public
  • Paid: Tableau, MicroStrategy Analytics, Qlik Sense, Microsoft Power BI, SAP BusinessObjects, IBM Cognos Analytics, Sisense, Looker

Just a short list of projects in which we were lucky enough to take part:

GDPR

A project to organize data operations in accordance with the requirements of the General Data Protection Regulation (GDPR) at a bank. Within the scope of this project, changes were made to the main data loading and processing flow in the Data Warehouse, making it possible to implement requirements such as pseudonymisation, the right to erasure and the right of access to clients’ personal data, none of which had been formalized in the solution’s original architecture.

Client 360°

End-to-end implementation of a Data Warehouse to cover the bank’s base reporting needs. The core of the solution was built on the Teradata RDBMS to provide a 360° view of individual clients. Within the scope of this project, the Data Warehouse core was populated with more than 500 attributes describing clients, their agreements and their activities, using a corporate data model based on the Financial Data Model (FDM) provided by Teradata Corp.

Migration

A template-based script generation tool was used in a project to migrate a critical data source of a large telecommunications operator. This significantly shortened development time and improved the reliability of the result by minimizing the risk of human error.

OLAP systems

An important stage in the growth of an analytical solution’s functionality is the implementation of specialized Online Analytical Processing (OLAP) tools. These tools make it possible to build reports over huge amounts of information (terabytes of raw data) using interfaces that are convenient and familiar to business users (such as PivotTables in Excel), without involving IT staff each time.

A powerful set of built-in functionality allows reports to be produced on the fly across different slices and periods, including complex calculated key performance indicators (KPIs). At every stage of report generation the underlying data sets remain retrievable, which keeps all work with the data fully transparent.
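To illustrate the kind of slicing an OLAP tool automates, here is a toy sketch using pandas; the column names and figures are hypothetical, and a real OLAP server performs the same dimensional aggregation over terabytes rather than an in-memory frame:

```python
# Toy OLAP-style slice: aggregate a fact table by two dimensions
# (region, month) and derive a calculated KPI on the fly.
import pandas as pd

# Hypothetical fact table; in an OLAP system this would be a cube.
calls = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "month":   ["2024-01", "2024-02", "2024-01", "2024-02"],
    "revenue": [120.0, 135.0, 80.0, 95.0],
    "minutes": [300, 310, 200, 240],
})

# Pivot: one cell per region/month pair, like a PivotTable in Excel.
cube = calls.pivot_table(index="region", columns="month",
                         values=["revenue", "minutes"], aggfunc="sum")

# Calculated KPI, derived on the fly: revenue per minute of traffic.
print(cube["revenue"] / cube["minutes"])
```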

Our experience includes the use of tools such as:

OLAP System
  • Free (or conditionally free): Druid, Mondrian OLAP Server
  • Paid: Microsoft Analysis Services, SAS OLAP Server, Jedox OLAP Server, Oracle Database OLAP Option

Just a short list of projects in which we were lucky enough to take part:

Call-Center Cube

A project to create an analytical cube that allowed a telecommunications operator’s call-center managers to analyze employee productivity faster and in greater depth, and to build a more equitable employee incentive scheme on top of it.

Call Data Record Cube

An analytical cube was created that allows the activity of a telecommunications operator’s clients to be analyzed across a vast list of dimensions with deep history, while the built-in functionality can still produce transaction-level reports revealing exactly which data underlies any particular piece of analysis.

 

MDM Solutions

At a certain stage in the growth of an analytical solution, it becomes necessary to create a golden record for the key entities of the enterprise: products, customers, points of sale, and so on. A Master Data Management (MDM) tool is the right tool to turn to in that case.

This mechanism provides a unified view of the underlying assets of the enterprise and avoids the duplication, contradictions and omissions in data that inevitably occur when more than one system of record exists in the enterprise landscape.
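As a simplified illustration of how a golden record is assembled (real MDM tools add fuzzy matching, survivorship rules and stewardship workflows; the field names here are hypothetical):

```python
# Simplified golden-record sketch: merge duplicate customer records from
# two source systems, letting the most recently updated non-empty value win.
from datetime import date

records = [  # hypothetical extracts describing the same customer
    {"customer_id": "C1", "email": "a@example.com", "phone": "",
     "updated": date(2024, 3, 1)},
    {"customer_id": "C1", "email": "", "phone": "+4712345678",
     "updated": date(2024, 5, 2)},
]

def golden_record(duplicates):
    """Field-level survivorship: the newest non-empty value wins."""
    merged = {}
    for rec in sorted(duplicates, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in ("", None):
                merged[field] = value  # newer values overwrite older ones
    return merged

print(golden_record(records))
# -> a single unified view: email from one system, phone from the other
```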

An important factor in implementing this level of solution is not only the choice of a suitable tool, but even more so the enterprise’s willingness to tackle data quality issues at the level of the whole organization, and to transform the way it works on the path towards a single centralized corporate data model and the related organizational initiatives.

Our field of experience:

MDM System
  • Free (or conditionally free): Talend Master Data Management
  • Paid: Informatica MDM, Oracle MDM, Ataccama One, IBM InfoSphere MDM

Just a short list of projects in which we were lucky enough to take part:

MDM Remedy

A project to optimize the performance of an MDM solution for a cost-containment services provider to the workers’ compensation industry in the United States. The original vendor’s solution lacked stability and performance in the production environment; our task was to ensure the reliability, transparency and sufficient performance of the entire implemented solution.

Informatica Multi Domain MDM

Design and delivery of an MDM system for a clothing and accessories retailer, based on the Informatica Multi Domain MDM tool. The solution was built into the enterprise’s overall IT landscape, ensuring a transparent and manageable way of governing the company’s core entities through a single mechanism.

Statistical models

At the next stage of the analytical system’s maturity, basic reports and answers to the question “What’s going on?” become insufficient, while the basic prerequisites for the transition to the next stage, getting answers to the question “Why is this happening?”, are already in place:

  • There is access to all critical source systems;
  • A large volume of approved new indicators has been calculated;
  • The enterprise has reached a consensus on what information the company possesses;
  • The information available for analysis is of sufficient quality;
  • The analytical system has earned a significant level of user confidence, and requests to it are becoming increasingly strategic.

At this stage, a transition to the mass creation of statistical models becomes possible. Using prepared data sets and specialized tools, business users gain access to a new layer of information: answers to the question “Why?” and the establishment of cause-and-effect relationships.
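For instance, a minimal regression-plus-ANOVA sketch in Python with statsmodels, on a hypothetical prepared data set (the column names and values are illustrative only):

```python
# Cause-and-effect exploration: fit a linear model and run an ANOVA
# to see which factors explain the variation in customer spend.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical prepared data set: one row per customer.
df = pd.DataFrame({
    "monthly_spend": [20.0, 35.0, 15.0, 50.0, 42.0, 28.0],
    "tenure_months": [3, 24, 1, 36, 30, 12],
    "segment": ["mass", "premium", "mass", "premium", "premium", "mass"],
})

# Linear model: do tenure and segment explain spend?
model = smf.ols("monthly_spend ~ tenure_months + C(segment)", data=df).fit()
print(model.summary())

# Type II ANOVA table: the contribution of each factor.
print(sm.stats.anova_lm(model, typ=2))
```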

We have already successfully implemented projects with the following tools and technologies:

Construction of statistical models
  • Free (or conditionally free): R, Python, KNIME
  • Paid: IBM SPSS, SAS, MATLAB

Just a short list of projects in which we were lucky enough to take part:

 

Clients’ Activity

A project to create a flat data model for each customer of a telecommunications operator, in which more than 300 pre-calculated Key Performance Indicators (KPIs) are presented alongside demographic information, allowing regression models to be built and Analysis of Variance (ANOVA) to be performed.

Factor Analysis

A Factor Analysis project for a provider of health insurance services. The method opened up many new ways to optimize the enterprise’s operations: from more flexible product configuration to an additional method of fighting fraudulent behavior.

 

Hadoop, Cloud, Building machine learning models

As the number of tasks and the depth of analysis grow, the basic tools for data storage and processing become insufficient:

  • Data from certain sources arrives in semi-structured form (logs of customer actions on the site, audio and video information, etc.). A better-suited model in this case is Schema-on-Read, which provides flexible interaction with the original data while allowing it to be stored without requirements on its structure (see the sketch after this list).
  • Data from traditional sources occupies significant volumes, which increases the cost of owning this information. It is more profitable to use a model in which all raw data is stored on relatively cheap media and only a selected part of the information reaches the traditional data warehouse.
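A minimal Schema-on-Read sketch using PySpark (Spark appears in the tool list below); the file path and field names are hypothetical:

```python
# Schema-on-read: raw JSON event logs sit as-is on cheap storage;
# structure is imposed only at query time.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("schema_on_read_demo").getOrCreate()

# Hypothetical path to raw clickstream logs in the data lake.
events = spark.read.json("hdfs:///lake/raw/clickstream/2024/*.json")

# The schema is inferred on read; only now do we pick out fields.
daily_clicks = (
    events
    .where(F.col("event_type") == "click")         # assumed field name
    .groupBy(F.to_date("timestamp").alias("day"))  # assumed field name
    .count()
)
daily_clicks.show()
```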

With such a diverse and large amount of information, it becomes natural to search for non-trivial patterns with advanced Data Science approaches such as Machine Learning. The range of answers to the questions “What’s going on?” and “Why is this happening?” is extended with the ability to answer the question “What can happen in the future?”
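As a minimal illustration of this predictive step, a sketch with scikit-learn (the features and labels are hypothetical):

```python
# Predictive-model sketch: train a classifier to estimate which
# customers are likely to churn in the coming month.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical features (tenure, monthly spend, support calls) and labels.
X = [[3, 20.0, 5], [24, 35.0, 0], [1, 15.0, 7],
     [36, 50.0, 1], [30, 42.0, 0], [12, 28.0, 4]]
y = [1, 0, 1, 0, 0, 1]  # 1 = churned

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Probability that each held-out customer churns in the future.
print(model.predict_proba(X_test)[:, 1])
```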

Most cloud providers offer full-featured platforms through which an enterprise can access both storage systems and an extensive set of advanced analytics methods and tools.

Tools and platforms that we have experience with:

Big Data
  • Free (or conditionally free): HDFS, HBase, Hive, Spark, ZooKeeper
  • Paid: Hortonworks, Cloudera, MapR

Cloud
  • Microsoft Azure, Amazon Web Services, Google Cloud Platform

Machine Learning
  • Free (or conditionally free): H2O.ai, KNIME, Spark, R, Python
  • Paid: Alteryx, SAS

Just a short list of projects in which we were lucky enough to take part:

Click Stream

Collection of activity information (including clickstream) from the website and mobile applications of a large bank for use in further stages of analytics, including modeling customer behavior across interaction channels and customer satisfaction with the site and mobile application.

Data Lake

Implementation of the Data Lake concept at a telecommunications company for storing a huge volume of Event Detail Records (EDRs). Part of the analytics was performed directly inside the Big Data systems, while the key pieces of data were extracted for further processing in the core data warehouse.

Migration to Google Cloud

A project to migrate a beverage producer’s analytical solution from on-premises infrastructure to the Google Cloud Platform.

Kafka, IoT, NoSQL, Search Engines: Active Data Warehouse

At the most advanced stage of its development, the analytical system is built into critical enterprise systems, where operational decisions are made automatically in real time. The models are self-learning, and staff involvement is reduced.

Real-time data collection and processing systems for highly loaded environments, together with integration systems, are particularly important at this stage. Fast navigation and other features of enterprise search engines become an integral part of the overall solution.
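As a minimal sketch of the streaming building block, here is a consumer loop with Apache Kafka via the kafka-python client (the topic name, brokers and message fields are hypothetical):

```python
# Real-time processing loop: consume events from a Kafka topic and
# react to each one as it arrives.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "network-events",                    # hypothetical topic
    bootstrap_servers=["broker1:9092"],  # hypothetical broker list
    group_id="ops-response",             # consumer group for scaling out
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # React immediately, e.g. flag an overloaded network element.
    if event.get("load_pct", 0) > 90:    # assumed field name
        print(f"ALERT: element {event.get('element_id')} is overloaded")
```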

The list of technologies we use in our projects:

Streaming Platforms
  • Free (or conditionally free): Apache Kafka, RabbitMQ
  • Paid: IBM WebSphere MQ, Microsoft Message Queuing (MSMQ)

Streaming Analytics
  • Free (or conditionally free): Apache Samza, Apache Storm
  • Paid: IBM InfoSphere Streams, Microsoft StreamInsight, Informatica Vibe Data Stream

NoSQL Databases
  • Free (or conditionally free): Apache Cassandra, MongoDB, OrientDB
  • Paid: Neo4j, ArangoDB, IBM Domino

Search Engines
  • Free (or conditionally free): Apache Solr, Elasticsearch

Just a short list of projects in which we were lucky enough to take part:

Campaign Management

Implementation of a marketing campaign management system at a telecommunications company, providing management and analysis of the results of customer interactions in near real time, with the ability to respond quickly to clients’ actions and movements.

Data Catalogue

Creation of a user-friendly interface for navigating the data and metadata inside a bank’s integrated Enterprise Data Warehouse solution, covering Big Data objects, RDBMS objects and exported reports.

Network Intelligence

Development of an operational response system for online events on a telecommunications company’s equipment, making it possible to anticipate breakdowns and load increases on network elements and react to them quickly.

Our advantages
  • The most affordable specialist rates in a near-shore region;
  • High-class specialists with extensive experience in building Enterprise Data Warehouses of any complexity;
  • Lower country risk compared with the unstable situation in other outsourcing-oriented near-shore countries (Ukraine and Russia);
  • State-level support for the IT business (special business preferences in this area, support for specialized education, simplified entry and stay regimes for foreign IT professionals) and the Belarusian IT industry’s specialization in outsourcing;
  • Almost complete overlap of working hours between Norway and Belarus (a difference of one hour in summer and two hours in winter).
Our offer

We suggest that you include our team in the list of companies with whom you are ready to cooperate, and we will do everything possible to prove our professionalism and focus on results not in words but in practice.

Our company is ready to provide the full range of technical solutions at every stage of your Corporate Data Warehouse’s development.

We look forward to fruitful cooperation with you!