Data Warehouse / Data Solution

We deliver end-to-end data solutions, from data integration to business insights and reporting.

A data warehouse collects all relevant data from several sources (e.g. business systems, public data, log data, Excel spreadsheets, web pages, and APIs) into one large database. Information is loaded and processed automatically at regular intervals and transformed into a structure that unlocks the value of the data for complex reporting, decision support, data analysis, artificial intelligence solutions, analytical CRM, and many other applications.
The key features of a data warehouse are the following:

  • Data is integrated and consolidated from several sources
  • The system keeps historical data
  • Data is transformed so that it can be used effectively
  • It reduces the load on transactional systems by serving reporting and analysis from its own database, which stores the data in a query-optimized structure
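
To make the regular load-and-transform cycle described above concrete, here is a minimal sketch in Python. The source names, table layout, and warehouse connection are hypothetical placeholders rather than a reference to any specific system; a production pipeline would typically run under a scheduler or orchestrator.

    import sqlite3
    from datetime import datetime, timezone

    # A SQLite file stands in for the warehouse database in this sketch.
    warehouse = sqlite3.connect("warehouse.db")
    warehouse.execute("""
        CREATE TABLE IF NOT EXISTS sales_fact (
            order_id    TEXT,
            customer_id TEXT,
            amount_eur  REAL,
            source      TEXT,   -- which system the row came from
            loaded_at   TEXT    -- load timestamp: the warehouse keeps history
        )
    """)

    def extract(source_name):
        """Stand-in for reading from a business system, API, or spreadsheet."""
        return [("ORD-1", "CUST-9", "120.50"), ("ORD-2", "CUST-3", "89.00")]

    def transform(rows, source_name):
        """Cast types and attach lineage metadata before loading."""
        now = datetime.now(timezone.utc).isoformat()
        return [(order_id, customer_id, float(amount), source_name, now)
                for order_id, customer_id, amount in rows]

    def load(rows):
        warehouse.executemany("INSERT INTO sales_fact VALUES (?, ?, ?, ?, ?)", rows)
        warehouse.commit()

    # One scheduled run: integrate several sources into the common structure.
    for source in ("erp", "webshop"):   # hypothetical source systems
        load(transform(extract(source), source))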

At Vortess, we have comprehensive experience in designing, implementing, and operating high-quality data warehouses for large organizations, handling tens of terabytes of data.
There are also related alternatives to a complex data warehouse, some simpler in scope: the data mart, data lake, data mesh, and data fabric. We are happy to recommend a design and deliver the architecture that is optimal for your organization.

Data Visualization and Dashboards

We let you see the invisible in your data.
Data visualization is the best way to use data for decision making and to gain business insights. Even when there are many metrics and dimensions, a well-designed dashboard can charm your colleagues, guide their attention, tell the stories behind your data, and reveal hidden correlations and patterns.
We design and develop visualizations of your information that give quick answers to your business questions. A dashboard can be an ad-hoc tool, or it can be designed for long-term use, refreshed regularly to support the everyday decision-making process. In some cases, a predefined alerting system embedded in the visualization tool can be very useful: when a metric reaches a threshold or a predefined event happens, a notification appears on the dashboard or a push message is delivered to the stakeholders.
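
As a rough illustration of such embedded alerting, the short Python sketch below checks fresh metric values against predefined thresholds. The metric names, thresholds, and notification channel are hypothetical placeholders.

    from dataclasses import dataclass

    @dataclass
    class AlertRule:
        metric: str        # name of the dashboard metric being watched
        threshold: float   # value that triggers the alert
        message: str       # text shown on the dashboard or pushed to stakeholders

    def check_alerts(latest_values, rules):
        """Compare the freshest metric values against the predefined rules."""
        fired = []
        for rule in rules:
            value = latest_values.get(rule.metric)
            if value is not None and value >= rule.threshold:
                fired.append(f"{rule.message} (current value: {value})")
        return fired

    # Hypothetical example: alert when the daily error rate crosses 5%.
    rules = [AlertRule("daily_error_rate", 0.05, "Error rate above 5%")]
    for notification in check_alerts({"daily_error_rate": 0.07}, rules):
        print(notification)   # in practice: dashboard banner or push message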

Data Strategy

We create strategies for data, analytics, business intelligence, and data warehousing.
Having a data strategy is key to a successful data-driven business. The strategy must be future-proof and innovative, but also tailored to the local corporate landscape (enterprise architecture, the specifics of the local business and organization, etc.). At the same time, it should be comprehensive, covering all the key aspects:

  • Collecting and organizing data: e.g. data architecture (business and technical architecture), data management, data governance, master data management, operations/maintenance, security, quality, etc.
  • Using data: e.g. business goals to serve, reporting, visualization, analysis, predictive analytics, decision making based on data, embedded data applications, etc.

We can support and drive the following steps of creating a new strategy:

  • Maturity assessment:
    When we set up a new data strategy, we first need to understand the current situation: the main problems, the obstacles to evolution, and the strengths and weaknesses. This is the starting point for successful strategy creation.
  • Data vision & data strategy creation:
    The vision describes the ideal future state. The strategy provides the methods and details for the main aspects of handling data. The strategy must be approved and accepted by all affected stakeholders: it is not just a document for the archives, but a set of clear guidelines that drive everyday work and decisions at all levels of the organization.
  • Roadmap:
    The roadmap is the plan for achieving the goals described in the vision and strategy. It breaks the work down into actions, projects, and tasks, each with an owner and a deadline. A roadmap is a living document and must be updated regularly to reflect the actual situation.

Data Management

Data governance made easy. We are pleased to help with the following data and analytics topics through our consulting services:

  • KPI selection and specification
    KPIs (Key Performance Indicators) are the main metrics that drive business performance. It is critical to select and define the KPIs that best serve the business strategy. We then create and implement a methodology for calculating, refreshing, and publishing them. KPIs should drive business decisions and processes, and they can also form the basis of incentive systems.
  • Business Glossary
    A collection of terms, dimensions, and KPIs with their business definitions, calculation methods, and metadata. A business glossary is critical for a successful BI and analytics practice. We recommend establishing a framework and a set of processes to maintain the glossary. We offer a complete glossary methodology customized to local requirements and circumstances, and we also support its implementation.
  • Business Layer / Semantic Layer
    An analytical layer on top of the raw data that implements the calculated business terms (KPIs, dimensions, business data structures). All reports, analytics, data marts, visualizations, etc. can use the same calculated data without implementing the calculations redundantly. This approach underpins the “single version of truth” and saves considerable time and effort through reusable calculations (see the sketch after this list).
  • BI and Analytics tool deployment
    We offer vendor-independent consultancy for the successful deployment of a new BI and analytics tool. This service may include selecting the best-fitting tool, running a pilot project to test its capabilities, and the design and implementation of the deployment.
  • Data Governance
    Data governance requires a complete framework to achieve data quality goals and to implement a successful data strategy. The framework includes processes, roles, responsibilities, standards, communications, and tools. Our consultants are ready to help design a framework that fits the organization and its data strategy goals, and we can support the implementation, refinement, and establishment of all data governance related efforts.
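
To make the semantic layer idea concrete, here is a minimal Python sketch of the principle: each business calculation is defined once, in one shared place, and every report or dashboard reuses that definition instead of re-implementing it. The metric names and formulas are hypothetical examples.

    # semantic_layer.py -- the single place where business calculations live,
    # so "net revenue" or "conversion rate" means the same thing everywhere
    # ("single version of truth").

    def net_revenue(gross_amount: float, discounts: float, refunds: float) -> float:
        """Business definition of net revenue, as agreed in the business glossary."""
        return gross_amount - discounts - refunds

    def conversion_rate(orders: int, visits: int) -> float:
        """Orders divided by visits; defined once, reused by every dashboard."""
        return orders / visits if visits else 0.0

    # A report and a dashboard both call the same definitions:
    print(net_revenue(10_000.0, 500.0, 250.0))   # -> 9250.0
    print(conversion_rate(42, 1_000))            # -> 0.042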

Data Migration

Data migration is often an underrated part of a system replacement project. It is the “dirty job” that must be done: nobody wants to work with an empty box!
When done properly, data migration requires all the useful information to be extracted from the legacy system, transformed, selected, and then loaded into the new system. It is essential to know both the old and the new system from business and technical perspectives, with a strong focus on the data model itself.
The migration methodology we use can handle various challenges. Our toolset (called “Rapid Loader”) supports a range of architectural designs, including data migration to and from on-premise and cloud environments. It can deal with complex cases, for example when information from four, five, or more legacy systems must be consolidated into one new solution.
Key points of our methodology include:

  • Using a separate migration database where we can analyze and transform data. This approach gives complete freedom for analysis and transformation. The database can be on-premise or cloud-based, depending on the requirements.
  • Capturing data quality issues at several points during the migration process. When faced with discrepancies, several options are available:
    • Find a transformation rule to eliminate the issue
    • Go back to the users, production, or business teams to correct the data manually
    • Skip the invalid data
  • Using our “Rapid Loader” tool to load the data once it is ready. Rapid Loader is a framework we created to streamline the data load process, and it can be optimized for the target system. It supports metadata capture, package handling, status tracking, and error logging; the progress and results of the migration can be displayed on a dashboard. The framework also handles incremental migration when we need to track changes in the legacy data and load the transformed data incrementally into the new system.
  • Implementing an automatic tester workflow that reads data back from the new system after loading and compares it to the original (a simplified sketch follows this list).
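
As an illustration of that read-back comparison, the Python sketch below reconciles records read back from the new system against the transformed source set. The record shapes and key field are hypothetical, and a real tester workflow would also compare aggregates such as row counts and sums.

    def reconcile(source_records, target_records, key="id"):
        """Compare data read back from the target system with the source.

        Returns the keys of records missing from the target and of records
        whose field values differ, so they can be investigated or re-migrated.
        """
        target_by_key = {rec[key]: rec for rec in target_records}
        missing, mismatched = [], []
        for rec in source_records:
            loaded = target_by_key.get(rec[key])
            if loaded is None:
                missing.append(rec[key])
            elif loaded != rec:
                mismatched.append(rec[key])
        return missing, mismatched

    # Hypothetical check after a load batch:
    source = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
    target = [{"id": 1, "name": "Alice"}]        # id 2 failed to load
    print(reconcile(source, target))             # -> ([2], [])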