Personal Summary

Senior Data Architect and Data Migration Expert with 25+ years of experience designing and delivering enterprise data solutions across telecom, financial services, public sector, energy, pharmaceutical, and manufacturing environments. Combines deep expertise in data architecture, data migration, DWH/BI, data modelling, ETL/ELT, and data integration with hands-on experience across both traditional database platforms and modern cloud technologies, including Snowflake and GCP BigQuery. Proven track record in translating complex business requirements and legacy data landscapes into scalable, governed, and implementation-ready data solutions. Brings strong strategic thinking, leadership capability, and end-to-end delivery experience, from concept and detailed design through execution, coordination, and successful implementation of data-driven initiatives.

AREAS OF EXPERTISE

  • DQM / Data Management
  • ETL/ELT/EtLT
  • CRM, ERP
  • Billing/Accounting Systems
  • iGaming
  • RDBMS: Oracle, MSSQL, PostgreSQL, MySQL
  • GCP BigQuery, Hadoop, Kafka
  • SQL/PLSQL/TSQL
  • Snowflake
  • Data Analysis
  • Data architecture
  • Business Analysis
  • Error Handling
  • Defect tracking
  • Data Mapping
  • Data modelling
  • Master Data Management
  • Metadata Management
  • Database design
  • Project Management
  • Agile methods
  • Testing
  • Reporting
  • BI/DWH
  • Data Lake / Data Lakehouse
  • Data migration
  • Data mesh
  • Big data
  • Data Governance

Work experience

Stratis Ltd.
01/2026 – present

Senior Data Architect, Data Migration Expert as a contractor

Working in parallel on two strategic public sector data initiatives: a data migration programme supporting the rollout of a new administrative system for a transport regulatory authority, and a large-scale public sector data management modernisation programme.

VIHAR core system implementation - Data Migration

  • Took over the data migration workstream in an ongoing programme supporting the introduction of a new core administrative system after nearly one year of prior project execution.
  • Performed a rapid situation assessment and built on existing project results to define a more effective and controlled migration approach.
  • Developed the missing data migration strategy, execution process, and control framework, including testing and audit steps.
  • Redesigned the migration database architecture and initiated the first development activities for a selected real-source data domain using MSSQL, T-SQL, and Python.
  • Led a 3-member migration database development team and provided both professional and day-to-day delivery coordination.
  • Supported business analysts and domain experts in target-system architecture and data modelling decisions affecting migration readiness and implementation.

Hungarian State Treasury - Pension Systems Modernisation

  • Led the data management domain in the planning phase of the Hungarian State Treasury’s pension systems modernisation programme, a high-volume, high-complexity, multi-year transformation initiative.
  • Shaped the concept and planning foundations of the migration platform, including the migration database, Data Repository, and data cleansing approach.
  • Defined key principles, methods, and governance expectations for long-term data management and migration execution.
  • Coordinated and supported cross-functional experts working on data migration, data governance, and related planning activities.
SAMEGRID FZCO
11/2024 – 01/2026

Senior Data Architect as a contractor

I worked as a Senior Data Architect & Data Analyst Engineer at Samegrid. In this innovative iGaming startup, I led and developed a next-generation cloud-based DWH solution for the iGaming area.

Technologies used: Apache Flink, Snowflake, AWS S3, dbt.

Key responsibilities:

  • Designed and implemented a greenfield Snowflake-based cloud DWH architecture for an iGaming data platform aimed at delivering near real-time betting and financial insights through standardised BI solutions.
  • Owned the structural and operational design of the DWH layer, covering Staging, Transformation, Core (SCD1/SCD2), Analytics/Data Mart, and semantic layers.
  • Built the first working MVP independently, then supported the scaling of the delivery model as the team expanded to 5 specialists in a matrix setup.
  • Worked closely with data engineers, analysts, BI developers, and data scientists to translate business and analytical requirements into a scalable and consumable cloud DWH solution.
  • Designed the platform to support 2 initial source systems, later extended to 5, covering 3 countries and 4–5 brands during the development phase.
  • Contributed to a near real-time processing model in which data ingested via Apache Flink and staged through AWS S3 became available in BI within approximately 10 minutes.
  • Defined a layered, performance-aware Snowflake architecture using Streams/Tasks and Dynamic Tables, while gradually replacing the initial dbt-based approach, which added limited value.
  • Applied star-schema-based design principles with denormalised modelling patterns optimised for Snowflake’s columnar architecture and analytical workload characteristics.
  • Established core metadata and mapping foundations and supported lineage and observability using Snowflake Data Lineage and OpenMetadata.
  • Delivered 3 MVP versions; the latest reproduced the same KPIs, reports, and dashboard results as an existing customer DWH/BI solution with better performance, demonstrating that the platform was viable for replacing legacy solutions.
TenzorTech Kft, Sagemcom
06/2024 – 03/2025

Senior Data Architect

I worked as a Senior Data Architect on the NAV eReceipt (eNyugta) project, focused on designing and implementing a robust Data Management strategy.

Technologies used: Kafka, Redis, DELL ECS, ELK Stack (Elasticsearch, Logstash, and Kibana).

Key responsibilities:

  • Worked as part of a multi-architect team on the NAV eNyugta / eReceipt programme, supporting the data architecture of a national real-time receipt reporting and electronic receipt service.
  • Contributed to the design of operational data flows, data handling rules, and validation concepts for receipt and report ingestion, with a strong focus on quality assurance, risk identification, and implementation readiness.
  • Reviewed specifications and architectural documents, provided QA findings, and highlighted delivery risks, control points, and design pitfalls across the planned data handling processes.
  • Supported the design of a real-time, event-driven solution involving Kafka for message intake and communication logging, Redis for caching, Oracle for a separate DWH layer, DELL ECS for object storage and archiving, and ELK for operational monitoring.
  • Participated in defining the principles for validation, logging, control checkpoints, storage, and processing rules in a highly controlled public-sector environment.
  • Worked in a delivery setup involving approximately 4–5 architects and 10–15 engineers/developers, primarily in a QA and review-oriented architecture role.
  • Contributed during the phase in which the team evaluated the results of the first developer test cycles.
Nortal
10/2022 – 07/2024

Data Modeller, Data Architect as a contractor

Projects: NEOS projects in NEOM

Technologies, tools: Apache Kafka/Spark, MySQL, Iceberg, PostgreSQL, Python, DBeaver, Power BI, Apache Superset, Snowflake, DBSchema, dbt, SAP

Key responsibilities:

  • Joined Nortal’s delivery team as a contractor on the NEOM / NEOS project, contributing to the design of a data mesh-based framework for multiple sectors, including tourism, education, HR, construction, waste management, and management-level aggregated data.
  • Designed logical and physical data models for the data products required by the data mesh approach, translating sector-specific business needs into implementable analytical structures.
  • Performed key data architecture tasks, including source system and source data analysis, data flow design, and the definition of core architectural elements supporting the target solution.
  • Worked closely with data engineers, data analysts, and BI specialists, supporting implementation planning and the translation of modelling concepts into practical delivery steps.
  • Contributed to BI enablement by supporting reporting needs and developing initial Power BI dashboards for selected use cases.
  • Produced test datasets and supported production data anonymisation to enable development, testing, and design validation activities.
  • For the Tourism domain, where source systems were not yet available, proposed logical and physical data models and generated representative test data to validate the planned solution approach.
  • Prepared, published, and presented modelling and design outputs to project stakeholders.
Ceva-Phylaxia Vaccine Producer Plc.
09/2022 – 12/2022

Data Modeller, Data Architect as a contractor

Project: Quality finished product release project

Technologies, tools: GCP - BigQuery, Talend

Key responsibilities:

  • Worked on a regulated pharma / vaccine production data integration project, loading manufacturing, testing, and inventory process data into GCP BigQuery for finished product release and process analytics.
  • Designed the BigQuery Staging and Core model for 6 source systems and around 20 core tables.
  • Adapted existing and developed new Talend pipelines to support data loading into BigQuery and data flow between the staging and core layers.
  • Combined architecture and engineering responsibilities across physical design, implementation, and documentation in a 3-person matrix team.
  • Applied denormalised BigQuery structures and date-based partitioning, and supported data harmonisation through data dictionaries for analytical aggregation.
  • Delivered a Core MVP and analytical SQL/database views consumed by the BI team.
EISDEV Consulting LTD
03/2020 – 09/2022

Data Architect, Data Modeller as a contractor

Projects: Vodafone Hungary: Legacy DWH, ODS, Tactical reporting, Nucleus GCP DWH

Worked as part of a broader architecture and delivery team on parallel enterprise data architecture assignments for Vodafone Hungary, supporting cloud DWH transformation, tactical reporting continuity, and related data platform initiatives.

Technologies, tools: DMaaP, CDAP, Oracle, ODI, Hadoop, Hive, Yarn, GCP, BigQuery, Snowflake, QlikView

Key responsibilities:

Nucleus – Vodafone Global / Google cloud DWH programme

  • Contributed to the Hungarian implementation stream of Nucleus, a Vodafone Global and Google initiative to establish a modern cloud DWH solution in GCP BigQuery.
  • Worked as one of several Data Architects in a collaborative architecture setup involving 5–7 parallel Data Architects, alongside Infrastructure, Solution, and Enterprise Architects across the wider programme.
  • Contributed to the design of source and target data models and the related mapping specifications required to migrate selected legacy DWH capabilities into the new cloud environment.
  • Analysed end-to-end legacy DWH data flows from loading to the semantic layer, including complex billing and network usage data paths, to understand transformations, aggregations, and existing data issues.
  • Supported design, build, and test activities, including validation of successful data loading, data volumes, and target-state consistency in BigQuery.
  • Performed reverse engineering and quality review activities on existing DWH processes to assess correctness, identify loading issues, and improve migration readiness.
  • Worked in an international delivery setup involving enterprise architecture support from Portugal, technology collaboration with Google, and business/project coordination from the UK.

Tactical Reporting – interim GCP reporting solution

  • Contributed to the design and implementation of the Loading and Transformation layers of a tactical reporting solution built in GCP to support Hungarian business needs until the global Nucleus platform became available.
  • Planned and implemented data loading, data cleansing steps, including address standardisation, and business-driven aggregations for reporting use cases as part of the wider project team.
  • Supported data anonymisation, BI solution design, and the integration of data flows between Tactical Reporting and the Nucleus target environment.
  • Helped deliver an interim solution supporting billing, finance, usage, and service-related analytics, which is still in use for Hungarian BI and dashboard reporting.
  • Contributed to a focused, business-oriented architecture that differed from the broader, standards-driven Nucleus model by prioritising speed, relevance, and local reporting needs.

Connected parallel initiatives – Netcracker, Legacy DWH, ODS

  • Supported the Netcracker billing programme from a data architecture perspective, performing quality review activities and representing Tactical Reporting and Nucleus data requirements within the broader delivery context.
  • Contributed to data model design for an ODS initiative, aligning it with the needs and dependencies of other parallel data-driven projects.
  • Worked across a highly complex telecom data landscape involving approximately 10,000 legacy DWH tables, around 200 Tactical Reporting tables, and 1,000+ Nucleus tables.
  • Contributed in a context where daily usage data volumes reached 700–800 million rows, requiring careful analysis of source flows, aggregations, and target structures.
BCA Hungary LTD
2019 – 2020

DQM, ETL Expert as a contractor

Projects: Hungarian Nuclear Plant - BI

Worked as part of a delivery team in a highly regulated nuclear energy environment, supporting BI modernisation, data governance, and migration-related initiatives.

Technologies: Oracle, PL/SQL, OBIEE

Key responsibilities:

  • Contributed to the replacement of legacy Oracle Forms-based BI solutions with Oracle OBIEE, supporting the technology and content upgrade of the reporting environment for operational use.
  • Acted as the professional lead on the BI workstream, designing and implementing the OBIEE semantic model/repository layer and supporting the transition to a modernised reporting architecture.
  • Planned and developed data loading, PL/SQL-based transformation, data correction, and aggregation logic required for the BI solution.
  • Worked in a fragmented source landscape with multiple siloed systems, contributing to the standardisation of product/asset/material master data in line with the plant’s strict rule-based operating environment.
  • Contributed as a senior expert to a Data Governance initiative aimed at redesigning enterprise data management principles and operating practices beyond isolated local solutions.
  • Supported a parallel migration project related to the plant’s new control and management system by helping design data loading, transformation, and preparation processes.

  • Delivered key outputs, including the BI data model, master data foundations, and successful test migrations, while contributing to effective team-based delivery and loading processes as project priorities shifted towards migration.

DQM Mentor Ltd.
2011 – 2018

DQM, DB and ETL/Data Migration Expert, Data Analyst, Business Analyst, PM, DBA

Worked across multiple enterprise data quality, data migration, and system implementation projects, translating complex business requirements into practical data solutions and supporting delivery from analysis through implementation.

Professional Position:

  • Senior DQM, ETL/Data Migration, BI, DB, Data Analysis and Error Handling Expert, SQL/PLSQL Developer and PM (2011-2018)

Main tasks:

  • Designed and developed DQM, data cleansing, ETL/ELT, and data migration solutions
  • Supported DWH, reporting, and BI initiatives
  • Performed data analysis and source environment assessment
  • Supported data integration processes and platforms
  • Contributed to master data, metadata, and data governance activities
  • Supported team coordination and project delivery

Main Projects:

  • T-Systems - implementation of the Asset Handling System as DQM/ETL expert, SQL/PLSQL Developer, Data Analyst and Scrum Master (Oracle, MS SQL, Talend, Oracle DWH, Jira)
  • Telekom Hungary (General Contractor: HPE) - AD/AM Projects as Business Analyst (Scrum Team Member, Product Owner) (Testing, Reporting, System design)
  • Vodafone Hungary (General Contractor: HPE) - AD/AM Projects as Business Analyst, BI expert (Scrum Team Member) (BI/BO, Testing, Reporting, System design)
  • Vodafone Hungary - Salesforce.com implementation as DQM and ETL expert, SQL/PLSQL Developer and Data Analyst (Oracle, Reporting)
  • Vodafone Hungary - implementation of the Siebel CRM system as DQM, ETL and Data Analysis expert, SQL/PLSQL Developer (Oracle, DWH, Reporting, Jira)
DSS Consulting Ltd.
2001 – 2011

DQM Expert, Professional Leader, PM

Worked on multiple DQM, migration, DWH/BI, and system implementation projects across the full project lifecycle, with responsibility for expert team leadership, data cleansing methodology development, and client relationship management.

Professional Position:

  • Professional Leader (DQM), Team Leader, Project Manager, Senior DQM and Migration Expert, SQL/PLSQL Developer, Database Developer, Presales Manager (2004-2011)
  • Senior DQM Expert, Database and Business Analyst, Project Manager, SQL/PLSQL Developer, Presales Expert (2002-2004)
  • DQM Expert, Project Manager (2001-2002)

Main tasks:

  • Designed and delivered DQM, data cleansing, ETL/ELT, and data migration solutions
  • Supported DWH, reporting, and BI initiatives
  • Performed source data and source environment analysis
  • Supported data integration processes and platforms
  • Contributed to master data, metadata, and data governance activities
  • Led teams and supported project delivery

Main Projects

  • OTP Bank - Consolidation of Bank customers (DQM) as PM, Business Analyst, SQL/PLSQL Developer (Oracle, Reporting, DWH, System and DB design)
  • Telenor - DQM projects as PM, DQM Expert, SQL/PLSQL Developer (Oracle, Reporting, System and DB design)
  • INVITEL - DQM, ETL and Building of Business Data Bank DB projects as PM, DQM and ETL Expert, Business Analyst, SQL/PLSQL Developer (Oracle, Erwin, DWH, Reporting, DB design)

K&H Bank (KBC Bank N.V. in Hungary)
1997 – 2001

Bank Clerk, Branch Office Manager, Team Leader, PM

Started in branch banking operations and progressed rapidly into team leadership and branch management roles, before moving to the Bank Centre, where I first became involved in Data Quality Management and took on subproject management responsibilities.

Professional Position:

  • DQM Subproject Manager, Team Leader (2000-2001)
  • Business Process Manager, Business Analyst (1999)
  • Branch Manager (1998)
  • Deputy Branch Manager, Team Leader (1998)
  • Bank Clerk (1997)

Main Projects:

  • K&H Bank – Data Cleansing subproject within the implementation of the new account management system (Equation), working as Project Manager and Business Analyst (Oracle, Reporting, System Design)

Career Snapshot

  • 25+ years of experience across the IT, telecom, financial services, public sector, energy, and pharma industries, with a strong focus on enterprise data architecture, data migration, DWH/BI, and data management.
  • Extensive hands-on and architectural experience in data modelling, ETL/ELT, data transformation, data integration, database development, reporting, and analytical solution design across complex enterprise environments.
  • Strong expertise in both traditional and modern data platforms, including Oracle, MSSQL, PostgreSQL, SQL/PLSQL, as well as modern cloud technologies such as Snowflake and GCP BigQuery, applied in large-scale transformation and analytics initiatives.
  • Business analysis experience across CRM, billing, ERP, and operational data domains, with the ability to translate complex business requirements into scalable and practical data solutions.
  • Proven leadership and strategic planning capabilities in designing, structuring, and driving data-focused initiatives from concept and detailed planning through implementation and delivery, while coordinating cross-functional teams and aligning business and technical stakeholders.
  • Experienced in working in Agile and hybrid delivery environments, bridging business needs and technical execution in complex, data-driven programmes.

Education