Belfast SQL Server User Group

Welcome to the Belfast Chapter

Next Meeting

Thursday, April 11

Using Azure Data Factory for data integration with a Data Vault data warehouse

  • In-Person @ 18 Ormeau Avenue, Belfast, Antrim, United Kingdom
  • 18:15 - 20:15 British Summer Time (BST)
  • Language: English

Approx. Running order for the night

  • 6:15 - 6:30: Registration & Intro
  • 6:30 - 7:15: Part 1.
  • 7:15 - 7:30: Break and Refreshments
  • 7:30 - 8:30: Part 2.

Sign up: https://www.eventbrite.co.uk/e/using-azure-data-factory-for-data-integration-with-a-data-vault-data-warehouse-tickets-57892921286

Featured Presentation:

Using Azure Data Factory for data integration with a Data Vault data warehouse

Dan Galavan, Independent Data Architect

Summary:
In this session you will receive an introduction to both Azure Data Factory v2 and the Data Vault data modelling methodology. We will start with an overview of both, followed by creating a Data Vault data warehouse along with an associated virtualized access layer in Azure. Finally, the pipelines needed to integrate data from an upstream application database will be implemented using Azure Data Factory v2.

Azure Data Factory:
Azure Data Factory is a cloud-based data integration service that allows you to create workflows for orchestrating and automating data movement and data transformation. These data-driven pipelines (workflows) can ingest data from disparate data stores. Azure Data Factory can process and transform data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning. Additionally, you can publish data to data stores for business intelligence (BI) applications to consume. Ultimately, through Azure Data Factory, raw data can be organized into meaningful data stores for better business decisions.

Data Vault data modelling:
Over 80% of data warehouses in the Netherlands have been implemented using the Data Vault methodology, and it has been touted as the next-generation data modelling approach. Data Vault makes a clear distinction between data acquisition & auditing on the one hand and data interpretation & interpolation on the other. This talk will show you how a single version of the facts may just edge out a single version of the truth. The knock-on effect is high adaptability to changing business requirements, something that can be a challenge with traditional data warehousing approaches; it also leads to faster sprints. Additionally, the Hub (core business concept), Link (relationship) and Satellite (context and history) design lends itself well to massively parallel ETL processing.
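To give a feel for the pipeline and activity model described above, here is a minimal, illustrative sketch of the shape of an Azure Data Factory v2 pipeline with a single Copy activity, written as a Python dict that mirrors the JSON definition you would author in the ADF UI or deploy through its management API. The pipeline and dataset names, and the choice of Azure SQL source/sink types, are assumptions for the example, not material from the session.

```python
# Illustrative shape of an ADF v2 pipeline definition with one Copy activity.
# All names (pipeline, datasets) are hypothetical; the real session builds the
# pipelines needed for the Data Vault loads.
copy_pipeline = {
    "name": "CopySourceOrdersToStaging",  # hypothetical pipeline name
    "properties": {
        "activities": [
            {
                "name": "CopyOrders",
                "type": "Copy",           # built-in Copy activity
                "inputs":  [{"referenceName": "SourceOrders",  "type": "DatasetReference"}],
                "outputs": [{"referenceName": "StagingOrders", "type": "DatasetReference"}],
                "typeProperties": {
                    # Source/sink types depend on the connected data stores;
                    # Azure SQL Database is assumed here.
                    "source": {"type": "AzureSqlSource"},
                    "sink":   {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}
```

In ADF itself, the dataset references above would point at datasets and linked services describing the upstream application database and the staging area of the warehouse.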
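To put concrete shapes on the Hub, Link and Satellite terminology, the sketch below shows how rows for a hypothetical customer Hub and its Satellite might be built during a load, using the hash-key style common in Data Vault 2.0. The table and column names (hk_customer, load_dts, record_source, and so on) are assumptions for illustration only.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Data Vault style hash key over normalised business key(s).
    Data Vault 2.0 implementations commonly use MD5 or SHA-1; MD5 shown for brevity."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Hypothetical source row from the upstream application database
source_row = {"customer_id": "C-1001", "name": "Acme Ltd", "city": "Belfast"}
load_dts = datetime.now(timezone.utc)
record_source = "app_db.customers"  # hypothetical record source name

# Hub row: one row per core business concept (business key + hash key + audit columns)
hub_customer = {
    "hk_customer": hash_key(source_row["customer_id"]),
    "customer_id": source_row["customer_id"],
    "load_dts": load_dts,
    "record_source": record_source,
}

# Satellite row: descriptive context and history, keyed by the hub hash key
sat_customer = {
    "hk_customer": hub_customer["hk_customer"],
    "load_dts": load_dts,
    "record_source": record_source,
    "hash_diff": hash_key(source_row["name"], source_row["city"]),  # change detection
    "name": source_row["name"],
    "city": source_row["city"],
}

# A Link row (relationship) would combine the hash keys of the hubs it connects,
# e.g. hk_customer and hk_order, plus load_dts and record_source; omitted for brevity.
```

Because each Hub, Link and Satellite load depends only on its own source rows and hash keys, loads like these can run independently and in parallel, which is the point made above about massively parallel ETL processing.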

About Dan:
Dan has been working with databases for 20 years. He has delivered data platform solutions in Ireland and Germany in a variety of industries including Retail, Corporate and Investment Banking, Aerospace, CRM, Social Protection, Telecoms, and Health Product Regulation. He specializes in database design & development, data migration, analytics, master data management, enterprise data warehousing, data & solution architecture, and the data life-cycle in general. He is a regular contributor to the SQL Server User Group community and a Data Vault 2.0 certified practitioner. He was also a member of the Munich Colmcilles European Gaelic Football League winning team... a long time ago!



 

 
