Senior Customer Data Platform Architect

IT & Telecom
This assignment is no longer available.
Currently we are looking for a Senior Architect to help lead the development of a Customer Data Platform (CDP) in Snowflake for use by the business.
 
The successful candidate will collaborate with a Technical Analyst, the business, and product and technology stakeholders to define an integration architecture. This architecture will then be turned into a project executed with partner teams, with the candidate overseeing the roll-out of the solution.
 
Possibility of 100% remote work.
Tasks
Document the detailed CDP architecture following the Architecture Documentation and Review Standard
Support the business in the selection of SaaS Composable CDP capabilities for Audience Building and Reverse ETL
Complete the Privacy by Design process and ensure sign-off review with all Business, Product, and Legal stakeholders
Ensure all Data Sources are ingested into the Data Platform Snowflake instance and available for use in future analytics use cases
Document a detailed and prioritized Use Case Backlog for the CDP
Document a versioned and extensible CDP Data Model that outlines how individuals, their data, and their activity will be modelled
Document and version the High-Level Individual Activity Events 
Deliver the CDP Data Product as a series of immutable dbt transformations
Document and deliver the Analytics Engineering Development, Test, and Deployment Pipeline using Git, GitHub Actions, dbt, and Apache Airflow
Define and document the CDP Operating Model
Train the future CDP Analytics Engineering Team
Train the future CDP Activation Team to integrate CDP data into target operational systems and validate scenarios end-to-end
Requirements
Min. 8 years of IT experience with at least 5 years in a Lead Designer or Architect role
Broad knowledge and experience with Cloud Engineering on AWS, Data Integrations, Data Warehousing, and strong proficiency in Snowflake
Experience building data products from multiple data assets assembled via an Extract, Transform, Load (ETL) approach
Proficient with concepts of data modeling and data development lifecycle
Ability to lead and mentor junior team members and to balance technical decisions against user needs and business constraints
Strong programming skills in the following languages and tools: SQL, dbt, Git, and Python
Familiarity with data governance, data security, and privacy regulations
English at an advanced level (written and spoken)
SQL
AWS
Git
Python
Data Warehouse
Documentation
ETL
English
GitHub Actions
Software as a service
Snowflake
dbt
Apache Airflow
Data Modeling
Use cases
Location
Warsaw, Mazowieckie (Masovian)
Period
ASAP - Open