Key responsibilities include:
• Design conceptual and logical data architectures that enable intended business outcomes and reusable, enterprise data products.
• Provide thought leadership and expert input on projects requiring extensive data requirements gathering, data modeling, data integration, and testing.
• Collaborate with business clients, data owners, and product owners to anticipate future data architecture needs and resolve complex data taxonomy and data integration challenges.
• Engage business and technology leaders in strategic discussions to better understand objectives that could impact data management and platform best practices, guidelines, and policies.
• Communicate the implications of architectural decisions, issues, and plans to business and technology leadership.
• Provide expert knowledge of data management principles, standards, and best practices to business and technology teams.
• Work directly with vendors to understand and influence their overall roadmaps.
• Work with internal technology teams to define long-term data management strategy and capability roadmaps.
• Collaborate with other technology teams to define technical evaluation criteria for product and technology selection.
The role does not require any line management responsibility.
Qualifications Required:
• Bachelor's degree in Computer Science, Business Administration, or equivalent educational or professional experience and/or qualifications; an advanced degree is preferred.
• Minimum of 10 years of in-depth experience in Data Lifecycle Management, solution design, and delivery of enterprise-level data platform solutions.
• Minimum of 2 years of experience managing people and project lifecycles, and delivering mid-to-large enterprise technologies.
• Experience working with business application owners and development teams to document and clarify business and user requirements and to manage the scope of defined data requirements throughout the project lifecycle.
• Technical experience with modern data platform concepts such as lakehouse architecture, Delta Lake, data fabric, and AI/ML.
• Experience designing conceptual and logical data models that enable intended business outcomes and reusable, enterprise data products.
• Strong analytical abilities, presentation skills, and organizational/planning abilities.
• Ability to tailor communications to both technical and non-technical audiences.
• Experience working directly with clients and providing consultation.
The role will report to the DT-US CDO Enterprise Data Architect (US).
Preferred:
• Strong knowledge of cloud hyperscaler infrastructure and services.
• Strong knowledge of machine learning and AI.
• Strong experience with big data technologies such as Databricks, Azure Data Lake, Delta Lake, and Snowflake.
• Strong experience with relational and NoSQL databases (e.g., SQL Server, Cosmos DB, MongoDB).
• Experience in implementing quality guidelines, standards, and procedures.
• Experience implementing and automating models created by data science teams (e.g., with Spark, Scala, Hive, Python, R).
• Experience with data integration tools such as Informatica, Azure Data Factory, and AWS Glue.
• Experience with visualization tools such as Tableau, Power BI, and Qlik Sense.
• Ability to understand business objectives and translate them into high-quality automated solutions.
• Ability to drive teams to consensus, with a strong passion for application delivery.
• Strong project management and team management skills.
• Creative - able to think outside the box; uses knowledge gained through prior experience, education, and training to resolve issues and remove project barriers.
• Drive for results - maintains constant awareness of projects and clients; keeps product owners focused on short- and long-term milestones.
• Experience interfacing with external software design and development vendors.
• Minimal travel required (up to 10%).