At UBS, we re-imagine the way we work, the way we connect with each other – our colleagues, clients, and partners – and the way we deliver value. Being agile will make us more responsive, more adaptable, and ultimately more innovative.
We’re looking for a Data Engineer to:
• engineer reliable data pipelines that process data and meet both functional and non-functional requirements.
• design data distribution pipelines, selecting the appropriate distribution technologies and making effective use of the data platform infrastructure.
• participate in the development of our standardized and harmonized data products.
• contribute to the growth of the data platform through its tooling and automation.
• understand, represent, and advocate for stakeholder needs.
• build observability into our solutions, monitor production health, help resolve incidents, and remediate the root causes of risks and issues.
Your Career Comeback
We are open to applications from career returners. Find out more about our program on
Your team
You will be working in Pune, India, in the newly set up Financial Crime Prevention Data Products team which is part of GCRG (Group Compliance, Regulatory & Governance) Technology. Our mission is to build a standardized, complete, and easy-to-use suite of Data Products for the various Financial Crime Prevention (FCP) applications / processes. Our team consists of talented Data Engineers, and Data Analysts who take pride in the quality of work that we deliver. As a data provider, we interact closely with our data consumers in business and in IT, as well as other cross-functional teams that enable us. You will be part of a significant digital transformation that our business area is going through and will have the opportunity to shape it with your experience.
Your expertise
• A bachelor’s or master’s degree, preferably in Information Technology or a related field (computer science, mathematics, etc.), with a focus on data engineering.
• 5+ years of relevant experience as a data engineer in Big Data is required.
• Strong experience in executing complex data analysis and writing advanced SQL/Spark queries.
• Strong experience in building complex data transformations in SQL/Spark.
• Strong knowledge of database technologies is required.
• Strong knowledge of programming languages (Python / Scala) and Big Data technologies (Spark, Databricks or equivalent) is required.
• Strong knowledge of Azure Cloud is advantageous.
• Good understanding and experience with Agile methodologies and delivery.
• Strong communication skills with the ability to build partnerships with stakeholders.
• Strong analytical, data management and problem-solving skills.
• Flexible and resilient team player with strong interpersonal skills who takes initiative to drive things forward.
• Keen interest in understanding business context for the data.
• Excellent written and verbal communication skills and ability to work as part of a global team.
• Fluent in English.
About us