Job Overview

Location
Bengaluru, Karnataka
Job Type
Full Time
Date Posted
4 months ago

Additional Details

Job ID
26550

Job Description

Description & Requirements

3+ years of experience with Hadoop or any Cloud Big Data components (specific to the Data Engineering role), preferably Google Cloud Platform

Expertise in Java/Scala/Python, SQL, scripting, Hadoop (Sqoop, Hive, HDFS), Spark (Spark Core & Streaming), Kafka, or equivalent Cloud Big Data components

Proficiency in programming languages (Scala, Python, Java) and Big Data technologies, both on-premises and in the cloud

Knowledge of data modeling and database design

Hands-on experience with GCP services: BigQuery, Dataflow, Dataproc, Composer/Airflow, Google Cloud Storage, Pub/Sub

Experience with CI/CD processes and tooling: Jenkins, Cloud Build, Docker, Kubernetes, Artifact Registry

Ability to design scalable data architectures on GCP using services such as Dataproc, BigQuery, and Composer

Experience with ETL (Extract, Transform, Load) processes

Responsibility for optimizing data pipelines and queries for performance, troubleshooting issues, and proactive monitoring

Knowledge of quality checks, data governance, and security policies

Qualification

Any Graduate

Experience Requirements

Fresher Experience
