Job Description
What will your job look like?
Perform development and support activities in the data warehousing domain using big data technologies
- Understand the High Level Design and Application Interface Design, and build the Low Level Design. Perform application analysis and propose technical solutions for application enhancements or to resolve production issues
- Perform development and deployment. Should be able to code, unit test, and deploy
- Create the necessary documentation for all project deliverable phases
Technical Skills
Mandatory
- Data Engineering :
- Strong experience with at least one big data platform (Hadoop / Snowflake / ADLS / BigQuery)
- Hands-on experience in Python or Java programming
- Experience with Spark or Azure Databricks
- Strong SQL analysis skills
- Experience working with Kafka
Good to have
- Cloud skills – knowledge of AWS, Azure, or GCP
- Sound knowledge of Kubernetes and deployment methodologies
All you need is...
- 2+ or 4+ years of experience implementing high-end software products
- Big data experience with Spark or PySpark is mandatory, along with strong problem-solving skills and the ability to thrive with minimal supervision
- Knowledge of database principles, SQL, and experience working with large databases
Behavioral Skills
- Eagerness and hunger to learn
- Good problem-solving and decision-making skills
- Good communication skills within the team, site and with the customer
- Willingness to stretch working hours when necessary to support business needs
- Ability to work independently and drive issues to closure
- Consult with relevant parties when necessary and raise risks in a timely manner
Why You Will Love This Job
- You will be challenged to design and develop new software applications.