Job Description:
– Able to partner with others in solving complex problems by taking a broad perspective to identify innovative solutions.
– Strong skills in building positive relationships across Product and Engineering.
– Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
– Able to quickly pick up new programming languages, technologies, and frameworks.
– Experience working in Agile and Scrum development processes.
– Experience working in a fast-paced, results-oriented environment.
– Experience in Amazon Web Services (AWS), mainly S3, Amazon Managed Workflows for Apache Airflow (MWAA), EMR/EC2, IAM, etc.
– Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
– Experience architecting data products on streaming, serverless, and microservices architectures and platforms.
– Experience working with Data platforms, including EMR, Airflow, Databricks (Data Engineering & Delta Lake components, and Lakehouse Medallion architecture), etc.
– Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
– Experience working with distributed technology tools, including Spark, Python, and Scala.
– Working knowledge of data warehousing, data modelling, governance, and data architecture.
– Working knowledge of reporting and analytical tools such as Tableau, Amazon QuickSight, etc.
– Demonstrated experience in learning new technologies and skills.
– Bachelor’s degree in Computer Science, Information Systems, Business, or another relevant subject area.
How to Apply:
- First, read through all of the job details on this page.
- Scroll down and press the Click Here button.
- Click the apply link to be redirected to the official website.
- Fill in the details with the required information.
- Cross-check the information you’ve provided before submitting the application.