🏢 Company Overview:
Impetus is a globally recognized technology solutions company specializing in data engineering, cloud modernization, and AI-driven analytics. With a mission to help enterprises unlock the full potential of their data, Impetus delivers cutting-edge solutions across industries on platforms such as Google Cloud Platform (GCP), AWS, and Azure.
💼 Job Details:
- Employment type: Full Time
- Experience: 4 to 6 Years
- Salary: ₹47,000 – ₹67,000 monthly
- Location: Remote
- Work timing: 9:15 AM to 6:15 PM
- Working Days: 5 Days
- Education: Any Degree
📝 Job Description:
Impetus is looking for an experienced Lead GCP Data Engineer to design, develop, and optimize large-scale data pipelines and cloud-based architectures. The ideal candidate will have deep expertise in GCP data services, big data technologies, and a proven track record of delivering high-quality, production-grade data solutions. You will lead a team of data engineers and collaborate with cross-functional teams to ensure efficient data integration, storage, and analysis.
🔑 Key Responsibilities:
- Lead the design and implementation of scalable data pipelines and architectures on Google Cloud Platform
- Work with stakeholders to understand business requirements and translate them into data engineering solutions
- Optimize data flow and data collection processes using best practices
- Manage and mentor a team of data engineers to ensure project success
- Implement ETL/ELT pipelines using tools such as Dataflow, Dataproc, BigQuery, and Pub/Sub (see the sketch after this list)
- Ensure data quality, integrity, and governance across systems
- Collaborate with analytics, AI/ML, and DevOps teams for integrated data-driven solutions
- Monitor and enhance system performance and cost efficiency
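
For illustration only: a minimal Python (Apache Beam) sketch of the kind of streaming pipeline this role builds, reading JSON events from Pub/Sub and appending them to BigQuery. The project, topic, and table names are placeholders, not details of any actual Impetus project.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming job: read JSON events from Pub/Sub, append rows to BigQuery.
    # Project, topic, and table names below are placeholders.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                # Assumes the destination table already exists.
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```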
🛠️ Required Skills & Qualifications:
- Strong hands-on experience with GCP services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer)
- Proficiency in Python, SQL, and Java/Scala
- Experience with ETL/ELT tools and data pipeline orchestration frameworks (an orchestration sketch follows this list)
- Deep understanding of data warehousing, data modeling, and data lake architectures
- Familiarity with CI/CD, Docker, and Kubernetes in data engineering workflows
- Excellent communication, leadership, and problem-solving skills
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
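
As a hedged example of the orchestration experience listed above, here is a minimal Cloud Composer (Apache Airflow) DAG sketch that schedules a daily BigQuery rollup. The DAG id, table names, and SQL are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Daily rollup DAG; the DAG id, table names, and SQL are hypothetical.
with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup_sales = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": (
                    "SELECT DATE(event_ts) AS day, SUM(amount) AS revenue "
                    "FROM `example-project.analytics.sales` "
                    "GROUP BY day"
                ),
                "useLegacySql": False,
            }
        },
    )
```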
🌟 Benefits:
- Opportunity to work with world-class data engineering teams
- Access to cutting-edge GCP technologies and global enterprise projects
- Learning and certification support for cloud technologies
- Collaborative and innovation-driven work environment
- Flexible working options and career advancement opportunities