🏢 Company Overview:
Tranzmeo is an innovative technology company specializing in data intelligence, AI-driven analytics, and enterprise-grade software solutions. The company focuses on leveraging big data and machine learning to help organizations transform raw data into actionable insights. With a commitment to technological excellence and client success, Tranzmeo continues to deliver cutting-edge solutions across industries such as energy, finance, logistics, and telecom.
💼 Job Details:
- Employment type: Full Time / Part Time
- Experience: 9+ Years
- Salary: ₹10L to ₹12L per annum
- Location: Work From Home
- Work timing: 9:45 AM to 6:45 PM
- Working Days: 5 Days
- Education: Any Degree
📝 Job Description:
Tranzmeo is seeking a skilled Big Data Application Support Engineer to provide end-to-end support for data-driven applications and platforms. The ideal candidate will be responsible for maintaining, troubleshooting, and optimizing big data systems to ensure reliability, performance, and scalability. You will work closely with data engineers, developers, and DevOps teams to ensure smooth functioning of all big data pipelines and services.
🔑 Key Responsibilities:
- Monitor, maintain, and support large-scale big data applications and systems
- Troubleshoot and resolve technical issues related to data ingestion, transformation, and storage
- Collaborate with cross-functional teams to optimize application performance
- Perform root-cause analysis and provide preventive maintenance solutions
- Manage data pipelines and ensure continuous data flow and availability
- Maintain proper documentation and support logs for issue tracking
- Implement automation tools for efficient application support and monitoring
- Participate in 24×7 on-call rotations for production support (if required)
🛠️ Required Skills & Qualifications:
- Strong knowledge of Hadoop ecosystem (HDFS, Hive, Spark, HBase, Oozie, etc.)
- Experience with Linux/Unix systems administration and shell scripting
- Proficiency in Python, Java, or Scala for automation and support tasks
- Understanding of cloud environments such as AWS, GCP, or Azure
- Familiarity with monitoring tools (Grafana, Prometheus, Kibana, etc.)
- Experience in managing ETL pipelines and big data job scheduling
- Excellent problem-solving and analytical skills
- Bachelor’s degree in Computer Science, Information Technology, or related field
🌟 Benefits:
- Exposure to advanced big data technologies and real-world projects
- Opportunity to work with a collaborative and innovative team
- Continuous learning and professional development opportunities
- Flexible working environment and supportive culture
