Sr. Data Engineer (Tableau)

Blumetra
Full-time
On-site
Telangana, India

Job Title: Senior Data Engineer

Experience: 5-8 Years
Work Location: Hybrid
Employment Type: Full-Time

About the Role

We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize our data systems and pipelines. The ideal candidate will have strong expertise in SQL, Python, AWS, and Tableau, and a passion for transforming raw data into actionable insights that drive business decisions.

Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT data pipelines for ingestion, transformation, and storage.

  • Work with business stakeholders to identify data requirements and translate them into effective data solutions.

  • Develop and optimize SQL queries, stored procedures, and data models for high performance.

  • Implement data integration solutions using AWS services such as S3, Glue, Redshift, Lambda, and RDS.

  • Collaborate with analysts and business users to enable data visualization and reporting in Tableau.

  • Ensure data accuracy, quality, and consistency through validation, testing, and monitoring.

  • Automate workflows and data quality checks using Python scripting.

  • Support data governance, documentation, and adherence to security and compliance standards.

Required Skills & Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

  • 5+ years of hands-on experience in Data Engineering or a similar role.

  • Strong expertise in SQL for data extraction, transformation, and optimization.

  • Proficiency in Python for data manipulation, automation, and scripting.

  • Solid experience with the AWS cloud ecosystem (S3, Glue, Redshift, Lambda, IAM, etc.).

  • Hands-on experience with Tableau for dashboard development and data visualization support.

  • Deep understanding of data warehousing, modeling, and ETL design patterns.

  • Strong problem-solving skills and ability to work in agile, collaborative environments.

Good to Have

  • Experience with Airflow or other workflow orchestration tools.

  • Exposure to Snowflake, Athena, or data lake architectures.

  • Knowledge of API integration and data streaming tools (Kafka, Kinesis).

  • Understanding of CI/CD, Git, and modern DevOps practices in data environments.
