
Data Engineer

India | 3 to 6 Years

Job Description

We are seeking a highly skilled and motivated Data Engineer to join our growing technology team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines, ensuring high-performance data availability, and supporting advanced analytics and business intelligence initiatives. You will work closely with cross-functional teams to translate business requirements into reliable, secure, and optimized data solutions.


Job Roles & Responsibilities

  • Design, build, and optimize batch and real-time data pipelines.
  • Develop and maintain ETL/ELT processes to ingest, clean, transform, and structure data.
  • Work with cloud data platforms (e.g., AWS, GCP, Azure) to manage data lakes, warehouses, and related services.
  • Implement and maintain data models supporting analytics, reporting, and operational needs.
  • Collaborate with Data Analysts, Data Scientists, and product teams to deliver reliable datasets.
  • Monitor pipeline performance, troubleshoot issues, and maintain high data quality.
  • Manage and optimize storage solutions for structured and unstructured data.
  • Ensure compliance with security, governance, and data privacy standards.
  • (Good to Have) Build and maintain API-based integrations between internal systems and external data sources.
  • (Good to Have) Work with REST, GraphQL, or event-driven integrations for data ingestion or distribution.

Skills and Qualifications

  • 3+ years of experience in Data Engineering or a similar role.
  • Strong SQL skills and experience with relational and NoSQL databases.
  • Solid experience with at least one programming language (Python preferred).
  • Hands-on experience with data processing and transformation frameworks (e.g., Spark, Beam, Pandas, dbt).
  • Experience with cloud platforms (AWS/GCP/Azure) and services such as Redshift, BigQuery, Snowflake, Databricks, ADF, etc.
  • Knowledge of data modeling, warehousing concepts, and schema design.
  • Experience with version control (Git) and CI/CD workflows.
  • Understanding of data governance, data quality, and metadata management.


Good to Have

  • Experience building or consuming APIs (REST, GraphQL).
  • Familiarity with API management tools or integration platforms (e.g., MuleSoft, Apigee, AWS API Gateway).
  • Understanding of event-driven architecture (Kafka, Pub/Sub, Kinesis).
  • Experience with containerization (Docker) and orchestration tools (Airflow, Prefect, Dagster).
  • Exposure to microservices, cloud networking, and system integration patterns.


Soft Skills

  • Strong problem-solving skills and attention to detail.
  • Ability to work independently and collaboratively in a cross-functional team.
  • Good communication skills and ability to translate technical concepts for non-technical stakeholders.
  • Proactive, curious, and eager to continuously learn new technologies.

Salary and Perks

  • Commensurate with the skills and experience of the individual.

 

Interested candidates can send their application and CV to [email protected]


Join the team of innovators who are turning challenges into meaningful change.

