Analytics Engineer (Kafka, Grafana) | Permanent Remote OR WFH

Carnation Infotech

United States

Remote

USD 100,000 - 130,000

Full time


Job summary

A tech company is seeking an Analytics Engineer to design executive dashboards and manage data pipelines. The ideal candidate will have a strong background in Kafka, time-series databases, and dashboard design. This is a full-time remote position requiring 5-8 years of experience, with a focus on delivering actionable insights for business decision-making.

Qualifications

  • 5-8 years of relevant experience required.
  • Strong expertise in streaming data transformation.
  • Experience with time-series databases.

Responsibilities

  • Collaborate with the CTO and leadership team to understand business requirements.
  • Design real-time data pipelines from Kafka into TimescaleDB.
  • Develop and maintain Grafana dashboards to present insights.

Skills

Hands-on experience with Apache Kafka
Proficiency in SQL
Experience in designing Grafana dashboards
Understanding of ETL/ELT pipelines
Proficiency in programming languages (Python, Java, Scala)
Strong analytical skills

Education

B. Tech/M. Tech/BCA/MCA

Tools

TimescaleDB
PostgreSQL
Docker
Kubernetes

Job description

Job Specifications
  • Job Title: Analytics Engineer
  • Department: Software Development & Engineering
  • Seniority Level: Mid-Senior Level
  • Employment Type: Full-Time Contract (8 Hours Support Needed)
  • Relevant Experience: 5-8 Years
  • Qualification: B. Tech/M. Tech/BCA/MCA
  • Shift Timings: Afternoon Shift (3:30 PM-12:30 AM IST)
  • (Shift timings are subject to change according to project requirements)
  • Job Location: 100% Remote / Work from Home
Position Overview

We are seeking a highly skilled Analytics Engineer to design and implement executive-level dashboards that provide real-time operational insights for our C-suite leadership team. This role involves end-to-end ownership of the data transformation pipeline: ingesting event streams via Kafka, storing business-ready datasets in TimescaleDB, and exposing them through Grafana dashboards.

The ideal candidate will have strong expertise in streaming data transformation, time-series databases, and dashboard design, combined with the ability to translate business requirements into technical solutions.
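As a hedged illustration of the transformation step in the Kafka-to-TimescaleDB-to-Grafana pipeline described above, the sketch below flattens a raw event payload into a structured row. The event fields (`ts_ms`, `order_id`, `amount`, `region`) and the target column names are assumptions for illustration, not taken from this posting.

```python
import json
from datetime import datetime, timezone

def transform_event(raw: bytes) -> dict:
    """Flatten a raw Kafka event payload (hypothetical schema) into a
    structured row ready for a TimescaleDB hypertable keyed on event time."""
    event = json.loads(raw)
    return {
        # Event payloads often carry epoch-millisecond timestamps; the
        # hypertable's time column wants a timezone-aware datetime.
        "event_time": datetime.fromtimestamp(event["ts_ms"] / 1000, tz=timezone.utc),
        "order_id": str(event["order_id"]),
        "region": event.get("region", "unknown"),
        "amount_usd": round(float(event["amount"]), 2),
    }

raw = b'{"ts_ms": 1700000000000, "order_id": 42, "amount": "19.99", "region": "us-east"}'
print(transform_event(raw)["region"])  # → us-east
```

In production this function would sit inside a Kafka consumer loop whose output is batch-inserted into TimescaleDB; keeping the transformation a pure function makes it unit-testable independently of the broker.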

Key Responsibilities
  • Collaborate with the CTO and leadership team to understand business requirements for executive insights.
  • Design and implement real-time data pipelines from Kafka into TimescaleDB.
  • Define and build robust data transformation logic to convert raw event streams into structured, business-ready datasets.
  • Develop and maintain Grafana dashboards to present actionable insights on daily operations.
  • Optimize queries, schema design, and storage strategies in TimescaleDB for high performance and scalability.
  • Ensure data accuracy, consistency, and availability for real-time decision making.
  • Implement monitoring and alerting for the data pipeline to ensure reliability and resilience.
  • Stay up to date with advancements in real-time analytics and propose improvements to the overall architecture.
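Several of the responsibilities above (transformation logic, dashboard-ready datasets) reduce to windowed aggregation of event streams. Below is a minimal sketch of a tumbling-window average in plain Python; in practice a TimescaleDB continuous aggregate or a stream processor such as Kafka Streams would compute this, and the window size and event shape here are illustrative assumptions.

```python
from collections import defaultdict

def tumbling_window_avg(events, window_s=60):
    """Group (epoch_seconds, value) pairs into fixed, non-overlapping
    windows and return the per-window average: the shape of metric a
    Grafana panel over TimescaleDB typically charts."""
    buckets = defaultdict(list)
    for ts, value in events:
        # Align each timestamp to the start of its window.
        buckets[ts - ts % window_s].append(value)
    return {start: sum(vs) / len(vs) for start, vs in sorted(buckets.items())}

events = [(0, 10.0), (30, 20.0), (61, 5.0), (119, 15.0)]
print(tumbling_window_avg(events))  # window 0 → 15.0, window 60 → 10.0
```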
Required Qualifications
  • Strong hands-on experience with Apache Kafka (data ingestion, stream processing, connectors).
  • Proficiency in SQL and working with TimescaleDB or PostgreSQL (time-series database experience).
  • Experience in designing and deploying Grafana dashboards for business reporting.
  • Solid understanding of ETL/ELT pipelines, real-time data transformations, and data modeling.
  • Proficiency in one or more programming languages (e.g., Python, Java, Scala) for building data pipelines.
  • Strong analytical skills and ability to translate business needs into technical solutions.
Nice to Have
  • Exposure to stream processing frameworks such as Apache Flink, Kafka Streams, or Spark Streaming.
  • Familiarity with containerized environments (Docker/Kubernetes) and CI/CD practices.
  • Experience with monitoring and logging tools (Prometheus, ELK stack, etc.).
  • Background in executive reporting, KPI definition, or operational analytics.
  • Knowledge of cloud platforms (AWS, GCP, Azure) for hosting and scaling real-time data solutions.
Traits
  • Strong business acumen to align technical solutions with executive priorities.
  • Clear and concise communication skills for both technical and non-technical stakeholders.
  • Collaborative and proactive in gathering requirements and delivering solutions.
  • High ownership, accountability, and attention to detail in execution.
  • Adaptable and solution-oriented, comfortable working in fast-changing environments.
How to Apply

If this job description matches your profile, please share your updated resume and the details below at abhinav.jaiswal@carnationit.com.

Application Details (for recruiter use)

Full Name (As per Aadhaar):

Current CTC:

Expected CTC:

Notice period:

How soon you can join:

Current location:

Hometown:

Preferred Location:

Any Offers in Hand:

Last Working Day:

Overall Experience:

Relevant Experience:

Current Organization:

Comfortable working as a Contractor:

Date Of Birth:

Kindly share a link to your LinkedIn profile:

Reason for job change:

Reason for Gap (If any):
