Data Engineer - Data Integration Specialist

Network.com

Singapore

Remote

SGD 80,000 - 120,000

Full time

15 days ago

Job summary

A leading company in the data engineering domain is seeking a Senior Data Engineer to join their team on a contract basis. You will focus on developing and maintaining robust data pipelines and enhancing the cloud data platform. The ideal candidate will have strong experience in data warehousing and integration, with proficiency in tools like Snowflake and Python. Join a dynamic team and contribute to implementing automation and data management best practices that drive business success.

Qualifications

  • Minimum 5 years of experience in data warehousing, analytics, and pipeline development.
  • Proficiency in Python and expertise in Snowflake required.
  • Experience with data ingestion from platforms like Salesforce preferred.

Responsibilities

  • Design, build, and maintain real-time and batch data pipelines.
  • Develop and optimize code for data ingestion and transformation.
  • Manage production releases and support data migrations.

Skills

Data Integration
Data Warehousing
Data Transformation
Data Governance
Data Security
Data Modeling
Python
Analytics

Education

Bachelor's degree in Computer Science or a related technical field

Tools

Snowflake
Matillion ETL Tool
Tableau

Job description

Job Title: Senior Data Engineer - Data Integration Specialist
Location: Remote
Experience: 5+ Years
Employment Type: Contract
Domain: Data Engineering / Cloud Data Warehousing

Job Summary:

We are seeking an experienced Data Engineer to join our team on a contract basis. This role focuses on supporting data integration efforts through the development and maintenance of robust, scalable, real-time, and batch data pipelines. You will work in a dynamic, agile environment and collaborate with global teams to enhance our cloud data platform, implement automation, and drive data management best practices.

Key Responsibilities:

  • Design, build, and maintain data pipelines (real-time and batch).
  • Develop and optimize code for data ingestion and transformation.
  • Validate data integrity and perform comprehensive testing.
  • Manage production releases and support data migrations.
  • Prepare and maintain technical documentation.
  • Work effectively in an Agile development environment.
  • Optimize and fine-tune code for improved performance.
  • Promote and implement core data management principles.
  • Lead projects and initiatives across multiple geographies.
  • Continuously enhance the data platform and introduce automation adhering to best practices.

Skills & Experience Required:

  • Bachelor's degree in Computer Science or a related technical field.
  • Minimum 5 years of experience in data warehousing, analytics, and pipeline development.
  • Expertise in Snowflake Cloud Data Warehouse.
  • Proficiency in Python.
  • Experience with the Matillion ETL tool is preferred.
  • Familiarity with Data Vault 2.0 methodology.
  • Proven experience with data ingestion from platforms such as Salesforce and NetSuite into Snowflake.
  • Excellent communication skills and a collaborative mindset.
  • Strong technical skills for building scalable and efficient solutions.
  • Knowledge of Salesforce and NetSuite is a plus.
  • Experience with Tableau or other BI/reporting tools is advantageous.
