Software Engineer at Alamere Software Inc.
- Interfacing with internal stakeholders to understand business requirements and converting them into technical specifications.
- Analyzing user needs and software requirements to assess feasibility, preparing high-level designs, and providing estimations.
- Architecting Big Data solutions using Hortonworks Data Platform, Amazon Web Services, and Apache Hadoop.
- Designing Hadoop pipelines that fetch data from external systems and load it into the Data Lake and Data Warehouse (DWH).
- Developing and demonstrating proof of concept (POC) projects.
- Building infrastructure pipelines using Cloud and Hybrid Platforms for application development.
- Developing programs for Big Data applications using HDFS, Sqoop, Hive, Oozie, Spark, and Python.
- Managing multi-tenant environments for real-time and batch analysis.
- Maintaining and upgrading Hadoop clusters and mitigating outages.
- Developing complex programs and mappings, integrating them into workflows, and scheduling them via Informatica BDM or the Control-M scheduler.
- Automating cluster configurations and maintenance using Ansible.
- Migrating Kafka streaming platforms from physical servers to containerized environments.
- Testing new tools in lower environments before deployment.
- Conducting peer reviews and incorporating stakeholder feedback for quality improvement.
- Developing APIs for Big Data applications using Java, Python, and Unix Bash scripting.
- Collaborating daily with users and project teams on development and preparatory activities.
- Remote telecommuting from anywhere in the US is an option.
- The position requires a Master's degree in Computer Science or a related field, plus experience or education in AWS, Azure, Python, Ansible, SQL, shell scripting, Unix, CI/CD, change management, build and release, containerization, orchestration, the Big Data stack, ELK, and Kerberos.