Overview
We are seeking a Senior DevOps Engineer (AWS - Python/Bash) to join a team leading the transformation to DevOps and Continuous Delivery across the organization. You will work on multiple development projects and collaborate with other DevOps engineers.
Responsibilities
- Support both development and production environments
- Implement monitoring processes and design/deploy monitoring dashboards
- Coach and mentor team members
- Work directly with development teams to bring new products into the cloud
- Help to design and support internal development environments inside AWS
- Design, implement, and maintain all Amazon Web Services (AWS) infrastructure and operational services
- Participate in Agile processes across multiple development teams
- Create processes for continuous deployment, including full orchestration of the deployment process
- Develop tools and scripts to improve the efficiency of operational tasks
- Define and share best practices for DevOps, builds, continuous integration and deployment, and infrastructure across the organization
- Build and support system automation, deployment, and continuous integration tools
- Effectively communicate project status, metrics, and issues
Basic Qualifications
- 3-5+ years of experience as a DevOps Engineer
- 3-5 years working with cloud providers, specifically AWS
- 3-5 years working in Linux environments
- 3-5 years with scripting languages, including at least one of Bash or Python
- 3-5 years using SCM tools (Git, GitHub) and a solid understanding of branching and merging
- 3-5 years of experience with the Software Development Life Cycle (SDLC)
Preferred Skills
- Bachelor’s degree in Computer Science, Computer Engineering or related field of study
- Expertise in cloud infrastructure and deployment models
- Expertise in using AWS services and the AWS CLI
- Expertise in Bash and/or Python scripting
- Experience with Continuous Integration and Deployment (CI/CD) tools such as Jenkins, Azure DevOps, and/or GitLab
- Strong knowledge of cloud best practices, operations, and security
- Experience with big data processing tools such as Hadoop or Spark
- Experience administering databases such as PostgreSQL, MariaDB, MongoDB, MySQL, and/or MSSQL
- Experience with performance tuning and benchmarking
- Knowledge of build technologies such as Maven, Gradle, Node, and npm
- Skilled with log aggregation using Sumo Logic or other tools
- Experience with container infrastructure such as ECS, Kubernetes, and Docker Swarm
- Skilled with Jira, Confluence, Jenkins, Azure DevOps, Artifactory, Docker Hub, and GitHub
- Experience with Nginx
- Experience with Agile methodology
- Experience with MongoDB cluster configurations
- Experience with RabbitMQ cluster configurations
- Experience with Elasticsearch cluster configurations
- Experience with NoSQL technologies such as Memcached or Redis
- Excellent verbal and written communication skills