Lead Software Engineer – Cloud
Summary:
The Lead Cloud Engineer is responsible for planning, designing, and automating the deployment of platform solutions on AWS, and is instrumental in profiling and improving front-end and back-end application performance, mentoring team members, and taking end-to-end technical ownership of applications. Must stay on top of technology changes in the market and continuously look for opportunities to leverage new technology.
Responsibilities:
- Design, build, and implement cloud platform solutions using TypeScript, Java, and Python.
- Design and build data pipelines for supporting analytical solutions.
- Provide level of effort estimates to support planning activities.
- Provide microservices architecture and design specifications.
- Fix defects found during implementation process or reported by the software test team.
- Support software process definition and improvement initiatives and the release process, working with the DevOps team on CI/CD pipelines built with Terraform as Infrastructure as Code.
- Implement security architectures for cloud systems.
- Understand the quality consequences that may result from improper performance of the role; maintain awareness of system defects that may occur in their area of responsibility, including product design, verification, validation, and testing activities.
- Mentor less experienced team members.
- Collaborate with Product Designers, Product Managers, Architects, and Software Engineers to deliver compelling user-facing products.
- Must be able to perform the essential functions of the job, with or without reasonable accommodation.
Requirements:
- Bachelor’s degree in Computer Science or a related engineering field, or equivalent experience.
- 10+ years of experience in cloud application development.
- Expert proficiency in either TypeScript or Java.
- Experience in architecting and developing cloud-based solutions.
- Experience in AWS services including API Gateway, S3, CloudFront, Lambda, ECS, EKS, Step Functions, SQS, EventBridge, Cognito, DynamoDB, Aurora PostgreSQL, Redshift, OpenSearch/Elasticsearch, MSK (Kafka), and Data Pipeline.
- Extensive experience in developing applications in POSIX-compliant environments.
- Strong knowledge of containerization, with expert knowledge of either Docker or Kubernetes.
- Proficient in IAM security and AWS Networking.
- Expert understanding of building and working with CI/CD pipelines.
- Experience in designing and developing data pipelines, data warehouse applications, and analytical solutions, including machine learning.
- Deep cloud domain expertise in architecture, big data, microservice architectures, cloud technologies, data security and privacy, tools, and testing.
- Excellent programming skills in data pipeline technologies such as Lambda, Kinesis, S3, Glacier, Glue, MSK (Kafka), EMR (Apache Spark), Athena, and Redshift.
- Extensive experience with Service-Oriented Architecture, microservices, virtualization, and working with relational and non-relational databases.
- Excellent knowledge of building big data solutions using NoSQL databases.
- Experience with secure coding best practices and methodologies, vulnerability scans, threat modeling, and cyber-risk assessments.
- Familiarity with modern build pipelines and tools.
- Ability to understand business requirements and translate them into technical designs.
- Familiarity with Git code versioning tools.
- Good written and verbal communication skills.
- Great team player.
Salary: $145,000-$160,000 per year
Position Type: C2H or Direct Hire
Location: Remote (US)
- Category: Information Technology & Telecom
- Company Name: Vaco