Key technologies: AWS (EC2, ECS, Lambda, VPC, CodePipeline, ECR, MWAA, SQS, CloudFormation, S3, CloudWatch), Docker, Python, OOP, FastAPI, PostgreSQL, OpenSearch, GitHub, Linux, AI Agents, Heroku
- Worked in a remote, multinational team at a product-based company, following the Agile methodology.
- Played a key role in understanding and migrating the company's infrastructure from Heroku to AWS within the deadline.
- Created and automated EC2 instance start/stop schedules aligned with the business use case, reducing resource usage and cutting costs by about 80 percent.
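A schedule like this can be sketched as a small boto3 script. The business-hours window and instance IDs below are illustrative assumptions, not the actual production schedule:

```python
from datetime import datetime, timezone

# Hypothetical business-hours window in UTC; the real window came from the use case.
BUSINESS_HOURS = range(9, 18)  # 09:00-17:59

def desired_state(hour: int, business_hours=BUSINESS_HOURS) -> str:
    """Return the state an instance should be in for a given UTC hour."""
    return "running" if hour in business_hours else "stopped"

def enforce_schedule(instance_ids):
    """Start or stop instances to match the schedule (needs AWS credentials)."""
    import boto3  # imported here so the pure helper above stays testable offline
    ec2 = boto3.client("ec2")
    state = desired_state(datetime.now(timezone.utc).hour)
    if state == "running":
        ec2.start_instances(InstanceIds=list(instance_ids))
    else:
        ec2.stop_instances(InstanceIds=list(instance_ids))
```

In practice a function like `enforce_schedule` would run on a timer (e.g., an EventBridge-triggered Lambda) so instances only run during paid-for hours.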
- Set up AWS MWAA (Managed Airflow) as a scheduler to trigger various scraper runs multiple times a day, using built-in Airflow settings to cap the number of concurrent workers per DAG.
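A DAG definition along these lines illustrates the setup; the DAG id, schedule, and scraper names are assumptions, and `max_active_tasks` is the built-in Airflow 2.x knob for capping concurrency per DAG:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Sketch of a scraper-trigger DAG; names and schedule are illustrative.
with DAG(
    dag_id="scraper_runs",
    start_date=datetime(2023, 1, 1),
    schedule="0 */6 * * *",   # four runs per day
    max_active_tasks=2,       # cap concurrent tasks for this DAG
    catchup=False,
) as dag:
    for site in ["site_a", "site_b", "site_c"]:
        BashOperator(
            task_id=f"scrape_{site}",
            bash_command=f"python scrape.py --target {site}",
        )
```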
- Frequently used S3 to store and fetch data, relied on CloudWatch to monitor resources and application logs, and used AWS Systems Manager Parameter Store for configuration parameters.
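The S3 and Parameter Store usage can be sketched with boto3; the key layout and parameter name are hypothetical:

```python
def s3_key(prefix: str, run_date: str, name: str) -> str:
    """Build a consistent S3 object key (layout is an illustrative convention)."""
    return f"{prefix}/{run_date}/{name}"

def upload_and_read_param(bucket: str, key: str, body: bytes, param_name: str) -> str:
    """Store data in S3 and read a config value from Parameter Store."""
    import boto3  # requires AWS credentials at runtime
    s3 = boto3.client("s3")
    ssm = boto3.client("ssm")
    s3.put_object(Bucket=bucket, Key=key, Body=body)
    param = ssm.get_parameter(Name=param_name, WithDecryption=True)
    return param["Parameter"]["Value"]
```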
- Managed IAM for all resources created, following the principle of least privilege, and assisted team members in creating roles and securing their IAM permissions.
- Wrote Dockerfiles and used Docker to package projects as images, both locally and pushed to AWS ECR. Worked with varied Docker setups (bash/shell entrypoints) and have a firm command of the Docker CLI.
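A minimal Dockerfile of the kind described, assuming a Python service with a `requirements.txt` and a `main.py` entrypoint (both names illustrative):

```dockerfile
# Illustrative Dockerfile for a Python service; paths and versions are assumptions.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
ENTRYPOINT ["python", "main.py"]
```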
- Placed the AWS resources I created in public and private VPC subnets according to requirements, and set up a VPN and bastion host to access private-subnet resources.
- Set up VPC peering between resources in different VPCs, within the same region and across regions.
- Used and managed CodePipeline in AWS to automate Docker builds when changes were pushed to the selected GitHub branch in a project.
- Created an AWS RDS instance in a private subnet and maintained it throughout my tenure.
- Wrote CloudFormation templates in YAML to create and tear down AWS resources as required.
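A minimal template of this kind looks like the following; the resource set and bucket name are purely illustrative:

```yaml
# Illustrative CloudFormation template; bucket name is an assumption.
AWSTemplateFormatVersion: "2010-09-09"
Description: Example S3 bucket managed via CloudFormation
Resources:
  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: example-scraper-data
      VersioningConfiguration:
        Status: Enabled
```

Deleting the stack (`aws cloudformation delete-stack`) then tears down the resources it created, which is what makes template-driven create/delete workflows repeatable.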
- Worked extensively with Lambda functions, and wrote a Python script to automatically update the container images for all Lambda functions.
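A bulk image-update script can be sketched with boto3's `update_function_code` call; the function-to-repository mapping and account details below are hypothetical:

```python
def image_uri(account_id: str, region: str, repo: str, tag: str = "latest") -> str:
    """Build the ECR image URI a container-image Lambda should point at."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repo}:{tag}"

def update_all_lambdas(function_repos: dict, account_id: str, region: str):
    """Point every container-image Lambda at its freshly pushed ECR image."""
    import boto3  # requires AWS credentials at runtime
    client = boto3.client("lambda", region_name=region)
    for function_name, repo in function_repos.items():
        client.update_function_code(
            FunctionName=function_name,
            ImageUri=image_uri(account_id, region, repo),
        )
```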
- Created and managed Amazon Elastic Container Registry (ECR) repositories, and defined ECS tasks running on Fargate with suitable task definitions and CPU/memory allocations.
- Owned cost and billing management for all AWS resources, driving cost-cutting by adopting suitable practices and decommissioning redundant resources along the way.
- Extensively used the boto3 library (AWS SDK for Python) to interact with AWS resources programmatically across a wide range of tasks.
- Used Python to connect to an RDS (PostgreSQL) database in many cases, performing inserts, updates, deletes, checksums, and other operations.
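One common pattern for such work is a parameterized upsert; this is a minimal sketch assuming the `psycopg2` driver (table and column names are hypothetical):

```python
def build_upsert(table: str, columns, conflict_col: str) -> str:
    """Build a parameterized INSERT ... ON CONFLICT DO UPDATE statement."""
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in columns if c != conflict_col)
    return (f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
            f"ON CONFLICT ({conflict_col}) DO UPDATE SET {updates}")

def upsert_rows(dsn: str, table: str, columns, conflict_col: str, rows):
    """Apply the upsert against RDS (needs psycopg2 and network access)."""
    import psycopg2  # assumed driver; any DB-API Postgres driver works similarly
    sql = build_upsert(table, columns, conflict_col)
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(sql, rows)
```

Keeping the SQL construction separate from the connection handling makes the statement easy to unit-test without a live database.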
- Used Python for data-engineering work, since the collected data required cleaning, manipulation, validation, and appending, among other adjustments.
- Applied object-oriented programming concepts to make the code reusable and eliminate redundancy.
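One way such reuse typically looks in a scraper codebase (the class names and CSV format here are invented for illustration): a shared base class owns the common pipeline, and each scraper subclass supplies only its parsing step.

```python
from abc import ABC, abstractmethod

class Scraper(ABC):
    """Shared scraper skeleton; subclasses supply only the parsing step."""

    def __init__(self, source: str):
        self.source = source

    def run(self, raw: str) -> dict:
        # Common pipeline shared by every scraper, so glue code is never duplicated
        record = self.parse(raw)
        record["source"] = self.source
        return record

    @abstractmethod
    def parse(self, raw: str) -> dict:
        ...

class CsvScraper(Scraper):
    def parse(self, raw: str) -> dict:
        name, price = raw.split(",")
        return {"name": name, "price": float(price)}
```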
- Maintained the codebase to a high standard and routinely debugged teammates' code as part of my regular tasks.
- Used GitHub to collaborate on code across the team, consistently following the established best practices.