Details:
Skillsets required (Mandatory Skills):
- Working knowledge of the AWS platform and its key services: AWS Amplify, Lambda functions, DynamoDB, CloudFormation (CF) templates, S3, Step Functions, SageMaker, ECS (Elastic Container Service), EMR (Elastic MapReduce), ECR (Elastic Container Registry), RDS (Relational Database Service), VPC (Virtual Private Cloud), Apache Airflow
- Understanding of how to build scalable, real-time Big Data systems
- Prior experience with exploratory Big Data analysis and Big Data mining for inference and/or analytical algorithms
- Experience with ETL (Extract, Transform, Load), data warehousing, and data mining
- Ability to implement basic security and compliance aspects of the AWS platform and the shared responsibility model, and to define the billing, account management, and pricing models
- Working experience with Apache Spark and Hive
- Programming: Python, Core Java, Boto3 library
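For illustration only, a minimal Boto3 sketch of the kind of Python-on-AWS scripting this role involves (the bucket and table names are hypothetical placeholders):

    import boto3

    # List the objects in an S3 bucket (bucket name is a placeholder)
    s3 = boto3.client("s3")
    response = s3.list_objects_v2(Bucket="example-data-bucket")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])

    # Write one record to a DynamoDB table (table name is a placeholder)
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("example-jobs-table")
    table.put_item(Item={"job_id": "123", "status": "COMPLETED"})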
Optional Skills:
- Microsoft Azure, GCP (Google Cloud Platform) knowledge
- AWS Certifications
- Docker, Kubernetes
- Background and experience in Machine Learning, Data Mining, and Mathematical and Statistical Modelling
- React application knowledge
Min. Qualification:
Educational Qualifications: B.E./B.Tech
Skills Required:
AWS Amplify, Lambda functions, DynamoDB, AWS certifications