Project timeline: 3 weeks

• Evaluate the external APIs (5+), their documentation, and the data they expose
• Design and document the data integration approach for ingesting external/internal APIs into an AWS data lake
• Design and document the AWS data lake solution using AWS services, methods, and tools
• Note that the current state of the data is mostly batch ingestion
• Document the data that must be pulled from the given APIs
• Document any rate limits, throttling, and pagination behavior of the given APIs (an ingestion sketch follows this list)
• Design and document suitable AWS database solutions
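
As a rough illustration of the pagination and throttling concerns above, here is a minimal Python sketch of pulling one paginated API into the raw zone of an S3 data lake. The endpoint, page parameters, bucket, and key layout are hypothetical placeholders; the real values depend on the 5+ APIs in scope and the approved design.

```python
# Minimal sketch: paginated, rate-limited API ingestion into an S3 raw zone.
# API_URL, page parameters, and RAW_BUCKET are hypothetical placeholders.
import json
import time

import boto3
import requests

API_URL = "https://api.example.com/v1/records"   # hypothetical endpoint
RAW_BUCKET = "my-datalake-raw"                   # hypothetical bucket
PAGE_SIZE = 100

s3 = boto3.client("s3")

def fetch_page(page: int) -> requests.Response:
    """Fetch one page, backing off on HTTP 429 throttling responses."""
    delay = 1.0
    while True:
        resp = requests.get(
            API_URL,
            params={"page": page, "per_page": PAGE_SIZE},
            timeout=30,
        )
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        # Honor Retry-After if the API provides it; otherwise back off exponentially.
        time.sleep(float(resp.headers.get("Retry-After", delay)))
        delay = min(delay * 2, 60)

def ingest() -> None:
    page = 1
    while True:
        records = fetch_page(page).json()
        if not records:  # an empty page signals the end in this sketch
            break
        # Land each page as a raw JSON object, keyed by source and page number.
        s3.put_object(
            Bucket=RAW_BUCKET,
            Key=f"raw/example_api/page={page:06d}.json",
            Body=json.dumps(records).encode("utf-8"),
        )
        page += 1

if __name__ == "__main__":
    ingest()
```

In practice each API would get its own source prefix, and the page size, backoff ceiling, and end-of-data signal would be tuned to each API's documented limits.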

Expected Outcomes:

• PowerPoint presentation covering the high-level architecture, component details, and cost estimation
• Terraform framework to provision the required services
• Runbook for performing end-to-end testing (a smoke-test sketch follows this list)
• Strict adherence to the stated timeline
• The selected candidate may be engaged to implement/lead the whole project once the design and POC are approved
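
For the end-to-end testing runbook, a smoke test like the following Python sketch could serve as a first step: confirm that the ingestion job actually landed objects in the raw zone. The bucket and prefix names are hypothetical and would come from the approved design.

```python
# Minimal smoke-test sketch for the runbook: verify that the raw zone of the
# data lake contains at least one ingested object. Names are hypothetical.
import boto3

RAW_BUCKET = "my-datalake-raw"     # hypothetical bucket
RAW_PREFIX = "raw/example_api/"    # hypothetical prefix

def test_raw_zone_not_empty() -> None:
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=RAW_BUCKET, Prefix=RAW_PREFIX, MaxKeys=1)
    assert resp.get("KeyCount", 0) > 0, "no raw objects found; ingestion failed"

if __name__ == "__main__":
    test_raw_zone_not_empty()
    print("smoke test passed: raw zone contains at least one object")
```

Later runbook steps would extend this pattern to the downstream stores, for example checking row counts in the chosen database after each load.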

Preferred Qualifications

• Experience with highly available, fault-tolerant architectures
• Experience with serverless and container-based architectures
• AWS Data and Analytics Certification
• 5+ years of experience working with AWS Cloud Data and Analytics
• 5+ years of experience designing, implementing, or consulting on Big Data infrastructure
• 3+ years of experience with Terraform/Pulumi/serverless
• Experience communicating with technical and non-technical audiences, including executive-level stakeholders and clients
• 5+ years of hands-on Big Data solution development

Budget: $1,000
Posted On: April 04, 2023 02:25 UTC
Category: Solution Architecture
Skills: Amazon Web Services, ETL Pipeline, Big Data, Amazon S3, Apache Spark, Amazon Redshift, SQL, Python, Amazon RDS, API
Country: Canada