14.05.2024
AWS Data Engineer (Contract) - Gauteng/Hybrid - ISB2212050
iSanqa
South Africa, Midrand
About the job
Our client requires the services of a Data Engineer/Scientist (Expert) - Midrand/Menlyn/Rosslyn/Home Office rotation.
- Amazing brand with cutting-edge technology
- Excellent teams with global team collaboration
- High work-life balance with flexible hours
- Agile working environment

POSITION
Contract until December 2026

EXPERIENCE
8+ years related working experience

COMMENCEMENT
As soon as possible

QUALIFICATIONS/EXPERIENCE
South African citizens/residents are preferred.
Relevant IT / Business / Engineering degree.
Certifications: candidates with one or more of the following certifications are preferred:
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator - Associate
- AWS Certified Developer - Associate
- AWS Certified Solutions Architect - Associate
- AWS Certified Solutions Architect - Professional
- HashiCorp Certified: Terraform Associate

ESSENTIAL SKILLS
Above-average experience/understanding of (in order of importance):
- Terraform
- Python 3.x
- SQL (Oracle/PostgreSQL)
- PySpark
- Boto3
- ETL
- Docker
- Linux/Unix
- Big Data
- PowerShell/Bash
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint

Additionally:
- Experience working with enterprise collaboration tools such as Confluence and JIRA.
- Experience developing technical documentation and artefacts.
- Knowledge of data formats such as Parquet, Avro, JSON, XML and CSV.
- Experience working with data quality tools such as Great Expectations.
- Knowledge of the Agile Working Model.
- Any additional responsibilities assigned in the Agile Working Model (AWM) Charter.

ADVANTAGEOUS TECHNICAL SKILLS
- Demonstrated expertise in data modelling with Oracle SQL.
- Exceptional analytical skills for analysing large and complex data sets.
- Ability to perform thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with the ability to work independently and multi-task.
- Experience building data pipelines using AWS Glue, AWS Data Pipeline or similar platforms.
- Familiarity with data stores such as AWS S3 and AWS RDS or DynamoDB.
- Experience with, and a solid understanding of, various software design patterns.
- Experience preparing specifications from which programs will be designed, coded, tested and debugged.
- Strong organizational skills.
- Experience developing and working with REST APIs is a bonus.
- Basic experience in networking and troubleshooting network issues.

Basic experience/understanding of AWS components (in order of importance):
- Glue
- CloudWatch
- SNS
- Athena
- S3
- Kinesis Streams (Kinesis, Kinesis Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild/CodePipeline
- CloudFormation
- Business Intelligence (BI) experience
- Technical data modelling and schema design (not drag-and-drop)
- Kafka
- AWS EMR
- Redshift

ROLE
Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP data platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.

NB: By applying for this role, you consent to be added to the iSanqa database and to receive updates until you unsubscribe. Please also note that if you have not received a response from us within 2 weeks, your application was unsuccessful.