AUTHORIZED IT-70 SCHEDULE PRICELIST

Build complex workflows to move data between systems. Spark code is written in Scala and Python and runs on AWS. Performance tuning, experience working in big data environments, and AWS IAM concepts are all required skills. Minimum 2 years of Spark experience and 3 years of experience with AWS services as applied to Big Data, e.g., Redshift, S3, EMR, Lambda, and Athena.
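
For illustration, the following is a minimal sketch of the kind of Spark data-movement workflow described above, written in Scala for an EMR-style environment. The bucket names, paths, column name, and application name are hypothetical placeholders, not part of the pricelist.

    // Illustrative sketch only: a Spark job that moves data from one S3
    // prefix to another with a light transformation. All paths and names
    // below are hypothetical examples.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object MoveDataJob {
      def main(args: Array[String]): Unit = {
        // On EMR, the SparkSession picks up cluster and IAM-role
        // configuration from the environment; no credentials are
        // hard-coded here.
        val spark = SparkSession.builder()
          .appName("s3-to-s3-move")
          .getOrCreate()

        // Read raw CSV data from a source S3 prefix (hypothetical path).
        val raw = spark.read
          .option("header", "true")
          .csv("s3://example-source-bucket/raw/events/")

        // Light cleanup plus a repartition before writing, a common
        // performance-tuning step for large datasets.
        val cleaned = raw
          .filter(col("id").isNotNull)
          .repartition(64)

        // Write Parquet to a destination prefix (hypothetical path),
        // suitable for downstream querying with services such as Athena.
        cleaned.write
          .mode("overwrite")
          .parquet("s3://example-dest-bucket/curated/events/")

        spark.stop()
      }
    }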