Key Privacy Information

When you apply for a job, ComputerJobs will collect the information you provide in the application and disclose it to the advertiser of the job.

If the advertiser wishes to contact you, they have agreed to use your information in accordance with data protection law.

ComputerJobs will keep a copy of the application for 90 days.

More information is available in our Privacy Policy.

 

Job Details

 

Data Analyst - AWS - Snowflake - Redshift - DBT - Programming (Python/Scala) - ETL - SQL (Contract)

Location: London/Remote
Country: UK
Rate: £350 - £400 a day
 

Data Analyst - AWS - Snowflake - Redshift - DBT - Programming (Python/Scala) - ETL - SQL

Data Analyst - AWS - Snowflake - Redshift - DBT - Programming (Python/Scala) - ETL - SQL required for an urgent start. The successful Data Analyst will be working for an award-winning consultancy.

Data Analyst - AWS - Snowflake - Redshift - DBT - Programming (Python/Scala) - ETL - SQL Key Skills:

  • Proficiency in AWS services relevant to data engineering such as S3, Glue, EMR, Athena, and Lambda.
  • Hands-on experience with cloud data warehousing solutions such as Snowflake and Redshift.
  • Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline.
  • Strong programming skills in technologies like Python, Scala, Spark, PySpark or Ab Initio for building data pipelines and ETL processes.
  • Experience with data pipeline orchestration tools like Apache Airflow.
  • Knowledge of SQL and ability to write complex queries for data transformation and analysis.
  • Understanding of data modelling principles and best practices.
  • Familiarity with version control systems like Git for managing codebase and configurations.
  • Prior experience leading a data engineering team.
  • Agile practitioner with strong stakeholder management skills.
  • Able to manage and prioritise the backlog.

Data Analyst - AWS - Snowflake - Redshift - DBT - Programming (Python/Scala) - ETL - SQL - Contract - UK/Remote - Inside IR35


Posted Date: 02 May 2024
Reference: JS
Employment Business: We Are Orbis Group Ltd
Contact: Tom Jeffs