Data Engineer
Job description
My client is looking for Data Engineers with a range of expertise in the Data Engineering space and the eagerness to learn and deliver large, complex Digital Transformation programmes for one of our major public sector clients. As a Data Engineer, you will have worked on data integration into cloud data warehouses or data lakes, programming, APIs, etc. in an Agile environment.
What you'll Be Doing
- Implement data flows to connect operational systems, data for analytics and business intelligence (BI) systems
- Document source-to-target mappings
- Re-engineer manual data flows to enable scaling and repeatable use
- Support the build of data streaming systems
- Write ETL scripts and code to ensure the ETL process performs optimally
- Develop business intelligence reports that can be re-used
- Build accessible data for analysis
- Optimise code to ensure processes run efficiently
- Lead work on database management
What You'll Bring
The Data Engineers will need expertise in, or exposure to, the following skillset:
- AWS or Azure Cloud
- ETL
- Talend
- Pentaho PDI
- Cloudera Hadoop
- Denodo
- SQL
- AWS Redshift
- GitLab
- Java backend development
Nice to have: Kafka, Bash scripting, Spark, Java, Elasticsearch, Git Runner, Ansible, Artifactory, Linux, Berlin scheduler, CRON, Kibana, Sensu, Grafana, Confluence/Jira, ServiceNow, Pentaho Reporting, Power BI, Oracle RDS and Business Objects.
How can we help?
If you would like us to contact you with more information on what we do and what we can offer, please let us know.
Contact Us