Jobs

Apply Now

Applications processed via employer's online application form

Job Details

Category

IT Consultancy

Location

Belfast, Northern Ireland

Closing Date

26 February 2024

Data Virtualisation Engineer Denodo

    • Redditch, UK
    • Employees can work remotely
    • Full-time
    • Department: Digital Development & Architecture

    Company Description

    We pledge "to prove IT can make a real difference to our customers' businesses". We work hard to ensure we understand what our customers need from their technology solutions, and then we deliver.

    We are an award-winning company that provides world-class customer service; we think big and we hire great people. Version 1 is more than just another IT services company - we are leaders in implementing and supporting Oracle, Microsoft and AWS technologies.

    Invest in us and we’ll invest in you; if you are driven, committed and up for a challenge, we want to meet you.

    Job Description

    This is an exciting opportunity for an experienced Data Virtualisation Engineer to join a large-scale data solution project. You will join a team delivering a transformative data virtualisation solution using the Denodo platform for a key Version 1 customer.

    The ideal candidate will have a proven track record of implementing data ingestion and data virtualisation solutions using the Denodo platform for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies related to data virtualisation and data engineering, who will play an important role in developing and delivering an early proof of concept and the production implementation.

    You will ideally have experience building data solutions using the Denodo platform and a variety of open-source tools, be comfortable using Microsoft Azure or AWS services, and have a proven track record of delivering high-quality work to tight deadlines.

    Your main responsibilities will be:

    • Designing and implementing highly performant data solutions running on a Denodo-based data virtualisation platform.
    • Coordinating implementation activities related to the Denodo-based data virtualisation platform.
    • Ensuring operational stability of data solutions running on the Denodo-based data virtualisation platform.
    • Staying on top of the latest trends in data virtualisation and growing your own skillset in the area.
    • Designing and implementing highly performant data ingestion and transformation pipelines from multiple sources using a variety of technologies.
    • Delivering and presenting proofs of concept of key technology components to prospective customers and project stakeholders.
    • Developing scalable and reusable frameworks for the ingestion and virtualisation of large data sets.
    • Integrating the end-to-end data pipeline from source systems to target data repositories, ensuring data quality and consistency are maintained at all times.
    • Working with event-based / streaming technologies to ingest and process data.
    • Working with other members of the project team to support the delivery of additional project components (data engineering, reporting tools, API interfaces, search).
    • Evaluating the performance and applicability of multiple tools against customer requirements.
    • Working within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.

     

    Qualifications

    You will have:

    • Denodo Platform 8.0 Certified Developer or Denodo Platform 8.0 Certified Architect certification.
    • Hands-on experience designing and delivering data virtualisation solutions using the Denodo platform.
    • Familiarity with the concepts of data virtualisation and data governance.
    • A Microsoft Azure or AWS Big Data Architecture certification.
    • Direct experience of building data pipelines using Azure Data Factory, AWS Glue or other ETL tools, and Apache Spark (preferably Databricks).
    • Experience building data warehouse solutions using ETL / ELT tools such as SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend or WhereScape RED.
    • Experience working with structured and unstructured data.
    • A comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing and cleansing routines using typical data quality functions involving standardisation, transformation, rationalisation, linking and matching.
    • Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala), would be an advantage.
    • Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) would be beneficial.
    • Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform is also an advantage.

    Additional Information

    Before you apply, here are some of our benefits. We offer profit share, pension, private healthcare cover, a flexible working policy and more. We offer incentives for accreditations and educational assistance for courses relevant to your role.

    We offer employee recognition in the form of Excellence Awards and V1Ps, which are awarded by your peers. Engagement is incredibly important to us, with local engagement teams driving our engagement events!
