Apply Now

Applications processed via employer's online application form

Job Details


Location

Belfast, Northern Ireland

Closing Date

5 March 2021


Cloud Software Engineer

  • Team Context

    We are looking for talented and inquisitive individuals to help us grow our expertise in end-to-end cloud-enabled data architecture as part of Aflac’s global Cloud Data Transformation programme. We are looking to recruit Cloud Data Engineers to drive through change and spearhead Aflac’s journey from on-premises to the cloud with innovative data technology solutions.

    Our Belfast-based Cloud Solutions Engineering team will be integral to the cloud strategy, following DevOps principles and utilising AWS Cloud offerings, developing full-stack solutions and consolidating existing but disparate data hubs around the organisation, making Aflac’s data readily available for gathering deeper insights.

    The Cloud Solutions Engineering team will be responsible for designing, building and maintaining the data platform, using it to support innovative Big Data solutions that help to deliver broader insights across the company, improving not only our product offerings but how we support our policyholders. These cloud-based solutions will cover the full data lifecycle: discovery, quality, stewardship, ingestion, enrichment and exposure to multiple data consumers.

    This high-performing Agile team will be fully functional, concentrating on application development, API development, data engineering and testing capability, all located in Northern Ireland and partnering with SMEs in the United States.

    Role Context 

    You will be a hands-on technical contributor, using your experience to assist us in building large-scale AWS cloud-based distributed data processing systems and applications covering the full insurance lifecycle from quote through to claims. You will be an integral member of the team, assisting day to day in the execution and delivery of enterprise-level, business-critical projects. You will develop your skillset to cope with the evolving and increasing complexity of data-related problems, and you will freely pass on your knowledge to others. You will have autonomy and will be expected to be self-motivated, completing individual stories while remaining a team player. A key aspect of the role will be working with data engineers across the business, in multiple geographical locations, to ensure a full understanding of Aflac’s current data requirements and solutions, so that we can better develop solutions to serve our stakeholders and policyholders.

    What you’ll be doing

    • Put DevOps at the heart of everything: design, build and operationalise large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party services such as Infoworks, Kafka, Snowflake and Apigee.
    • Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud.
    • Analyse, re-design, re-architect and re-platform the on-premises Hadoop data warehouse to a data platform on AWS, using a combination of AWS-native, open-source and third-party services.
    • Adopt the AWS toolset, along with appropriate languages and tools including but not limited to Spark, EMR, DynamoDB, Kafka, Kinesis, Hive and Hadoop.
    • Work with the data solutions architecture team to help design and develop production data pipelines, data API solutions and visualisations, using appropriate technologies including but not limited to Scala, Python, Java, Spring Boot, Tableau and Power BI.
    • Design and implement full CI/CD data engineering, ingestion and curation functions on the AWS cloud using AWS-native services and bespoke programming.
    • Develop strong working relationships with local engineering teams to maximise the outcomes of the team and the benefit to customers.
    • Listen to the inputs of others and collaboratively work towards the best solution that maximises the value to the policyholder, without ego or prejudice.
    • Solve complex problems alongside fellow team members in a fun and engaging environment.
    • Bring an inbuilt curiosity and a questioning mindset, and be prepared to collaborate with a team of unique Cloud solution engineers on a journey of disruption and discovery to prove the business value gained from insights using the data solutions you have created.

    What you should have

    Below is an overview of the skills and experience we are looking for, but remember, don’t rule yourself out if you don’t have everything on the list – it’s your intellect and your attitude we are really after. 


    • Demonstrable experience working on large-scale data solutions, either on-premises or in the cloud.
    • Multiple years of IT experience with n-tier, database and client-server design and development, programming in languages such as Python, Java and Scala, and frameworks such as Spring and Spring Boot
    • Excellent written and verbal communication skills 
    • Bachelor’s degree, preferably in Computer Science, Engineering or another STEM subject. Comparable commercial experience may also be considered
    • Knowledge of general data engineering principles and data architecture 
    • Knowledge and experience of two or more of the key technologies required for the role: AWS, Kafka, Apigee, Infoworks, Snowflake, Informatica, EMR, S3, Tableau and Power BI


    The following would also be desirable:

    • Familiarity with working in a DevOps, Agile and Scrum environment
    • Previous experience with object-oriented programming languages such as C++ and C#
    • Previous experience with relational databases (RDBMS)
    • MS Azure or GCP experience

    So that’s us. Thanks for taking the time to read this far (pretty impressive in the era of the 280-character tweet). We look forward to hearing from you if you fancy joining a new tech innovation company with the agility of a start-up and the stability of a Fortune 500 U.S. company.
