Job Details

Location: Belfast, Northern Ireland
Salary: Commensurate with Experience
Experience: 3+ years
Closing Date: 26 April 2019, 15:00

Big Data Engineer

  • Who we want
    Are you passionate about big data? Does running 20,000 reducers and 10,000 mappers excite you? Are you amazed thinking about using 60 AWS r4.8xlarge instances at 100% CPU, 100% memory (14 TB), and 100% network utilization with a 13x data explosion?
     
    You may not have heard of Bazaarvoice, but you probably have used our software on a regular basis…
     
    We provide consumer-generated content to over 700 million devices and half a billion unique users every month. That content is hosted and collected on over 5,000 brand and retail websites around the world. Our mission: build the world’s smartest network of consumers, brands, and retailers. If you want to build software that impacts over 500 million users, helping them make smart buying decisions, work with large data sets, build large distributed systems, and more, then take a look at the requirements below and submit your CV — even if you’re just close.
     
    Essential Skills
    BSc in Computer Science or equivalent (education or work-based)
    3+ years building software in a commercial environment (preferably SaaS)
    2+ years’ experience with one or more of the following: Java, Scala, C#, or a similar language
    Working knowledge of at least one scripting language – PHP, Python, Bash, JavaScript, etc.
    Able to demonstrate software engineering fundamentals such as OO design, unit testing, code reuse, code reviews, runtime analysis, etc.
     
    Desired Criteria
    Demonstrated experience with Hadoop
    AWS experience, including one or more of the following: CloudFormation, EMR, S3, EC2, CloudTrail, etc.
    Experience in the design, implementation, and support of major components of an analytics solution.
    Experience with Storm or Spark
    Familiar with Maven, Hudson, Git, etc.
    Working with TB+ size data sets
    Performing large-scale analytics using Map/Reduce
    Demonstrated experience in the design and development of large-scale, performant, and fault-tolerant applications
    Experience in defining data to be captured, how it would be aggregated, and how to implement those mechanisms
    Experience and passion in working on agile teams using Scrum, Kanban, or similar
    Experience building large-scale data processing systems, with knowledge of data-warehousing solutions, including developing prototypes and proofs of concept for the selected solutions
    Familiarity with one or more big-data infrastructures such as HBase, Hadoop, MongoDB, Cassandra, or an RDBMS
    An understanding of, and experience building, high-performance algorithms