
Java / Scala / Hadoop Developer (Consultant) - London - Financial

  • Job ref: Java / Scala / Hadoop Developer
  • Location: London
  • Sector: ICT
  • Job type: Permanent
  • Contact: Nick Hodson
  • Email: Nicholas.Hodson@clementmay.com
  • Contact phone: 0207 251 7322
  • Published: 29 days ago
  • Consultant: Nick Hodson

On behalf of a well-respected Partner, Clement May is currently looking for an experienced Java / Scala / Hadoop Developer to join the Partner's team based in London.

You will join a team building a calculation platform using the Hadoop / Spark stack to run forecasting calculations for a leading UK Retail Bank. The project is expected to run for three years from design to product handover, with a series of incremental releases delivered throughout the timeline in a semi-agile manner. The Partner will work in close collaboration with the client's own staff and product teams throughout the programme as their strategic delivery partner.

They believe small, multi-disciplinary teams are most effective in delivering complex change of this nature. As such, we are looking for someone with a collaborative mindset who is happy to be flexible and take on different responsibilities over the lifetime of the programme.

Key Responsibilities:

- Interaction with stakeholders, including business teams for risk and reporting as well as model developers. Requirements gathering and analysis.

- Assisting in the design of the platform by mapping business and technical requirements to capabilities of existing components and identifying gaps; design of individual components to fill the gaps identified.

- Development of software components to coordinate data and analytic operations based on the Hadoop stack (includes extension of existing components as well as addition of new ones).

- Integration of dev ops processes, both for delivery of the technology solution and models built on top of the technology; systems and scripting work will be required.

- Assuming a level of ownership for the components you have designed and/or built; i.e. ensuring fitness for purpose, usability, robustness, maintainability and performance.

- Testing, deployment planning and handover.

Role Requirements - Technical Skills and Experience:

This position will suit a highly capable developer with at least two years' experience working with Java, Hadoop and related technologies. Skills and experience should include:

- Software development: A solid foundation of Java coding skills. Experience using Scala in a real-world context is a distinct advantage. Python, R and SQL-like languages will also be useful, and additional languages are a plus though not directly required. Event-driven programming using promises and futures or the reactive pattern.
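To illustrate the promise/future style mentioned above, here is a minimal sketch using the JDK's own CompletableFuture. This is an illustrative assumption only; the platform itself may use a different async or reactive library, and the fetchValue method is a hypothetical stand-in for a real data source.

```java
import java.util.concurrent.CompletableFuture;

public class FuturesDemo {
    // Hypothetical async data fetch; in a real system this might
    // read from HBase or an upstream service off the calling thread.
    static CompletableFuture<Integer> fetchValue() {
        return CompletableFuture.supplyAsync(() -> 21);
    }

    public static void main(String[] args) {
        // Chain transformations without blocking: each stage runs
        // when the previous one completes (event-driven, not polled).
        CompletableFuture<String> result = fetchValue()
                .thenApply(v -> v * 2)              // transform the value
                .thenApply(v -> "result = " + v);   // format it

        // join() blocks only here, at the edge of the program.
        System.out.println(result.join());
    }
}
```

The point is that work is composed declaratively and triggered by completion events, rather than by threads waiting on each other.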

- Component design: Understanding of OOP, functional, declarative and actor paradigms, their strengths and weaknesses, and where to use each pattern to achieve simplicity, flexibility, robustness and performance. Ability to design and implement interfaces and high-level software components. Confidence discussing and challenging architecture (guidance will be provided on platform architecture and design of distributed components).

- Hadoop experience: Real world experience using Hadoop successfully to solve a business problem is essential. We are particularly interested in Spark, HBase, Phoenix, HDFS and YARN (choosing the right components for the task at hand will count for more than having used that specific set of technologies).

- Performance Optimisation: Understanding of key performance drivers for distributed systems: data locality, layout and marshalling, caching, and disk and memory access patterns. Empirical performance testing, profiling and tuning. Guidance in performance tuning distributed systems will be provided, so expert-level skills in this area are not required; however, familiarity with and understanding of the key concepts will be needed.

- Functional Programming: Use of functional paradigms to create stateless components for concurrent execution. Use of a DAG and lazy evaluation. A purist approach to functional programming is not required; this is more about understanding how functional concepts simplify design and eliminate execution bottlenecks.
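To make the lazy-evaluation point concrete, here is a minimal sketch using plain Java streams rather than Spark (an illustrative assumption only; Spark's RDD/Dataset API applies the same idea at cluster scale, where the recorded plan is a full DAG):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class LazyDemo {
    public static void main(String[] args) {
        // Building the pipeline evaluates nothing: map and filter are
        // stateless, side-effect-free stages recorded in a plan.
        Stream<Integer> pipeline = Stream.of(1, 2, 3, 4, 5)
                .map(x -> x * x)          // stateless transformation
                .filter(x -> x % 2 == 1); // keep only odd squares

        // Only the terminal operation triggers execution, letting the
        // runtime fuse the stages into a single pass over the data.
        List<Integer> result = pipeline.collect(Collectors.toList());
        System.out.println(result); // [1, 9, 25]
    }
}
```

Because the stages are stateless, the runtime is free to fuse, reorder or parallelise them, which is exactly what eliminates execution bottlenecks in a distributed setting.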

- Dev Ops: Configuration of complex CI / CD pipelines in, e.g., Jenkins, Travis, etc.

- Unix: Familiarity with the Unix command line and common utilities is a distinct advantage.

- Development methodologies: Familiarity with agile working and test-driven development. Use of JIRA.

- Education: An excellent result in Computer Science or another STEM subject is preferred, at either MSc or BSc level, but alternative education paths will be considered.

- Knowledge of financial services is an advantage but not essential, as guidance and support can be provided.