Java / Scala / Hadoop Developer - Financial - London

This position will suit a highly capable developer with at least two years' experience working with Java, Hadoop and related technologies.

You will be part of a team building a calculation platform on the Hadoop / Spark / Java stack to run forecasting calculations for a leading UK retail bank. Our client is looking for someone who is happy to be flexible and take on different responsibilities over the lifetime of the programme.

Duration: 7 months

Key Responsibilities:

- Interaction with stakeholders, including business teams for risk and reporting as well as model developers; requirements gathering and analysis.

- Assisting in the design of the platform by mapping business and technical requirements to capabilities of existing components and identifying gaps; design of individual components to fill the gaps identified.

- Development of software components to coordinate data and analytic operations based on the Hadoop stack (includes extension of existing components as well as addition of new ones).

- Integration of dev ops processes, both for delivery of the technology solution and models built on top of the technology; systems and scripting work will be required.

- Assuming a level of ownership for the components you have designed and/or built; i.e. ensuring fitness for purpose, usability, robustness, maintainability and performance.

- Testing, deployment planning and handover.

Role Requirements - Technical Skills and Experience:

- Software development: Solid foundation of Java coding skills. Experience using Scala in a real-world context is a distinct advantage. Python, R and SQL-like languages will also be useful, and additional languages are a plus, though not directly required. Event-driven programming using promises and futures, or the reactive pattern (a minimal sketch follows below).
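
As a rough illustration of the futures style meant here, a minimal Scala sketch follows. The function names (loadPositions, loadRate) and values are hypothetical stand-ins, not part of the role; a real system might equally use promises or a reactive library.

    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._

    object FuturesSketch extends App {
      // Two independent lookups run concurrently on the global pool.
      // Both functions are invented stand-ins for real data sources.
      def loadPositions(): Future[Seq[Double]] = Future(Seq(100.0, 250.5, 80.25))
      def loadRate(): Future[Double]           = Future(1.27)

      // Event-driven composition: the sum runs only once both futures complete.
      val exposure: Future[Double] = for {
        positions <- loadPositions()
        rate      <- loadRate()
      } yield positions.sum * rate

      // Blocking here only to print in a demo; a service would instead
      // register a callback (onComplete) or return the Future to its caller.
      println(Await.result(exposure, 5.seconds))
    }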

- Component design: Understanding of OOP, functional, declarative and actor paradigms, their strengths, weakness and where to use each pattern to achieve simplicity, flexibility, robustness and performance. Ability to design and implement interfaces and high-level software components. Confidence discussing and challenging architecture (guidance will be provided on platform architecture and design of distributed components).

- Hadoop experience: Real-world experience using Hadoop successfully to solve a business problem is essential. We are particularly interested in Spark, HBase, Phoenix, HDFS and YARN (choosing the right components for the task at hand will count for more than having used that specific set of technologies); a minimal Spark sketch follows.
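
A hedged sketch of the Spark side of that stack, assuming a local session for illustration (on the platform this would be submitted to a YARN cluster, and the (account, value) input is invented):

    import org.apache.spark.sql.SparkSession

    object SparkSketch extends App {
      // Local session for illustration only.
      val spark = SparkSession.builder()
        .appName("forecast-aggregation")
        .master("local[*]")
        .getOrCreate()
      import spark.implicits._

      // Hypothetical (account, forecastValue) pairs; in practice these
      // would be read from HDFS rather than hard-coded.
      val forecasts = Seq(("acc-1", 10.0), ("acc-2", 4.5), ("acc-1", 2.5)).toDS()

      // Spark records these transformations as a lazy DAG and only
      // executes when show() (an action) is called.
      forecasts.groupByKey(_._1)
        .mapValues(_._2)
        .reduceGroups(_ + _)
        .show()

      spark.stop()
    }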

- Performance Optimisation: Understanding of the key performance drivers for distributed systems: data locality, layout and marshalling, caching, and disk and memory access patterns. Empirical performance testing, profiling and tuning (sketched below). Guidance in performance-tuning distributed systems will be provided, so expert-level skills in this area are not required; however, familiarity with and understanding of the key concepts will be needed.
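
A small sketch of the kind of layout and caching decisions meant here, again assuming Spark; the partition count, key scheme and data volume are arbitrary:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.storage.StorageLevel

    object TuningSketch extends App {
      val spark = SparkSession.builder()
        .appName("tuning-sketch")
        .master("local[*]")
        .getOrCreate()
      import spark.implicits._

      // Hypothetical keyed values; in practice read from HDFS.
      val keyed = spark.range(0L, 1000000L).map(i => (i % 100, i.toDouble))

      // Lay the data out by key once and cache it, so the two aggregations
      // below reuse the same partitioning instead of recomputing the source.
      val byKey = keyed.repartition(8, $"_1").persist(StorageLevel.MEMORY_AND_DISK)

      byKey.groupBy($"_1").sum("_2").show(5)
      byKey.groupBy($"_1").max("_2").show(5)

      byKey.unpersist()
      spark.stop()
    }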

- Functional Programming: Use of functional paradigms to create stateless components for concurrent execution. Use of a DAG and lazy evaluation. A purist approach to functional programming is not required; this is more about understanding how functional concepts simplify design and eliminate execution bottlenecks (see the toy sketch below).
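
To make the statelessness and lazy-evaluation points concrete, a toy sketch in plain Scala (2.13+ for LazyList); the pipeline steps are invented for illustration:

    object LazySketch extends App {
      // Pure, stateless steps: each depends only on its input, so they
      // are safe to run concurrently or to reorder.
      val clean: Double => Double    = v => if (v.isNaN) 0.0 else v
      val scale: Double => Double    = _ * 1.05
      val pipeline: Double => Double = clean andThen scale

      // LazyList defers work until take(3) forces it, much as Spark
      // builds a DAG of transformations and only executes on an action.
      val results = LazyList.from(1).map(i => pipeline(i.toDouble))
      println(results.take(3).toList) // List(1.05, 2.1, 3.15), up to rounding
    }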

- Dev Ops: Configuration of complex CI / CD pipelines in e.g. Jenkins, Travis, etc.

- Unix: Familiarity with the Unix command line and common utilities is a distinct advantage.

- Development methodologies: Familiarity with agile working and test-driven development. Use of JIRA.

- Education: An excellent result in computer science or another STEM subject is preferred, at either MSc or BSc level, but alternative education paths will be considered.

- Knowledge of financial services is an advantage but not essential, as guidance and support can be provided.