Data Engineer

Salary: Base + Bonus + Benefits

Location: London, England, United Kingdom

Job Type: Full time

Data Engineer required with strong Python, SQL, Bash, R and Hadoop (HDFS/Hive) skills. As a Data Engineer you will collaborate with our Data Scientists, take responsibility for the development and management of data acquisition, and build the systems that bring data in-house and the software that puts insights into the hands of our investors.

Team Overview


The Data & Insights Technology team was established in early 2015 with the aim of delivering data insights capabilities and tools to serve the analytics needs of the business. Its success has led to the team taking on a broader role within Schroders, including supporting the Data Insights Unit (“DIU”).


The DIU is a team of data scientists tasked with helping investors within Schroders (i.e. Analysts and Fund Managers) make better investment decisions through better use of data. The DIU typically works with historical data (e.g. share prices) as well as new alternative data sets, essentially acting as an R&D department for Investment.

Overview of role


The data engineer will collaborate with our data scientists in our Data Insights Unit (the DIU) to create pipelines of information for the firm’s investors. Whilst the DIU’s data scientists test hypotheses and unravel the meaning within the data, the data engineer will build the systems that bring data in-house and develop the software that puts insights in the hands of the investors.


You will be responsible for the development and management of data acquisition and pipeline-building activities, allowing us to bring together data sets from diverse sources, including public sources, third-party vendors and internal data repositories. Familiarity with web, FTP, API, SQL and related ETL technologies will therefore be essential to the role.
For mature and well-understood data sources, you will collaborate with the data scientists to convert their models into deployable software. Balancing the competing demands of enterprise compliance (security, approvals, deployment timelines etc.) with agility (enabling the data scientists to continually adjust and adapt their models) will be essential to succeeding in this role.
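
Purely as an illustration of the kind of acquisition work described above (the vendor endpoint, field names and table are hypothetical, not part of the role description), a minimal Python sketch that pulls records from a vendor API and lands them in a SQL store:

import json
import sqlite3
import urllib.request

# Hypothetical vendor endpoint; a real feed would also need authentication,
# paging, retries and vendor licence checks.
VENDOR_URL = "https://api.example-vendor.com/v1/prices?ticker=SDR"

def fetch_prices(url):
    """Pull a JSON payload of price records from the vendor API."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

def load_prices(rows, db_path="prices.db"):
    """Land the raw records in a local SQL table for downstream modelling."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS prices (ticker TEXT, trade_date TEXT, close REAL)"
        )
        conn.executemany(
            "INSERT INTO prices (ticker, trade_date, close) "
            "VALUES (:ticker, :trade_date, :close)",
            rows,
        )

if __name__ == "__main__":
    load_prices(fetch_prices(VENDOR_URL))

In practice the same pattern scales out to HDFS/Hive rather than a local database, but the shape of the work (acquire, validate, land, hand over to the data scientists) is the same.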


This will enable the DIU to use advanced analytical and statistical techniques to make connections between these data sets and to answer important questions for the investors.
While the role’s primary focus is on data acquisition, pipelining and software development, any experience in data science will increase your understanding of your customers’ needs and the constraints of their models, and thus contribute to your success in this role.

Key responsibilities of the role:


• This is a hands-on role, building data pipelines and productionising advanced analytics at Schroders.
• The person will ideally have strong technical skills in a broad variety of data technologies across the stack, including Hadoop (HDFS/Hive), Python, SQL, Bash and R.
• Define and manage best practice in the configuration and management of the data store. Ensure that all data and systems are managed with the highest regard for data security and for upholding our obligations to the vendors who license data to us.
• Work with IT to align as closely as possible with the rest of the business, whilst still ensuring that the data scientists are able to innovate, research, and act with agility.
• Work with Data & Insights Technology colleagues to create and optimise ETL and data enrichment processes to maximise the usefulness of data for a variety of purposes. This includes working out how best to ingest data from new vendors and, where possible, engaging directly with vendors to improve their service in support of Schroders’ goals (see the illustrative sketch after this list).
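
Purely as an illustration of the enrichment step mentioned in the last bullet (file names and columns are invented for this sketch), joining a daily price extract onto vendor reference data with Python’s standard library:

import csv

# Hypothetical extracts produced by earlier pipeline steps:
# prices.csv (ticker, trade_date, close) and reference.csv (ticker, sector, currency).
def enrich(prices_path="prices.csv", reference_path="reference.csv", out_path="enriched.csv"):
    # Index the reference data by ticker for a simple lookup join.
    with open(reference_path, newline="") as f:
        reference = {row["ticker"]: row for row in csv.DictReader(f)}

    with open(prices_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["sector", "currency"])
        writer.writeheader()
        for row in reader:
            ref = reference.get(row["ticker"], {})
            row["sector"] = ref.get("sector", "")
            row["currency"] = ref.get("currency", "")
            writer.writerow(row)

if __name__ == "__main__":
    enrich()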

 
Skills / Experience
Essential


• SQL
• Python
• Comfortable on the command line, with shell scripting experience.
• Ability to take on and assimilate new ideas, assess the technology landscape and identify useful solutions; a creative thinker.
• Delivery-oriented
• Some experience with Linux / Hadoop


Useful


• Excellent Maths at A Level.
• Relevant degree subject (e.g. Computer Science)
• Meaningful experience with multiple types of data warehouse technologies
• Data preparation experience.
• Cloud-based data solution implementations (e.g. using AWS / Azure)
• Cloud-based data warehouse solutions (e.g. Amazon Redshift / Google Bigtable / SQL Server PDW)
• Asset Management / Investment practices
• Knowledge of tools within the BI landscape, including modern reporting tools such as Tableau, Qlik or MicroStrategy
• Familiarity with service design, self-serve and centre-of-excellence models.
• Familiarity with Test Driven Development or DevOps practices

Enterprise skills


• Understands data governance principles and is able to effectively engage with data governance practice.
• Understands typical enterprise concerns around data sharing, including awareness of legal, compliance, governance, information security etc.
• Familiarity with agile methodology.

Note To Agencies:


Schroders does not accept speculative CVs from agencies. We have a PSL of agencies who are invited to support us when required. We only pay fees to agencies instructed to send CVs that are submitted through our recruitment portal. We do not pay fees on speculative or unsolicited CVs sent to Schroders or Schroders employees, and we reserve the right to contact candidates from unsolicited CVs directly.