Pico fuels the global capital markets community by providing exceptional market data services and customized managed infrastructure solutions. As financial industry experts at the center of markets and technology, we help our clients efficiently scale their business and quickly access markets. From infrastructure to connectivity, we support our clients through the full trading lifecycle. We are a global company headquartered in New York, with offices in Chicago, London, Singapore and Tokyo.
The Role
This is an opportunity to join a world-leading engineering team in either Krakow or Dublin, working on the design and development of new features across the entire Corvil product range. We are at a tremendously exciting stage, with extensive innovative work ongoing and an ambitious vision to fulfil.
Responsibilities
Working alongside talented engineers in the Big Data team, you will:
- Design, develop, and integrate new capabilities into the Pico Big Data platform.
- Develop applications that provide insights into trading data (order entry, market data, etc.).
- Use best-practice CI/CD methodologies to produce a high-performance system that is secure, easy to deploy and maintainable.
- Develop innovative analytics solutions combining financial services and network data.
- Act as a domain expert to advise on and troubleshoot production deployments.
- Collaborate with the Machine Learning team to integrate new algorithms into the platform.

Requirements
- Bachelor's Degree in Computer Science or equivalent.
- Passion for Computer Science, Big Data and learning new technologies.
- 4+ years of experience with Java (or evidence of similar proficiency).
- 3+ years of experience working in a Linux/Unix environment.
- Good working knowledge of IP networking protocols and systems.
- An understanding of how to write maintainable, modular code that conforms to coding standards.
- An understanding of what it takes to deliver and code a high-availability, high-performance system.
- Passion for technology, design and quality: you know what high-quality code looks like, together with appropriate unit, integration and system tests.
- Database design and SQL query development.
- Experience with Big Data technologies such as Hadoop, Spark, Kafka, Druid, Presto or similar.
- Experience with RESTful web services.
- Experience with in-memory solutions an advantage.
- Understanding of the concepts and technology ecosystem around both real-time/streaming and batch processing in big data platforms.
- Proven track record of developing distributed solutions that support large, complex datasets.
- Passing knowledge of financial services concepts, e.g. electronic trading, market data.
- Ability to drive Software Development Life Cycles (SDLC) such as Waterfall, Agile or Kanban, depending on the needs of specific projects.

Working Arrangements
This is a hybrid position with weekly time in the office and the flexibility of working from home. Though travel may be required from time to time, it is not expected to be regular or frequent. The role holder will be expected to work whatever hours are necessary for the performance of this role (recognizing that it involves multiple jurisdictions/geographies, including but not limited to EMEA, USA and APAC).
IMPORTANT DATA PRIVACY INFORMATION: This position is available with PICO GLOBAL LTD. The controller of your personal data will be PICO GLOBAL LTD.
For further information on what personal data we collect, how we will process your personal data and your rights with respect to your personal data please read our Pico Job Candidate Privacy Notice.
Be a part of the Pico family
Pico is an equal opportunity employer. Pico does not discriminate on the basis of a candidate's age, race, gender, color, religion, sexual orientation, physical or mental disability, or other non-merit factors. All employment decisions at Pico are based on business needs, job requirements and qualifications. If you require any assistance or accommodations to be made for the recruitment process, please inform us when you submit your online application.