Junior Quant Data Engineer
Boston, MA, United States
Job Description Summary
For over forty years, HarbourVest has been home to a committed team of professionals with an entrepreneurial spirit and a desire to deliver impactful solutions to our clients and investing partners. As our global firm grows, we continue to add individuals who seek a collaborative, open-door culture that values diversity and innovative thinking. In our collegial environment that’s marked by low turnover and high energy, you’ll be inspired to grow and thrive. Here, you will be encouraged to build on your strengths and acquire new skills and experiences. We are committed to fostering an environment of inclusion that promotes mutual respect among all employees. Understanding and valuing these differences optimizes the potential of both the individual and the firm. HarbourVest is an equal opportunity employer.
This position will be a hybrid work arrangement, which translates to a minimum of 2-3 days per week in the office.
The hands-on Junior Quant Data Engineer will be a member of the Quantitative Investment Science team and will help design, architect, and develop a cloud-based large data and analytics platform, using both traditional RDBMS systems such as SQL Server and modern tools (such as Spark, Databricks, Azure Synapse, or equivalent) and best design practices, within our cloud-based Investment Data Analytics Platform, a strategic asset that drives quantitative research and investment decision-making. This person will also be responsible for identifying the right tool for the job and building quick prototypes/POCs to validate it. They flourish in an evolving, fast-paced environment and bring a work style marked by high energy, flexibility, quick learning, and collaboration.
The ideal candidate has:
Strong knowledge of T-SQL or equivalent
Python experience building enterprise-level, production-quality software
Strong experience implementing efficient ETL and ELT processes for large data sets, preferably with market and trading data providers
Proven understanding of data modeling and various data management concepts such as master data management, reference data management, data governance, data lineage, data cataloging, data audit capability, and data quality
Strong understanding of DW concepts such as persistent staging, slowly changing dimensions (SCD), map and reduce, and time series objects
Knowledge of MS Azure cloud, plus strong experience with and understanding of DataOps practices and tools (including DevOps/Git source control and CI/CD tools)
Experience building data lakes and working with both structured and unstructured data
Strong analytical, problem-solving, and communication skills
What you will do:
Issue resolution
Version control, release planning, and deployment preparation
Contribution to logical and physical data models
Creation and updating of project artifacts and tools such as task boards, problem lists, user stories, and technical documentation
Analyze, modify, enhance, and troubleshoot complex stored procedures for reporting and integration
Provide realistic effort estimates and project timelines for development and maintenance projects
Work closely with business analysts to understand requirements and communicate solutions
Align to project timelines, deliver quality results, and participate in agile ceremonies
And other responsibilities as required
What you bring:
3+ years’ proven experience designing/architecting/developing data and analytics platforms
3+ years of SQL using complex queries and writing stored procedures
2+ years of Python experience
3+ years of Agile methodology experience
Education Preferred:
Bachelor of Arts (B.A.) or Bachelor of Science (B.S.), or equivalent experience
#LI-Hybrid