Big Data Engineer

UserTesting,
Edinburgh, City of Edinburgh

Overview

Company Description

UserTesting enables every organisation to deliver the best customer experience, powered by human insight. UserTesting's market-leading, on-demand Human Insight Platform is used to make accurate, customer-first decisions and quickly create great customer experiences. Our customers include 48 of the top 100 brands in the world, and we have delivered human insights to over 35,000 companies to date. One of Silicon Valley's fastest-growing private companies, UserTesting prides itself on a great company culture, creating a rewarding and supportive environment for our employees. Headquartered in San Francisco, with offices in Atlanta and now Edinburgh, this is a great opportunity to become part of an amazing growth story.

Job Description

As a Big Data Engineer, you'll design, develop, and tune data products, applications, and integrations on large-scale data platforms, with an emphasis on performance, reliability, scalability, and above all quality. You'll support our machine learning efforts by building large-scale distributed infrastructure for rapid experimentation, training, and inference. You are passionate about applying cutting-edge machine learning to real-world problems and building the frameworks and tools required to do so. This role is based in Edinburgh, Scotland, and is permanent.
Responsibilities

- Work closely with product and design to discover and build solutions that help our customers build great user experiences
- Collaborate with engineers who are both remote and co-located in our Mountain View, San Francisco, and Atlanta offices
- Work effectively within a team environment: regularly solicit and act on feedback, focus on root causes, and continually strive to improve
- Enhance our customer-facing platform, tester panel distribution systems, video playback tools, and mobile device recording capabilities
- Advocate for, and lead by example on, best practices for code quality in architecture and design, maintainability, performance, and scalability
- Lead on promoting just-right solutions that build for the future while avoiding costly premature optimisation

Requirements

- At least 5 years of software development experience
- At least 3 years of experience with Big Data systems
- Strong in one or more languages (Python/Ruby/Scala/Java/C++)
- Strong experience on a professional software development team building highly scalable, distributed systems in the cloud
- Experience in REST API design and implementation
- Experience with messaging, queuing, and workflow systems, especially Kafka or Amazon Kinesis
- Experience with non-relational (NoSQL) databases and various data-storage systems, especially Cassandra, Elasticsearch/Solr, Neo4j, etc.

Preferred Qualifications

- Experience working with machine learning, especially NLP
- Experience with software development on top of deep learning frameworks, especially TensorFlow/Keras
- Data engineering knowledge, including ETL, data warehousing, data visualisation, etc.
- Data modelling experience with columnar data formats
- Experience integrating with CI tools programmatically
- Experience with Docker, registries, and container deployment services (e.g., AWS ECS, Kubernetes)
Additional Information

Benefits: As well as a five-star-rated 'best place to work' environment and the opportunity to change the world, we offer a competitive salary, benefits, plenty of perks, and stock options.

We value diversity and are proud to be an inclusive, equal opportunity workplace.

Note to recruitment agencies: we have a preferred supplier list (PSL) for the provision of recruitment services and will not accept unsolicited CVs from suppliers not currently on our PSL.