DevOps Engineer

We are seeking a talented DevOps Engineer with proficiency in Google Cloud Platform (GCP) and Kubernetes to join our growing team. You will be responsible for building out our operational systems, deployment pipelines, and monitoring as we transition to GCP.

You will work as part of our Engineering Team from our Toronto office in The Junction. Our team has a presence in both Boston and Toronto, so you will collaborate regularly with remote colleagues.

Responsibilities
  • Write reusable, testable, and efficient code for deploying and managing infrastructure
  • Design and implement automated build and deployment pipelines
  • Manage configuration and secrets
  • Set up and manage instrumentation for application performance monitoring
  • Improve application performance and reduce cost by right-sizing infrastructure
  • Manage continuous integration/continuous deployment (CI/CD) strategies
  • Provide technical guidance and educate team members on DevOps best practices
  • Document deployment processes

Requirements
  • Intellectual curiosity and a strong desire to learn
  • Problem-solving skills, including the ability to break down complex problems and implement solutions incrementally
  • A passion for efficiently supporting highly available applications
  • Experience with Docker, containerized services, and Kubernetes
  • Experience with cloud services like GCP
  • Experience with continuous integration tooling like Travis CI
  • Experience with application logging / tracing / performance monitoring
  • Experience with automation/configuration management
  • Able to multitask, prioritize, and manage time efficiently
  • Experience with Linux infrastructure
  • Experience with monitoring tools like Graphite and Prometheus
  • Experience with a testing framework
  • Experience using Git
  • Experience with SQL and NoSQL databases
  • Proficient with scripting languages

Great to Have

  • Undergraduate degree in Computer Science, Engineering, or a related field
  • Exposure to Big Data/NoSQL technologies (Spark, HDFS, MapReduce, HBase, Cassandra, MongoDB, etc.)
  • Experience with one or more data analysis toolkits (pandas, R, or MATLAB)
  • Experience with workflow authoring/scheduling applications, such as Luigi or Airflow