Plato Systems

Plato Systems is on a mission to supercharge manufacturing operations by using spatial activity data and AI to improve capacity, productivity, and safety. Spun out of Stanford in 2019, we have built an intelligent analytics platform fueled by our revolutionary device, which uses machine perception and sensor fusion to provide an operations digital twin that combines factory operator activity with machine data for the first time. We’re funded by NEA and have a strong team from industry (Uber, Apple, Amazon, Google) and academia (Stanford). You can find out more about us by visiting our website and our Notion page.

Our mission and team expertise span beyond software to advanced sensor systems, algorithms, embedded systems, signal processing, and machine learning. We are building and deploying edge software and cloud services for real-time, customer-facing products as well as internal big data tools. As a Senior Data Scientist, you will sit at the intersection of technology and business, working collaboratively with different teams to transform data generated by our fleet of edge devices into data products that are consumable by our end customers.

We currently have a full-time opportunity for a Senior Data Scientist at Plato Systems. In this role, you will sit at the intersection of data science, engineering, and product management, working collaboratively with different teams to transform data generated by our fleet of edge devices and other digital data into data products that provide actionable insights to our end customers. A successful candidate will be someone who can establish strong relationships with both our clients and our colleagues by understanding customers’ needs, and then creating and presenting data-driven solutions and stories that help them achieve their strategic and tactical goals.


Responsibilities

  • Work with a team of Data Scientists and Analysts to drive generation and discovery of actionable insights from Plato’s unique spatio-temporal data through exploratory data analysis, hypothesis testing, and statistical methods.
  • Work closely with Plato’s product managers, participate in customer discussions, and understand the context of the KPIs and customers’ underlying business needs (as opposed to what they ask for) in order to help create products or experiences that address customer pain points.
  • Strive to become an internal expert and consultant for each customer insight module.
  • Rapidly generate and iterate through hypotheses to build relevant, story-driven insights.
  • Write compelling stories that use data and data visualization to convey the power of each insight to the audience.
  • Work in a data-driven environment, drive process improvement, and work with stakeholders to translate high-level business goals into working software solutions and customer-facing outputs.
  • “Own” your work: take initiative, be proactive, ask questions to bring clarity, share ideas and challenge the norm, and anticipate and complete projects in a comprehensive manner.

Required Qualifications

  • Master’s or PhD in engineering, applied science, or a related quantitative field such as physics or economics
  • Proficiency in Python (NumPy, SciPy, pandas, Matplotlib, and Plotly) and object-oriented programming
  • End-to-end experience with data, including databases, querying, aggregation, analysis, and visualization
  • Familiarity with networking and Linux environments
  • Able to work from our San Francisco office at least three days per week
  • 5+ years of demonstrated experience working on data products and/or generating actionable insights
  • Self-starter: a methodical, high-velocity, motivated, responsible, innovative, and technology-driven person who performs well both solo and as a team member
  • A proactive problem solver with strong communication and project management skills to relay findings and solutions across technical and non-technical audiences

Preferred Qualifications

  • Experience with one or more of time-series analysis, statistical analysis, anomaly detection, regression, clustering, and computer vision fundamentals
  • Phenomenal data visualization and analysis skills, with expertise in Looker, Tableau, Power BI, Seaborn, etc.
  • Ability to understand large datasets and use appropriate technologies and methodologies to manipulate data (e.g. SQL, R, Python, ETL pipelines)
  • Prior experience with a modern data technology stack such as Databricks, Airflow, etc.
  • Strong written and verbal communication skills