Plato Systems

Plato Systems is on a mission to supercharge manufacturing operations by using spatial activity data and AI to improve capacity, productivity, and safety. Spun out of Stanford in 2019, we have built an intelligent analytics platform powered by our revolutionary device, which uses machine perception and sensor fusion to deliver an operations digital twin that, for the first time, combines factory operator activity with machine data. We’re funded by NEA and have a strong team from industry (Uber, Apple, Amazon, Google) and academia (Stanford). You can find out more about us by visiting our website and our Notion page.

Our mission and team expertise span beyond software to advanced sensor systems, algorithms, embedded systems, signal processing, and machine learning. We build and deploy edge software and cloud services for real-time, customer-facing products as well as internal big-data tools. As a solutions engineer, you will sit at the intersection of technology and business, collaborating with different teams to transform data generated by our fleet of edge devices into data products that our end customers can consume.

A successful candidate has a strong foundation in software development (including proficiency in Python), business acumen, autonomy, and great communication skills. They must be able to translate complex technical concepts into business-friendly language and communicate effectively with both technical and non-technical stakeholders.

Responsibilities

  • Understand customers’ existing digital operations data and how it would integrate into Plato’s data fusion platform
  • Tailor Plato applications to meet each customer’s specific needs
  • Perform exploratory data analysis and data science: generate and discover actionable insights from Plato’s unique spatio-temporal data, conduct hypothesis testing, and apply statistical methods to assess validity
  • Understand the context of customers’ KPIs and business needs to help create products or experiences that address customer pain points
  • Become an internal expert and consultant for each customer insight module
  • Rapidly generate and iterate through hypotheses to build relevant, story-driven insights
  • Tell compelling stories that use data and data visualization to convey the power of each insight
  • Work in a data-driven environment, drive process improvement, and work with stakeholders to translate high-level business goals into working software solutions and customer-facing outputs
  • “Own” your work: take initiative, be proactive, ask questions to bring clarity, share ideas and challenge the norm, and see projects through to completion
  • Provide training to customers on how to leverage Plato’s technology to solve business problems and provide ongoing technical support post-deployment
  • Own customer deployments from start to finish

Qualifications

  • BS, MS, or PhD in an engineering/quantitative field, such as computer science, electrical engineering, mechanical engineering, pure or applied sciences (math, physics, …), etc.
  • 3+ years combined as a solutions engineer, applications engineer, data scientist, senior data analyst, or similar technical role at a fast-growing technology company
  • Proficiency in Python (NumPy, SciPy, pandas, Matplotlib, and Plotly) and object-oriented programming
  • End-to-end experience with data, including databases, querying, aggregation, analysis, and visualization
  • Familiarity with networking and Linux environments
  • Able to work from our San Francisco office at least three days/week
  • A proactive problem solver who can collaborate with internal and external partners and communicate effectively with diverse audiences including engineering teams, customers, and contractors
  • Strong analytical skills, with the ability to analyze data to identify issues and measure success
  • Comfortable working independently in a fast-paced and ambiguous startup environment
  • Willingness to travel ~3 days per month
  • Valid passport or ability to obtain one

Preferred Qualifications

  • Experienced and comfortable with engaging with customers
  • Ability to understand large data sets and use appropriate technologies and methodologies to manipulate data (e.g., SQL, R, Python, ETL pipelines)
  • Prior experience with modern data-stack technologies such as Databricks and Airflow
  • Phenomenal data visualization and analysis skills, with expertise in tools such as Looker, Tableau, Power BI, and Seaborn
  • Analytical and logical, with strong judgment, negotiation, and interpersonal skills
  • Detail-oriented and forward-looking, with strong leadership and ownership abilities