The key benchmark in manufacturing is productivity, and Generative AI (GenAI) has unlocked new possibilities to achieve it. Leveraging this technology, operators and managers can now interact with their equipment via a chat interface through oee.ai. In this article, we explore the technology that powers this remarkable capability.
Open Source LLM as the Core
The core of the oee.ai chat function is a large language model (LLM). A large language model is an advanced AI system trained on vast amounts of text data to understand and generate human-like language. It predicts and constructs text based on patterns learned during training, allowing it to perform tasks like answering questions and engaging in conversation. These models are built using deep learning techniques, particularly neural networks with billions of parameters.
LLMs are available in different set-ups. oee.ai has opted for an open-source model hosted on its own servers to guarantee maximum data privacy for our customers. No data leaves our infrastructure, and everything is governed by European data protection laws.
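To make the self-hosted set-up concrete, here is a minimal sketch of how an application might talk to an open-source LLM running on its own infrastructure through an OpenAI-compatible chat endpoint. The endpoint URL, model name, and system prompt are hypothetical placeholders, not oee.ai's actual configuration.

```python
# Minimal sketch: querying a self-hosted open-source LLM via an
# OpenAI-compatible chat endpoint. URL and model name are hypothetical.
import json
from urllib import request

LLM_ENDPOINT = "http://llm.internal:8000/v1/chat/completions"  # hypothetical
MODEL_NAME = "open-source-llm"  # hypothetical

def build_chat_payload(user_message: str) -> dict:
    """Assemble the request body; nothing here leaves the local network."""
    return {
        "model": MODEL_NAME,
        "messages": [
            {"role": "system", "content": "You are a shopfloor assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.2,  # low temperature favours factual answers
    }

def ask_llm(user_message: str) -> str:
    """Send the chat request to the in-house endpoint and return the reply."""
    req = request.Request(
        LLM_ENDPOINT,
        data=json.dumps(build_chat_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint sits inside the company network, the same code works with any inference server that exposes the widely adopted OpenAI-compatible API shape.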
Agents are Doing the Work
Agents are autonomous entities that perceive their environment, make decisions, and take actions to achieve specific goals. They can operate independently or interact with other agents and systems, often using AI to adapt to changing conditions. Agents are commonly used in simulations, robotics, and AI applications like virtual assistants.
In oee.ai, agents interact with three other infrastructure components: the machine data as the core of oee.ai, a vector database to query specific documents using retrieval-augmented generation (RAG), and a relational database that stores a history of past conversations.
Agents are the diligent orchestrators of the tools required to answer the user’s question.
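The orchestration idea can be sketched as a dispatcher that routes a question to one of the three components described above. The tool functions below are stand-ins invented for illustration; in a real agent, the LLM itself decides which tool to invoke and how to combine the results.

```python
# Illustrative sketch: an agent routing a question to one of three tools.
# Tool bodies are stand-ins; a real agent lets the LLM pick the tool.
from typing import Callable

def query_machine_data(question: str) -> str:
    return "OEE data for: " + question  # stand-in for a time-series API call

def query_documents(question: str) -> str:
    return "Document passages for: " + question  # stand-in for RAG retrieval

def load_history(question: str) -> str:
    return "Prior context for: " + question  # stand-in for the relational DB

TOOLS: dict[str, Callable[[str], str]] = {
    "machine_data": query_machine_data,
    "documents": query_documents,
    "history": load_history,
}

def agent_answer(question: str, tool_name: str) -> str:
    """Run the chosen tool; the real system hands the evidence to the LLM."""
    evidence = TOOLS[tool_name](question)
    return f"Answer based on [{evidence}]"
```

The value of the agent pattern is exactly this separation: each tool stays simple and testable, while the reasoning about which tool to use lives in one place.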
OEE Data from the Time Series Database
To enable the oee.ai chat system, the agent collects current or historical data for the connected equipment by querying the relevant APIs. This data includes productivity status, shift models, and loss reason catalogs – to name a few – all stored in oee.ai, some of them in a time series database. When a user asks a question like “How was the OEE yesterday in the late shift?”, the agent retrieves the necessary data via the API, and the LLM formulates a clear, user-friendly response for the chat interface.
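A simplified sketch of this retrieval step: fetch the shift's figures and compute the OEE as availability × performance × quality, then format a short context string for the LLM to phrase into a reply. The `fetch_oee` function and its field names are hypothetical; here it returns canned figures instead of calling the real API.

```python
# Sketch of the data-retrieval step: fetch shift figures, compute OEE,
# and build a context string for the LLM. fetch_oee is a hypothetical
# stand-in for the real API call and returns canned numbers.
from datetime import date, timedelta

def fetch_oee(equipment: str, day: date, shift: str) -> dict:
    """Stand-in for the oee.ai API; a real call would hit the time series DB."""
    return {"availability": 0.91, "performance": 0.87, "quality": 0.98}

def oee_context(equipment: str, shift: str) -> str:
    """Assemble the factual context the LLM turns into a chat reply."""
    day = date.today() - timedelta(days=1)  # "yesterday"
    m = fetch_oee(equipment, day, shift)
    oee = m["availability"] * m["performance"] * m["quality"]
    return (f"{equipment}, {shift} shift on {day}: "
            f"OEE {oee:.1%} (A {m['availability']:.0%}, "
            f"P {m['performance']:.0%}, Q {m['quality']:.0%})")
```

Keeping the arithmetic outside the LLM is deliberate: the model only verbalizes numbers the system has computed, which avoids hallucinated figures.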
Agentic Retrieval-Augmented Generation (RAG) for Business Context
In addition to historical and real-time equipment data, users may also seek business context and specific guidance from the oee.ai chatbot. The LLM can provide general insights learned during training, but for precise, context-specific answers – such as to the question “What can I do to improve the OEE?” – the system leverages a vector database of carefully selected and proprietary resources, such as books – like our own – and web pages, to enhance response quality. The vector database also supplies crucial information absent from the LLM’s training data, such as configuration guides, enabling the chatbot to accurately answer specific questions like “Where do I configure the shift model in oee.ai?”.
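The retrieval step behind this can be illustrated in miniature: embed the question, rank stored passages by similarity, and prepend the best matches to the prompt. The bag-of-words "embedding" and the two sample passages below stand in for a real embedding model and vector database, purely for illustration.

```python
# Toy sketch of RAG retrieval: embed the question, rank stored passages
# by cosine similarity, and build an augmented prompt. The bag-of-words
# embedding stands in for a real embedding model and vector database.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Crude token-count 'embedding'; real systems use dense vectors."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

PASSAGES = [  # illustrative snippets, not real oee.ai documentation
    "Reduce changeover time to improve the OEE availability factor.",
    "The shift model is configured in the master data settings.",
]

def retrieve(question: str, k: int = 1) -> list[str]:
    q = embed(question)
    ranked = sorted(PASSAGES, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:k]

def augmented_prompt(question: str) -> str:
    """Prepend the retrieved passages so the LLM answers from them."""
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"
```

Because the LLM answers from the retrieved context rather than from memory alone, the same mechanism lets the chatbot cover material it was never trained on, such as product configuration guides.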
An intriguing aspect of this system is its ability to interact with data in nearly any language. Whether it’s manufacturing execution data or complex documents like books, the chatbot can seamlessly handle queries in one language while the source material might be in another – like feeding a book in German and querying it in Spanish or French. This remarkable capability highlights the advanced nature of modern AI, truly making it an extraordinary time to be alive.
Conversational Memory via a Relational Database
Although LLMs carry vast knowledge from their training data, they have no built-in memory of a conversation. This memory is provided by a relational database that stores the current and all past conversations, enabling the chatbot to maintain context throughout the dialogue. For instance, once a user names a piece of equipment early in the chat, subsequent questions can omit it, mirroring the natural flow of human conversation where context is remembered.
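A minimal sketch of such conversational memory, here backed by an in-memory SQLite table; the schema, column names, and sample messages are illustrative, not oee.ai's actual data model.

```python
# Sketch of conversational memory in a relational database, using an
# in-memory SQLite table. Schema and sample messages are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE messages (
    conversation_id TEXT,
    role            TEXT,
    content         TEXT,
    created_at      TIMESTAMP DEFAULT CURRENT_TIMESTAMP)""")

def remember(conversation_id: str, role: str, content: str) -> None:
    """Append one chat turn to the conversation history."""
    conn.execute(
        "INSERT INTO messages (conversation_id, role, content) VALUES (?, ?, ?)",
        (conversation_id, role, content))

def history(conversation_id: str) -> list[tuple[str, str]]:
    """Return all turns in order; these are replayed into the LLM prompt."""
    cur = conn.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ? ORDER BY rowid",
        (conversation_id,))
    return cur.fetchall()

# Because the history is replayed into the prompt, the follow-up question
# below inherits the equipment named in the first turn.
remember("c1", "user", "How was the OEE on Press 1 yesterday in the late shift?")
remember("c1", "assistant", "Press 1 reached an OEE of 78% in the late shift.")
remember("c1", "user", "And in the early shift?")
```

The follow-up "And in the early shift?" never mentions Press 1, yet the agent can resolve it because every prior turn is read back from the table before the LLM is called.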
How to Access oee.ai’s Chatbot
Apps have become a prevalent way for people to interact with technology in the 21st century. To align with this trend, oee.ai’s chatbot is accessible through dedicated iOS and Android apps available on their respective app stores. Additionally, a web interface is integrated into the oee.ai web app for easy online access.
Pushing the Envelope Further
When employees interact directly with machine data, it primarily lowers the barrier to accessing productivity information. This direct access empowers them with a greater sense of control and responsibility over their work, potentially boosting job satisfaction and engagement. However, this is just the beginning of what oee.ai’s technology offers. Expect even more advanced features in the future, such as equipment that can assist in fixing itself. Stay tuned.
If you’re interested in exploring how AI can be integrated into your shopfloor employees’ workflow, feel free to reach out to us at info@oee.ai.
Author: Linus Steinbeck