Contextual Mixture of Experts: Integrating Knowledge into Predictive Modeling. (arXiv:2211.00558v1 [cs.LG])
cs.LG updates on arXiv.org
This work proposes a new data-driven model designed to integrate process
knowledge into its structure and thereby increase human-machine synergy in the
process industry. The proposed Contextual Mixture of Experts (cMoE) explicitly
uses process knowledge throughout the model learning stage, shaping the
historical data to represent the operators' context for the process through
possibility distributions. The model was evaluated in two real case studies of
quality prediction: a sulfur recovery unit and a polymerization process. The
contextual mixture …
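To make the idea concrete, below is a minimal, hypothetical sketch of a mixture of experts whose gate is driven by possibility (fuzzy-membership) distributions over an observed context variable, rather than being learned freely from data. The class name `ContextualMoE`, the triangular membership functions, and the linear experts are illustrative assumptions; the paper's exact cMoE formulation may differ.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular possibility distribution peaking at b with support [a, c].
    (An illustrative choice; any possibility distribution could be used.)"""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

class ContextualMoE:
    """Sketch: one linear expert per operating regime, gated by possibility
    distributions over a scalar context variable (e.g. an operating condition)."""

    def __init__(self, memberships):
        # memberships: list of (a, b, c) triples, one regime/expert each
        self.memberships = memberships
        self.experts = []  # per-expert least-squares coefficients

    def gate(self, context):
        # Possibility of each regime at the observed context, normalized
        # so the mixture weights sum to one per sample.
        poss = np.array([triangular(context, *m) for m in self.memberships])
        poss = poss + 1e-12  # avoid division by zero where all possibilities vanish
        return poss / poss.sum(axis=0)

    def fit(self, X, y, context):
        # Fit each expert by weighted least squares; the gate supplies the
        # sample weights, so each expert specializes in "its" regime.
        G = self.gate(context)            # shape (n_experts, n_samples)
        Xb = np.c_[X, np.ones(len(X))]    # append a bias column
        self.experts = []
        for g in G:
            W = np.diag(g)
            coef = np.linalg.lstsq(Xb.T @ W @ Xb, Xb.T @ W @ y, rcond=None)[0]
            self.experts.append(coef)
        return self

    def predict(self, X, context):
        # Gate-weighted combination of the expert predictions.
        G = self.gate(context)
        Xb = np.c_[X, np.ones(len(X))]
        preds = np.array([Xb @ c for c in self.experts])
        return (G * preds).sum(axis=0)

# Usage on synthetic data with two operating regimes split by the context:
rng = np.random.default_rng(0)
ctx = rng.uniform(0.0, 1.0, 400)
X = rng.uniform(-1.0, 1.0, (400, 1))
y = np.where(ctx < 0.5, 2 * X[:, 0], -X[:, 0] + 3)  # regime-dependent target
model = ContextualMoE([(-0.5, 0.0, 0.5), (0.5, 1.0, 1.5)]).fit(X, y, ctx)
pred = model.predict(X, ctx)
```

The point of the sketch is the gate: because its weights come from possibility distributions supplied a priori, domain knowledge about operating regimes enters the model directly instead of being inferred from the historical data alone.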