Multimodal Data Analysis
Multimodal data are prevalent in industrial monitoring, finance, and healthcare. In particular, time series are often tagged with text comments from experts, which give lay users the domain knowledge needed to interpret the charts. The text gives the patterns qualitative meaning, while the time series grounds the words quantitatively. Analyzing the relationships between these data types is the key to unraveling the hidden structure of such data.
This project aims to develop machine learning and data mining algorithms that provide insight into multimodal data through joint modeling of time series, natural language text, and data of other types. Through tasks such as automatic time series explanation, cross-modal retrieval, time series question answering, and knowledge discovery, we create virtual domain experts that comprehend domain-specific terms and use them to explain time series data. Automated financial analysts, plant operators, health advisors, and fitness coaches are just a few examples of the next-generation AI-human interaction paradigm enabled by multimodal learning.
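To make the idea of joint modeling concrete, the sketch below illustrates one simple form of cross-modal retrieval: time series and text comments are embedded into feature spaces, a linear map aligning the two spaces is fit on paired examples, and the closest comment is retrieved for a new series. All data, feature encodings, and function names here are hypothetical illustrations, not the project's actual method.

```python
import numpy as np

# Toy paired dataset: each time series is tagged with one expert comment.
# Everything below is a hypothetical illustration.
rng = np.random.default_rng(0)

def series_features(x):
    """Encode a time series as simple statistical features:
    mean, std, range, and linear-trend slope."""
    slope = np.polyfit(np.arange(len(x)), x, 1)[0]
    return np.array([x.mean(), x.std(), x.max() - x.min(), slope])

comments = ["steady upward trend",
            "flat with low noise",
            "sharp spike then recovery"]
vocab = sorted({w for c in comments for w in c.split()})

def text_features(c):
    """Bag-of-words encoding of a comment over the toy vocabulary."""
    words = c.split()
    return np.array([words.count(w) for w in vocab], dtype=float)

# Synthetic series matching each comment.
t = np.arange(50)
series = [t * 0.1 + rng.normal(0, 0.2, 50),                      # upward trend
          rng.normal(0, 0.05, 50),                               # flat, low noise
          np.where(t == 25, 5.0, 0.0) + rng.normal(0, 0.1, 50)]  # spike

X = np.stack([series_features(s) for s in series])   # series embeddings
Y = np.stack([text_features(c) for c in comments])   # text embeddings

# Fit a linear map from series space to text space by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def retrieve_comment(x):
    """Return the comment whose embedding is most cosine-similar
    to the mapped embedding of the query series."""
    q = series_features(x) @ W
    sims = Y @ q / (np.linalg.norm(Y, axis=1) * np.linalg.norm(q) + 1e-9)
    return comments[int(np.argmax(sims))]

# A previously unseen upward-trending series retrieves a trend comment.
new_series = t * 0.12 + rng.normal(0, 0.2, 50)
print(retrieve_comment(new_series))
```

A real system would replace the hand-crafted features with learned neural encoders and train the alignment on large paired corpora, but the retrieval principle, nearest neighbors in a shared embedding space, is the same.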