Ikigai’s No-Code AI Platform
by Devavrat Shah
Data as a Graph
We view any data as a proxy of an underlying probability distribution. As a mental model, think of data as a (potentially giant) spreadsheet whose columns are (random) variables and whose rows are samples from the joint probability distribution over those variables. A generic way to represent any such distribution is to capture it through a graph (a graphical model). In effect, such a graph is like a traditional “data cube”, but with a “fuzzy” or “probabilistic” view of the data.
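As a minimal sketch of this mental model (with hypothetical column names and data), the spreadsheet below is treated as samples from a joint distribution, and a conditional probability is read off empirically by filtering rows:

```python
from collections import Counter

# Hypothetical spreadsheet: each row is a sample, each column a random variable.
rows = [
    {"region": "east", "weather": "rain", "demand": "high"},
    {"region": "east", "weather": "sun",  "demand": "low"},
    {"region": "west", "weather": "rain", "demand": "high"},
    {"region": "east", "weather": "rain", "demand": "high"},
    {"region": "west", "weather": "sun",  "demand": "low"},
]

def conditional(rows, target, given):
    """Empirical P(target | given) estimated from the samples."""
    matching = [r for r in rows if all(r[k] == v for k, v in given.items())]
    counts = Counter(r[target] for r in matching)
    total = sum(counts.values())
    return {value: n / total for value, n in counts.items()}

print(conditional(rows, "demand", {"weather": "rain"}))  # {'high': 1.0}
```

A graphical model compresses exactly these kinds of conditional relationships among columns, rather than storing the full joint table.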
Learning such a graph from data enables all sorts of AI tasks, including data stitching (DeepMatch), AutoML, scenario analysis, and optimization. Enabling these functionalities in Ikigai’s no-code platform requires creating such a “probabilistic index” in real time. This required a technological breakthrough.
Compute with PubSub
At Ikigai, we have achieved this breakthrough by developing a novel computational architecture built on PubSub. Data as a graph naturally leads to a “message-passing” computational architecture in which computationally necessary information is exchanged between units of data (the nodes of the graph). We realized that a computationally scalable way to implement such an architecture is to utilize classical PubSub infrastructure, the same infrastructure that underlies modern data buses such as Apache Kafka.
In an MIT patent, we outline the architecture and explain how it enables computation to scale with data. In effect, it brings computation to the data rather than taking the traditional approach of bringing data to compute.
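The message-passing idea can be sketched with a toy in-memory bus (a stand-in for a real PubSub system such as Kafka; the topic names, node values, and one-round protocol here are all illustrative assumptions, not Ikigai’s actual implementation):

```python
from collections import defaultdict

# Minimal in-memory stand-in for a PubSub bus; topics are node names.
class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

# Each graph node subscribes to its own topic and collects incoming
# messages: computation travels to the data rather than the reverse.
class Node:
    def __init__(self, name, value, bus):
        self.name, self.value, self.bus = name, value, bus
        self.inbox = []
        bus.subscribe(name, self.inbox.append)

    def send(self, neighbor):
        self.bus.publish(neighbor, self.value)

bus = Bus()
a, b, c = Node("a", 1.0, bus), Node("b", 2.0, bus), Node("c", 3.0, bus)
# One round of message passing along the edges a-b and b-c.
a.send("b")
c.send("b")
print(sum(b.inbox) + b.value)  # 6.0: b's aggregate after one round
```

Because each node only publishes to and consumes from topics, the same pattern scales out when the bus is a distributed system and the nodes are shards of data.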
What does it enable (and how)?
Ikigai’s technology is the key enabler of its AI-native, no-code platform. Below, we describe the functionalities it enables in a no-code manner, functionalities that are crucial to making Ikigai’s operational BI platform truly useful for a data operator.
DeepMatch. The quintessential pre-task of most data-driven analysis is “stitching” multiple data sources together. Traditionally, in database terms, this is achieved through “joins”. In many modern settings, however, this does not work: the sources may lack a shared column or have mismatched entries, so the correct relationship is likely to be missed. In practice, this results in either a lot of manual work or careful, case-by-case data processing.
By viewing data as a graph, Ikigai provides DeepMatch, which attempts to learn the relationships between columns of datasets and then uses them to match rows. A human-in-the-loop component allows the end user to provide minimal supervision to correct inaccuracies (when using AI, the exception is not an exception but the norm) and to improve the outcome.
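To illustrate why a plain join fails and fuzzy matching helps, here is a small sketch using stdlib string similarity (the company names and scoring rule are hypothetical; DeepMatch itself learns relationships rather than using a fixed similarity function):

```python
from difflib import SequenceMatcher

# Hypothetical: two sources with no shared key and slightly mismatched
# entries, so an exact SQL join would return nothing.
crm   = ["Acme Corp.", "Globex Inc", "Initech LLC"]
sales = ["ACME Corporation", "Initech", "Globex, Inc."]

def best_match(name, candidates):
    """Return (score, candidate) for the candidate most similar to `name`."""
    scored = [(SequenceMatcher(None, name.lower(), c.lower()).ratio(), c)
              for c in candidates]
    return max(scored)

for name in crm:
    score, match = best_match(name, sales)
    print(f"{name!r} -> {match!r} (score {score:.2f})")
```

Low-scoring matches are exactly where the human-in-the-loop supervision described above earns its keep.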
AutoML. The task of prediction can be viewed as filling in missing values in a spreadsheet. For temporal data, this includes imputation of prior or historical data as well as forecasting of future data. By viewing data as a graph, such tasks can be answered instantly by estimating the missing values given the other data as (conditional) observations.
This is how Ikigai enables AutoML: an end user can obtain predictions by simply specifying which columns in the dataset need to be predicted. The predictions come with uncertainty quantification as well as interpretation.
AutoML makes Ikigai’s platform unique in its ability to serve various prediction use cases with very limited data. To learn more about how this enables accurate demand forecasting in retail, see here.
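To make “prediction as filling in missing values” concrete, here is a toy sketch using a nearest-neighbor estimate over the observed columns, with a crude spread as an uncertainty proxy (the table, column names, and k-NN choice are all illustrative assumptions, not Ikigai’s method):

```python
# Hypothetical table: predicting a missing "demand" value amounts to
# estimating it from rows whose other columns look similar.
rows = [
    {"price": 10, "promo": 1, "demand": 120},
    {"price": 12, "promo": 0, "demand": 80},
    {"price": 11, "promo": 1, "demand": 115},
    {"price": 13, "promo": 0, "demand": 75},
]

def predict(rows, query, target, k=2):
    """k-nearest-neighbor estimate of the target column, plus the
    spread among the neighbors as a crude uncertainty measure."""
    def dist(r):
        return sum((r[c] - query[c]) ** 2 for c in query)
    nearest = sorted(rows, key=dist)[:k]
    values = [r[target] for r in nearest]
    estimate = sum(values) / len(values)
    spread = max(values) - min(values)
    return estimate, spread

est, spread = predict(rows, {"price": 10.5, "promo": 1}, "demand")
print(est, spread)  # 117.5 5
```

The user specifies only which column is missing; the conditioning on the remaining columns happens automatically.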
Scenario Analysis. A decision maker needs to weigh various scenarios through the lens of a collection of objectives and constraints before finally making a decision. Decisions from long-term strategy down to daily tactics require such scenario analysis. In effect, this requires “simulating” the future under different decisions. Typically, this is a lot more complicated than simply making predictions.
Ikigai enables such scenario analysis on its no-code platform using its unique technology. To learn more about how it enables scenario analysis in the supply chain, see here.
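A minimal sketch of “simulating the future under different decisions”, here as a Monte Carlo comparison of order quantities under uncertain demand (the demand distribution, prices, and decision set are hypothetical):

```python
import random

random.seed(0)  # reproducible illustration

def simulate(order_qty, n=10_000, price=5.0, cost=3.0):
    """Monte Carlo estimate of expected profit for one candidate decision."""
    total = 0.0
    for _ in range(n):
        demand = random.gauss(100, 20)          # uncertain future demand
        sold = min(order_qty, max(demand, 0.0)) # can't sell more than ordered
        total += price * sold - cost * order_qty
    return total / n

# Compare scenarios: each candidate decision gets its own simulated future.
for qty in (80, 100, 120):
    print(f"order {qty}: expected profit {simulate(qty):.1f}")
```

Note the contrast with plain prediction: a single demand forecast would not reveal that over-ordering and under-ordering penalize profit asymmetrically.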
Optimization. Across a wide variety of decision tasks, the optimal choice must be made among various options. It turns out that an optimization problem can be solved by modeling it as a graph, and Ikigai’s technology subsequently enables a no-code solution for it.
To learn more about how it enables production planning optimization in supply chain, see here.
About the Author
Devavrat Shah (Co-founder and CTO)
Devavrat is a Professor of AI+Decisions within the department of EECS at MIT. He is the founding director of the Statistics and Data Science Center at MIT.
He previously co-founded Celect ($35MM+ in funding, acquired by Nike in 2019) to help leading retailers use AI to optimize their inventories. He has made seminal contributions to statistical inference and machine learning that have had an impact in both academia and industry.