We have seen exactly the same while building Tilores IdentityRAG. First, people try to build AI applications on top of data warehouse data, but they struggle with 1) real-time data and 2) connecting data together where there is no unique identifier (the identity resolution problem).
Since we already had a highly scalable, real-time entity resolution engine, it was quite simple for us to extend it into a GenAI data source (in this case by building a LangChain integration), and hence IdentityRAG was born.
We are finding that staying connected to the data warehouse while also acting as a shared data source for live streams works well - i.e. real-time data is sent to both Tilores and the warehouse at the same time, and we also stay in sync with the warehouse directly.
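A rough sketch of that dual-write pattern, assuming nothing about the actual Tilores API (all class and function names here are illustrative stand-ins):

```python
class EntityStore:
    """Stands in for a real-time entity resolution engine (e.g. Tilores)."""
    def __init__(self):
        self.records = []

    def ingest(self, event):
        self.records.append(event)


class Warehouse:
    """Stands in for the batch data warehouse."""
    def __init__(self):
        self.rows = []

    def ingest(self, event):
        self.rows.append(event)


def dual_write(event, entity_store, warehouse):
    # Send the same event to both sinks at ingestion time, so the
    # real-time view and the warehouse receive identical data.
    entity_store.ingest(event)
    warehouse.ingest(event)


store, wh = EntityStore(), Warehouse()
dual_write({"email": "ann@example.com", "name": "Ann"}, store, wh)
```

In practice a periodic sync job reconciling the entity store against the warehouse (the "stay in sync directly" part) would sit alongside this, so late-arriving batch data is still reflected.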
Good point. As I noted previously, real-time support is a key requirement of modern entity resolution engines 🔎 https://gradientflow.com/entity-resolution-insights-and-implications-for-ai-applications/