Dust uses large language models on internal data to improve team productivity

Dust is a new AI startup based in France that is working to improve team productivity by breaking down internal silos, surfacing important knowledge and providing tools to build custom internal apps. At its core, Dust uses large language models (LLMs) on internal company data to give team members new superpowers.

Co-founders Gabriel Hubert and Stanislas Polu have known each other for more than a decade. Their first startup, Totems, was acquired by Stripe in 2015. After that, they both spent a few years working at Stripe before parting ways.

Stanislas Polu joined OpenAI, where he spent three years working on LLMs’ reasoning capabilities, while Gabriel Hubert became head of product at Alan.

They teamed up once again to create Dust. Unlike many AI startups, Dust isn’t focused on creating new large language models. Instead, the company wants to build applications on top of LLMs developed by OpenAI, Cohere, AI21 and others.

The team first worked on a platform for designing and deploying large language model apps. It then focused its efforts on one use case in particular: centralizing and indexing internal data so that it can be used by LLMs.

From an internal ChatGPT to next-gen software

A handful of connectors constantly fetch internal data from Notion, Slack, GitHub and Google Drive. This data is then indexed and can be used for semantic search queries. When a user wants to do something with a Dust-powered app, Dust finds the relevant internal data, passes it as context to an LLM and returns an answer.
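In practice, this is a retrieve-then-generate pattern. The sketch below is a simplified illustration of that flow, not Dust’s actual code: the embed, retrieve and answer helpers are hypothetical stand-ins, a bag-of-words vector replaces a real embedding model, and the final LLM call is left as a printed prompt.

```python
# Illustrative sketch only: names and data are hypothetical, not Dust's API.
from dataclasses import dataclass
from collections import Counter
import math

@dataclass
class Document:
    source: str  # e.g. "notion", "slack", "github", "google_drive"
    text: str

def embed(text: str) -> Counter:
    # Stand-in embedding: token counts. A real system would use an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, index: list[tuple[Counter, Document]], k: int = 2) -> list[Document]:
    # Semantic search step: rank indexed documents by similarity to the query.
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[0]), reverse=True)
    return [doc for _, doc in ranked[:k]]

def answer(query: str, docs: list[Document]) -> str:
    # Generation step: in a real app this prompt would be sent to an LLM.
    context = "\n".join(f"[{d.source}] {d.text}" for d in docs)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Connectors would keep this index fresh from Notion, Slack, GitHub and Drive.
corpus = [
    Document("notion", "Project Atlas was paused in Q2 pending legal review."),
    Document("slack", "We chose Postgres for Atlas because of JSONB support."),
    Document("github", "atlas-api README: deployment requires the staging VPN."),
]
index = [(embed(d.text), d) for d in corpus]

question = "Why did we pick Postgres for Atlas?"
print(answer(question, retrieve(question, index)))
```

The point of the sketch is the shape of the pipeline: index once, retrieve the few most relevant internal documents per query, and hand only those to the model as context rather than fine-tuning it on company data.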

For example, let’s say you just joined a company and you’re working on a project that was started a while back. If your company fosters communication transparency, you will want to find information in existing internal data. But the internal knowledge base might not be up to date. Or it might be hard to figure out why something was done a certain way, because it was discussed in a now-archived Slack channel.

Dust isn’t just a better internal search tool, because it doesn’t merely return search results. It can find information across multiple data sources and format answers in a way that is much more useful to you. It can be used as a sort of internal ChatGPT, but it could also serve as the basis of new internal tools.

“We’re convinced that natural language interface is going to disrupt software,” Gabriel Hubert told me. “In five years’ time, it would be disappointing if you still have to go and click on edit, settings, preferences, to decide that your software should behave differently. We see a lot more of our software adapting to your individual needs, because that’s the way you are, but also because that’s the way your team is — because that’s the way your company is.”

The company is working with design partners on several ways to implement and package the Dust platform. “We think there are a lot of different products that can be created in this area of enterprise data, knowledge workers and models that could be used to support them,” Stanislas Polu told me.

It’s still early days for Dust, but the startup is exploring an interesting problem. There are many challenges ahead when it comes to data retention, hallucination and all of the issues that come with LLMs. Maybe hallucination will become less of an issue as LLMs evolve. Maybe Dust will end up creating its own LLM for data privacy reasons.

Dust has raised $5.5 million (€5 million) in a seed round led by Sequoia, with XYZ, GG1, Seedcamp, Connect, Motier Ventures, Tiny Supercomputer and AI Grant also participating, along with business angels such as Olivier Pomel from Datadog, Julien Codorniou, Julien Chaumond from Hugging Face, Mathilde Colin from Front, Charles Gorintin and Jean-Charles Samuelian-Werve from Alan, Eléonore Crespo and Romain Niccoli from Pigment, Nicolas Brusson from BlaBlaCar, Howie Liu from Airtable, Mathieu Rouiff from PhotoRoom, Igor Babuschkin and Irwan Bello.

If you take a step back, Dust is betting that LLMs will greatly change how companies work. A product like Dust works even better in a company that fosters radical transparency instead of information retention, written communication instead of endless meetings, and autonomy instead of top-down management.

If LLMs deliver on their promise and greatly improve productivity, some companies will gain an unfair advantage by adopting these values, as Dust will unlock a lot of untapped potential for knowledge workers.
