We’re missing the true value of large language models by keeping them stuck in a chat box. LLMs will be a staple of tomorrow’s machine learning startup opportunities, but only if their impact can be felt where we already work today.
Not long after ChatGPT became a thing, people discovered LLMs could do more than just predict the next word in a sequence of text. Given instructions or a problem, the language model would work through it step by step. People started to build on top of LLMs: while the models began as text autocomplete, they’re now used to break a problem into subtasks and execute them one by one, as the sketch below illustrates.
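To make that concrete, here is a minimal sketch of the decompose-and-execute pattern. The `call_llm` helper is a hypothetical placeholder for whatever completion API you use; nothing here is tied to a specific provider or product.

```python
def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to a hosted LLM and return its text response."""
    raise NotImplementedError("wire up your provider of choice here")


def solve_step_by_step(problem: str) -> list[str]:
    # 1. Ask the model to decompose the problem into short, ordered subtasks.
    plan = call_llm(
        "Break the following problem into short, ordered subtasks, one per line:\n"
        + problem
    )
    subtasks = [line.strip() for line in plan.splitlines() if line.strip()]

    # 2. Execute the subtasks one by one, feeding each result back in as context.
    results, context = [], ""
    for task in subtasks:
        answer = call_llm(
            f"Context so far:\n{context}\n\nNow complete this subtask: {task}"
        )
        results.append(answer)
        context += f"\n- {task}: {answer}"
    return results
```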
Yet what language models lack today is the ability to connect to external systems. The genie is out of the bottle and the models are out there; the question is how to use them inside an enterprise without violating privacy and safety standards.
Large model APIs help engineers get started faster. They carry teams part of the way, but not the last mile to the destination. What corporations want to accomplish is usually quite specific, and building a custom solution often makes the most sense. As Ines Montani, the CEO of Explosion, a SignalFire portfolio company, says: “Eventually, the large model will be one part of the toolbox, and ‘surprisingly good’ won't be good enough; you'll want something ‘better’.” Her co-founder, Matthew Honnibal, adds that “ultimately users care about how high the ceiling is, not how high the floor is.”
Building powerful apps on top of large language models with Fixie
Currently, there is a flurry of LLM startups, many of which use off-the-shelf models without any fine-tuning: they stuff as much information as possible into the prompt and let the model draw on that context to produce an answer (a quick sketch follows this paragraph). While this is the fastest way to enter the space, we believe it won’t scale very well. Differentiation comes from the data moat, the extended capabilities of the product, and the taste and elbow grease of the humans building these products. For the enterprise, we believe customers want flexibility when using any ML model and any framework. They want the experience to be provider-agnostic, hosted wherever they are, and free of vendor lock-in. This is why we invested in Ivy to unify ML tooling, in spaCy/Explosion for its open-source NLP toolbox and annotation system, and most recently in Fixie as the pioneers of LLM adoption for enterprises.
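Here is what that prompt-stuffing approach looks like in practice, again with a hypothetical `call_llm` placeholder and a made-up context limit; the hard truncation is exactly where the scaling problem shows up.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for any provider's completion API."""
    raise NotImplementedError("wire up your provider of choice here")


# Stand-in for a model's context-window limit (real limits are counted in tokens).
MAX_CONTEXT_CHARS = 12_000


def answer_with_prompt_stuffing(question: str, documents: list[str]) -> str:
    # No fine-tuning: reference material is pasted straight into the prompt.
    stuffed = "\n\n".join(documents)

    # The catch: once the corpus outgrows the context window, you have to truncate
    # or drop data, which is one reason this approach scales poorly.
    stuffed = stuffed[:MAX_CONTEXT_CHARS]

    prompt = (
        "Using only the reference material below, answer the question.\n\n"
        f"Reference material:\n{stuffed}\n\n"
        f"Question: {question}"
    )
    return call_llm(prompt)
```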
Fixie provides extensions that let language models access external systems, allowing people to ask questions, get responses, and take action. With Fixie, customers can build natural-language agents that connect to their data, talk to APIs, and solve complex problems. It’s designed from the ground up with enterprise customers in mind, offering the flexibility and robustness necessary to serve enterprise use cases. If you are a company that wants to add LLMs to your application, reach out to learn more or play with the product at https://www.fixie.ai/.
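For readers new to the pattern, here is a generic illustration of an agent that talks to an external system. This is not Fixie’s actual API; the tool, helper names, and JSON protocol below are invented purely to show the shape of the idea.

```python
import json


def call_llm(prompt: str) -> str:
    """Placeholder for any provider's completion API."""
    raise NotImplementedError("wire up your provider of choice here")


def lookup_order_status(order_id: str) -> str:
    """Hypothetical call into an internal system, e.g. an orders API."""
    return f"Order {order_id} shipped yesterday."


TOOLS = {"lookup_order_status": lookup_order_status}


def run_agent(user_message: str) -> str:
    # Ask the model to reply with JSON: either a tool invocation or a final answer.
    decision = call_llm(
        'Reply with JSON, either {"tool": "<name>", "args": {...}} '
        'or {"answer": "<text>"}.\n'
        f"Available tools: {list(TOOLS)}\n"
        f"User: {user_message}"
    )
    parsed = json.loads(decision)  # assumes the model returned well-formed JSON
    if "tool" in parsed:
        # Call the external system, then let the model turn the result into an answer.
        result = TOOLS[parsed["tool"]](**parsed["args"])
        return call_llm(f"Tool result: {result}\nAnswer the user: {user_message}")
    return parsed["answer"]
```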
We at SignalFire are excited to partner with the founders of Fixie, backing its $5M pre-seed round ahead of its new $12M seed round. Read more about it from Fixie, TechCrunch, and GeekWire.
The team has a strong background building large-scale systems and AI-powered products for billions of users. Matt Welsh, the CEO, was a professor of computer science at Harvard (one of his students was Mark Zuckerberg). After nearly killing Facebook, Matt spent time at Google, Xnor.ai, Apple, and OctoML. Zach Koch, the CPO, is a former product director at Shopify, and was previously a product lead at Google on the Chrome and Android teams. CTO Justin Uberti was the head of the Stadia, Duo, and Hangouts Video teams at Google, and was one of the inventors of WebRTC. Hessam Bagherinezhad is the chief AI officer, and he was an AI/ML leader at Apple and the first employee at Xnor.ai.
SignalFire loves working with experienced AI teams because we build AI ourselves. For the past decade, we’ve had a half-dozen engineers working on our Beacon AI data platform, which helps us source investments and assist our portfolio companies with hiring. With all the new AI companies popping up, recruiting top talent in the space can be a challenge. If you’re building an AI company that wants to use AI to find AI engineers, come talk to us at SignalFire or email me at oana@signalfire.com. I’m a machine learning engineer, too, and I’m willing to do anything to help founders succeed.
*Portfolio company founders listed above have not received any compensation for this feedback and may or may not have invested in a SignalFire fund. These founders may or may not serve as Affiliate Advisors, Retained Advisors, or consultants to provide their expertise on a formal or ad hoc basis. They are not employed by SignalFire and do not provide investment advisory services to clients on behalf of SignalFire. Please refer to our disclosures page for additional disclosures.