Many of the talks yesterday were peppered with acronyms befitting a gathering of high-minded panelists: YC, FTC, AI, LLM, and so on. But it’s fair to say that a common thread running through the conversations was advocacy for open source AI.
This was a clear shift away from the app-obsessed 2010s (or, for Linux fans, a regression), when developers were happy to containerize their tech and hand it over to larger platforms for distribution.
The event also came just two days after Meta CEO Mark Zuckerberg declared that “open source AI is the way forward” and released Llama 3.1, the latest version of Meta’s open source AI models. As Zuckerberg said in the announcement, some technologists no longer “want to be constrained by what Apple allows us to develop” or face arbitrary rules and app fees.
Open source AI is also the approach that OpenAI largely does not take. Despite what the multi-billion-dollar startup’s name might suggest, its largest GPT models are closed: at least some of the code is kept private, OpenAI doesn’t share the “weights,” or parameters, of its most powerful AI systems, and it charges a fee for enterprise-level access to its technology.
“With the rise of composite AI systems and agent architectures, small but fine-tuned open source models can produce significantly better results than (OpenAI’s) GPT-4 and (Google’s) Gemini, especially for enterprise tasks,” said Ari Golshan, co-founder and CEO of synthetic data company Gretel.ai (who was not at the YC event).
“I don’t think this is about OpenAI versus the world or anything like that,” said Dave Yen, who runs Orange Collective, a fund of YC alumni that backs promising YC founders. “I think it’s about fair competition and creating an environment where startups aren’t at risk of disappearing the next day because OpenAI changes its pricing model or policies.”
“That’s not to say we shouldn’t have safeguards,” Yen added, “but we also don’t want to restrict innovation unnecessarily.”
Open source AI models carry some inherent risks that more cautious technologists have warned about. The most obvious is that the technology is open and free: bad actors have a greater chance of using these tools to cause harm than they would with expensive, private AI models. Researchers note that it’s cheap and easy for bad actors to train away the safety measures present in these AI models.
As WIRED’s Will Knight reports, “open source” is also a myth for some AI models: The data used to train them may still be kept secret, licenses may restrict developers from building certain things, and ultimately, the original model creators may benefit more than anyone else.
And some politicians, such as California State Senator Scott Wiener, are opposed to the unrestrained development of large-scale AI systems. Wiener’s Safe and Secure Innovation for Frontier Artificial Intelligence Models Act (SB 1047) has been controversial in the tech world. The bill aims to establish standards for developers of AI models that cost more than $100 million to train, requiring a certain level of safety testing and red-teaming exercises before deployment, protecting whistleblowers working in AI labs, and giving the state attorney general legal recourse if an AI model causes extreme harm.
Wiener himself spoke at the YC event on Thursday, in a conversation moderated by Bloomberg reporter Shirin Ghaffary. He said he was “deeply grateful” to those in the open source community who voiced opposition to the bill, and that his office had “made a series of revisions in direct response to some of the critical feedback.” One change, Wiener said, is that the bill now more clearly defines a reasonable path for shutting down open source AI models that go off the rails.
The biggest-name speaker at Thursday’s event, a last-minute addition to the program, was Andrew Ng, co-founder of Coursera, founder of Google Brain, and former chief scientist at Baidu, who, like many of the attendees, spoke in defense of open source models.
“This is one of those moments that will decide whether entrepreneurs are allowed to continue innovating or whether funds that could be spent on software development should be spent on hiring lawyers,” Ng said.