The utility provided by easy access to LLMs (large language models) has already driven leaps of innovation in months, not years. We’re witnessing new markets, new products, and new services being created at a pace that would have been unimaginable just a few years ago. Within five years, costs will drop significantly as models mature and are sold at decreasing prices, broadening access to high-quality AI services even further.
OpenAI may have started the sprint, but it is by no means alone in the race to the finish line. Many of us will be hedging our bets as the AI race heats up.
The convergence of Kubernetes and AI is a natural progression of this trend. The twist is that applying AI services to Kubernetes instantly catalyses cloud-native adoption, because Kubernetes is already ubiquitous. The barrier to entry will drop to the floor as AI makes Kubernetes accessible to everyone.
My dream is to create high-quality open source tooling that has the potential to be used by everyone, and I’m excited to see what the future holds. We’ll build multiple backends that let you BYOAI (bring your own AI), and we’ll make them easy to use. A community of engineers and researchers will help us build the best AI services for Kubernetes, leaning on the LLMs and APIs that become available as AI services spread like wildfire.
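To make the BYOAI idea concrete, here is a minimal sketch of what a pluggable AI backend could look like in Go. All names here (AIBackend, Analyze, echoBackend) are hypothetical illustrations, not the API of any existing project; the point is simply that callers depend on a small interface while the provider behind it is swappable.

```go
package main

import (
	"context"
	"fmt"
)

// AIBackend is a hypothetical abstraction over any LLM provider: swap in a
// hosted API, a local model, or an in-house service without changing callers.
type AIBackend interface {
	// Analyze takes a prompt describing a Kubernetes problem and returns
	// the model's explanation or suggested fix.
	Analyze(ctx context.Context, prompt string) (string, error)
}

// echoBackend is a stand-in implementation used only to show how a provider
// plugs in; a real backend would call out to an LLM service instead.
type echoBackend struct{}

func (e echoBackend) Analyze(ctx context.Context, prompt string) (string, error) {
	return "analysis of: " + prompt, nil
}

func main() {
	var backend AIBackend = echoBackend{} // bring your own AI here
	out, err := backend.Analyze(context.Background(), "CrashLoopBackOff in pod web-7d9f")
	if err != nil {
		fmt.Println("backend error:", err)
		return
	}
	fmt.Println(out)
}
```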
I want you to join us, but better yet, help us.
By Alex Jones