The possible future of SaaS products for machine learning


Not with that price

With the emergence of recent AI developments, there is a growing demand for software-as-a-service (SaaS) products for deploying machine learning models. Everyone wants their machine learning models deployed in production. Fancy models like the closed-source GPT-3.5 and the open-source LLaMA variants like Vicuna and Alpaca are challenging the state of the art of generative AI every few weeks. ChatGPT got its first one million users in just five days, and people just can’t stop talking about it. But there is a huge shadow behind all of this AI phenomenon: the cost. Sam Altman, the CEO of OpenAI, mentioned that one of the reasons they need to monetize ChatGPT is that the inference cost is “eye-watering”. In another tweet, he also gave a figure that is quite specific: the cost of ChatGPT is a couple of cents per chat.

Well, at first that doesn’t look too bad. But consider that one Google search probably costs about 1/40th of a US cent, according to this blog post from Google back in 2009, or roughly $0 on the margin; at that scale, those numbers make a huge difference. This Guardian article indicates that ChatGPT received approximately 590 million visits from 100 million unique users in January 2023. With each visit costing 1 cent, which some may consider quite a conservative figure, we are looking at 5.9 million US dollars a month for OpenAI to offer ChatGPT as a freemium service. Various sources have estimated that training the underlying model, namely GPT-3.5, already cost OpenAI anywhere between 4.6 and 20 million US dollars, not counting R&D. The cost of a modern, cutting-edge machine learning-based service is non-trivial, to say the least.
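To put those figures side by side, here is a quick back-of-envelope sketch; the per-visit and per-search unit costs are assumptions taken from the sources above, not official numbers:

```python
# Back-of-envelope serving-cost comparison using the figures quoted above.
# Both unit costs are rough assumptions, not official numbers.

monthly_visits = 590_000_000        # ChatGPT visits in January 2023 (per The Guardian)

chatgpt_cost_per_visit = 0.01       # assume 1 cent per visit ("a couple cents per chat")
google_cost_per_search = 0.01 / 40  # ~1/40th of a US cent per search (Google blog, 2009)

print(f"ChatGPT: ${monthly_visits * chatgpt_cost_per_visit:,.0f} per month")  # ~$5,900,000
print(f"Search:  ${monthly_visits * google_cost_per_search:,.0f} per month")  # ~$147,500
```

That is roughly a 40x gap in marginal serving cost for the same volume of traffic.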

When there is demand

I am no economist myself. But if there is one thing that late-stage capitalism has taught us, it is that when there is demand, there will be supply. If the staggering cost of training and deploying machine learning models is not enough to scare away potential competitors, then we know the prize must be good. Microsoft poured an astonishing 10 billion dollars into OpenAI just so that its own services could have exclusive access to OpenAI’s proprietary machine learning models. And Microsoft moved fast. In February 2023, Microsoft introduced the new Bing, an AI-powered search engine, and announced plans to integrate OpenAI’s large language models into its other software offerings like Office and Outlook. Alphabet’s stock price plunged 7.4%, wiping out roughly 100 billion dollars in market value, when Bard, Google’s newly introduced answer to OpenAI’s ChatGPT, delivered a factual mistake during its demo.

Needless to say, companies like Anthropic, Baidu, Meta and Snap all went in with their own versions of this futuristic chatbot. But before we get swept away by the AI hype, let’s pause for a moment and look back at what used to be the product everybody was talking about, and what it looks like now, if we want to talk about what the future might hold.

The good and the old

This goes without saying: I love databases. Databases are monolithic and beautiful. There is normally a query language, the famous (or infamous) SQL and its countless dialects, so that a user can query data stored in the database in a tabular format. Underneath, there are a query planner, an optimizer, an execution engine and file storage. If you want to put it in the cloud, throw in some resource provisioning, a distributed scheduler and caching. Bam! You got it. I said I love them because modern databases offer a well-packaged and reasonable solution for users who want to store their data and retrieve information from it, but here is the catch: in a tabular, two-dimensional format. This architecture looks more or less like what current databases offer. Of course there are different flavors, blah blah blah, but the idea is the same. That is what databases are for, and this is what users are paying for. Modern database products like Snowflake and Databricks are the jewels of that crown. They are easy to use, users can adopt a pay-as-you-go pricing plan, and they run in the cloud, meaning there is no need to hire database administrators (DBAs) to manage your own database setup. It’s all taken care of.
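To make the “store tabular data, query it back with SQL” idea concrete, here is a minimal sketch using Python’s built-in SQLite bindings; the table, column names and rows are made up purely for illustration:

```python
import sqlite3

# A tiny end-to-end example of what databases package up so well:
# store rows in a table, then query them back with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("alice", 19.99), ("bob", 5.00), ("alice", 42.50)],
)

# The query planner, optimizer and execution engine all hide behind this one call.
for customer, total in conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
):
    print(customer, total)  # alice 62.49 / bob 5.0
```

Everything else, the provisioning, scheduling and caching, is the part that cloud offerings like Snowflake and Databricks take off your plate.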

Hold on a minute. How does this have anything to do with the AI stuff we mentioned above? Well, the idea is that when we talk about using databases, we are really talking about storing and retrieving information from tabular data. Tabular data is data that we can put in a table-like arrangement, that is, in columns and rows. That may have been enough for, I don’t know, the 20th century? It is certainly not enough for what we are looking at today. Exploding amounts of text, video and photos are far more than any database can take in, and don’t even get me started on getting any information out of those distilled tables. We need a more efficient and compressed way to store that data, and databases are simply not the answer.

To be continued
