March 10, 2024, 10:55 a.m. | Krisztián Maurer

DEV Community dev.to

I tried OpenAI's function-calling feature in a project with many functions. It worked well even at that scale, but the cost of using OpenAI's API rose significantly. This is because the function definitions (JSON schemas) sent alongside each request count as input tokens, making every call more expensive. The problem is that we often send far more functions than necessary, even though the model only needs a few of them …
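One way to reduce those schema tokens is to filter the tool list per request before calling the API. Below is a minimal sketch of that idea; the tool schemas and the `select_tools` helper are hypothetical examples (not from the article), and the naive keyword matching stands in for the embedding-based ranking a production system would more likely use:

```python
# Hypothetical tool definitions in the OpenAI tools/function-calling schema.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
        },
    },
    {
        "type": "function",
        "function": {
            "name": "send_email",
            "description": "Send an email message to a recipient",
            "parameters": {"type": "object", "properties": {"to": {"type": "string"}}},
        },
    },
    {
        "type": "function",
        "function": {
            "name": "create_invoice",
            "description": "Create an invoice for a customer order",
            "parameters": {"type": "object", "properties": {"order_id": {"type": "string"}}},
        },
    },
]


def select_tools(user_message, tools, max_tools=3):
    """Return only the tools whose name/description overlaps the user message.

    Naive keyword overlap keeps this example dependency-free; swapping in
    embedding similarity is the more robust approach.
    """
    words = set(user_message.lower().split())

    def score(tool):
        fn = tool["function"]
        text = (fn["name"].replace("_", " ") + " " + fn["description"]).lower()
        return len(words & set(text.split()))

    ranked = sorted(tools, key=score, reverse=True)
    return [t for t in ranked[:max_tools] if score(t) > 0]


# Only the matching subset is sent, so fewer schema tokens are billed, e.g.:
# client.chat.completions.create(model="gpt-4o", messages=messages,
#                                tools=select_tools(user_message, TOOLS))
```

Since every tool schema is serialized into the prompt on every call, trimming three unused definitions here saves their token cost on each request, which compounds quickly in high-volume applications.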

