YuHeng (玉衡)

A local nutrition tracking app powered by AI


LLM Providers & Configuration

YuHeng leverages Large Language Models (LLMs) to provide advanced food recognition and nutritional analysis. It supports multiple providers through a factory-based architecture.

Supported Providers

As reflected in the Settings page, three provider options are supported: Gemini, OpenAI, and any OpenAI-compatible service reachable via a custom base URL.

Architecture

The LLM abstraction layer is located in lib/llm/.
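A minimal sketch of what a factory-based provider layer like this can look like. All type, class, and function names here (`LLMClient`, `createLLMClient`, `analyzeFood`, etc.) are illustrative assumptions, not the actual contents of lib/llm/:

```typescript
// Hypothetical sketch of a factory-based LLM abstraction layer.
// Names and shapes are assumptions; YuHeng's real lib/llm/ may differ.

interface LLMClient {
  analyzeFood(description: string): Promise<string>;
}

type ProviderId = "gemini" | "openai" | "compatible";

interface LLMSettings {
  provider: ProviderId;
  apiKey: string;
  model: string;
  baseUrl?: string; // only meaningful for "compatible"
}

class GeminiClient implements LLMClient {
  constructor(private settings: LLMSettings) {}
  async analyzeFood(description: string): Promise<string> {
    // A real client would call the Gemini API with apiKey and model here.
    return `gemini:${this.settings.model}:${description}`;
  }
}

class OpenAIClient implements LLMClient {
  constructor(private settings: LLMSettings) {}
  async analyzeFood(description: string): Promise<string> {
    // A real client would call the OpenAI API, or a compatible
    // endpoint via settings.baseUrl, with apiKey and model here.
    return `openai:${this.settings.model}:${description}`;
  }
}

// The factory maps the configured provider to a concrete client,
// so callers depend only on the LLMClient interface.
function createLLMClient(settings: LLMSettings): LLMClient {
  switch (settings.provider) {
    case "gemini":
      return new GeminiClient(settings);
    case "openai":
    case "compatible": // compatible services reuse the OpenAI-style client
      return new OpenAIClient(settings);
  }
}
```

The benefit of the factory is that request handlers can stay provider-agnostic: they ask for a client and call `analyzeFood`, regardless of which backend the user configured.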

Configuration

LLM settings can be configured in the application’s Settings page:

  1. Provider: Choose between Gemini, OpenAI, or Compatible.
  2. API Key: Securely store your provider’s API key.
  3. Model Name: Use the combobox to select from available cloud models or type to enter a custom model name directly.
  4. Base URL (Compatible only): Specify the endpoint for your OpenAI-compatible service.
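Taken together, the four settings above form a small configuration object. The shape below is a sketch of what gets persisted; the field names and the example model/URL values are assumptions, not YuHeng's actual schema:

```typescript
// Hypothetical persisted LLM settings for an OpenAI-compatible service.
// Field names and example values are illustrative only.
const llmSettings = {
  provider: "compatible",              // "gemini" | "openai" | "compatible"
  apiKey: "your-api-key",              // stored in app settings, used server-side
  model: "my-local-model",             // picked from the combobox, or typed in freely
  baseUrl: "http://localhost:11434/v1" // required only when provider is "compatible"
};
```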

Security

API keys are stored in the application’s persistent settings and are read server-side to initialize the appropriate LLM client for each request. This keeps keys out of client-side code where possible.
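The key-handling idea can be sketched as follows: the server loads the persisted settings when it builds a client, and anything sent back to the browser has the key stripped out. The function names (`loadSettings`, `publicSettings`) and the hardcoded values are assumptions for illustration, not YuHeng's actual code:

```typescript
// Hypothetical sketch of server-side key handling.
interface StoredSettings {
  provider: string;
  apiKey: string;
  model: string;
}

function loadSettings(): StoredSettings {
  // A real implementation would read persisted app settings;
  // hardcoded here to keep the sketch self-contained.
  return { provider: "openai", apiKey: "server-only-key", model: "some-model" };
}

// The view of the settings that may be sent to the browser:
// identical to the stored settings, minus the API key.
function publicSettings(s: StoredSettings): Omit<StoredSettings, "apiKey"> {
  const { apiKey, ...rest } = s;
  return rest;
}
```

With this split, the key exists only in server memory for the lifetime of a request; the Settings page UI round-trips the redacted view.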