
eron

eron is a lightweight iOS AI chat client for Ollama. Connect your endpoint with an API key and chat with your available models.

What is eron?

Eron is a lightweight AI client for using your Ollama setup from iOS. It connects to a local or hosted Ollama endpoint and lets you chat with the models available on that endpoint.

The core purpose of eron is to give you fast, mobile access to Ollama models while keeping every request routed to an endpoint you specify and control.

Key Features

  • Connect to a local or hosted Ollama endpoint using your API key, so the app can reach your chosen backend.
  • Fast access to models configured in your Ollama setup, with the ability to switch between models.
  • Chat with any model that appears from the connected Ollama endpoint.
  • Upload and web search options: send images and documents as context, or enable web search for up-to-date answers.
  • Local processing with no data collection: requests go through your own Ollama setup, and the app uses no analytics or tracking services.
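The model picker described above presumably reflects whatever the connected endpoint reports. In the standard Ollama API, a client discovers installed models via `GET /api/tags`; the sketch below parses that response shape. The sample payload and model names are illustrative, not eron's actual implementation.

```python
import json

# Sketch: Ollama's GET /api/tags lists the models installed on an endpoint.
# A client can parse the "name" fields to populate a model picker.
# This sample response body is illustrative.
sample_response = json.dumps({
    "models": [
        {"name": "llama3.2:latest", "size": 2019393189},
        {"name": "qwen2.5-coder:7b", "size": 4683087332},
    ]
})

def model_names(tags_json: str) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json)["models"]]

print(model_names(sample_response))  # → ['llama3.2:latest', 'qwen2.5-coder:7b']
```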

How to Use eron

  1. Install eron from the iOS App Store.
  2. In the app, provide connection details for your Ollama endpoint and enter your API key.
  3. Select a model from the list shown by the connected endpoint.
  4. Start a chat. If needed, attach images or documents and/or enable web search for answers that rely on current information.
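For readers curious what steps 2-4 amount to on the wire, a standard Ollama chat call looks roughly like the sketch below. This is the public `/api/chat` request shape, not eron's internal code; the base URL, API key, and model name are placeholders, and the Bearer-token header is a common convention for hosted or proxied Ollama endpoints (a plain local install needs no key).

```python
import json

# Placeholder connection details; substitute your own endpoint and key.
OLLAMA_BASE_URL = "http://localhost:11434"  # local or hosted Ollama endpoint
API_KEY = "your-api-key"                    # commonly sent as a Bearer token

def build_chat_request(model: str, messages: list[dict]) -> tuple[str, dict, bytes]:
    """Return (url, headers, body) for a non-streaming Ollama /api/chat call."""
    url = f"{OLLAMA_BASE_URL}/api/chat"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    }
    body = json.dumps({
        "model": model,
        "messages": messages,   # [{"role": "user", "content": "..."}]
        "stream": False,        # ask for one JSON response instead of chunks
    }).encode("utf-8")
    return url, headers, body

url, headers, body = build_chat_request(
    "llama3.2", [{"role": "user", "content": "Hello!"}]
)
```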

Use Cases

  • Mobile chatting with your own Ollama models while away from your computer, using the same model lineup as your server.
  • Switching between models quickly for different tasks, using whichever models are available on your Ollama endpoint.
  • Sharing context with the model by sending images or documents directly from your iOS device.
  • Getting answers that use web search when you need more current information than the model alone can provide.
  • Keeping AI traffic within your control by directing all requests through your own Ollama setup and specifying the endpoint manually.

FAQ

  • Does eron send user messages to an external service?
    No. All user messages are transmitted only to the endpoint you enter manually, and that endpoint is operated and controlled by you.

  • Does eron use analytics or tracking?
    No. Eron uses no analytics or tracking services.

  • Can I use models from a local or hosted Ollama endpoint?
    Yes. Eron can connect to a local or hosted Ollama endpoint using your API key.

  • What content can I send to the model?
    You can send images and documents along with your messages.

  • Is web search available for up-to-date answers?
    Yes. Eron includes an option to enable web search.

Alternatives

  • Generic Ollama-compatible chat clients on iOS: Look for other mobile apps that act as Ollama frontends, offering model selection and chat against a configurable Ollama endpoint.
  • Desktop AI chat apps that connect to Ollama: Desktop clients can provide a richer workspace (e.g., document handling or multi-session views) while using the same local/hosted Ollama backend.
  • Web-based Ollama frontends: Browser apps can offer quick access and shared environments, with similar functionality (model switching, chat) driven by your configured Ollama endpoint.
  • Direct API integration (custom app or scripts): For teams that need tailored workflows, building against the Ollama API can replace a standalone client, at the cost of more setup and maintenance.