
OmniAI

OmniAI standardizes the APIs of multiple AI providers, including OpenAI's ChatGPT, Mistral's Le Chat, Anthropic's Claude, Google's Gemini, and DeepSeek's Chat.

Install / Use

/learn @ksylvest/Omniai
About this skill

Quality Score

0/100

Supported Platforms

Claude Code
Claude Desktop
Gemini CLI

README

OmniAI


OmniAI provides a unified Ruby API for integrating with multiple AI providers, including Anthropic, DeepSeek, Google, Mistral, and OpenAI. It streamlines AI development by offering a consistent interface for features such as chat, text-to-speech, speech-to-text, and embeddings—ensuring seamless interoperability across platforms. Switching between providers is effortless, making any integration more flexible and reliable.

📄 Examples

Example #1: 💬 Chat w/ Text

This example demonstrates using OmniAI with Anthropic to ask for a joke. The response is parsed and printed.

require 'omniai/anthropic'

client = OmniAI::Anthropic::Client.new

puts client.chat("Tell me a joke").text
Why don't scientists trust atoms? Because they make up everything!

Example #2: 💬 Chat w/ Prompt

This example demonstrates using OmniAI with Mistral to ask for the fastest animal. It includes a system and user message in the prompt. The response is streamed in real time.

require "omniai/mistral"

client = OmniAI::Mistral::Client.new

client.chat(stream: $stdout) do |prompt|
  prompt.system "Respond in both English and French."
  prompt.user "What is the fastest animal?"
end
**English**: The peregrine falcon is generally considered the fastest animal, reaching speeds of over 390 km/h.
**French**: Le faucon pèlerin est généralement considéré comme l'animal le plus rapide, atteignant des vitesses de plus de 390 km/h.

Example #3: 💬 Chat w/ Vision

This example demonstrates using OmniAI with OpenAI to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.

require "omniai/openai"

client = OmniAI::OpenAI::Client.new

CAT_URL = "https://images.unsplash.com/photo-1472491235688-bdc81a63246e?q=80&w=1024&h=1024&fit=crop&fm=jpg"
DOG_URL = "https://images.unsplash.com/photo-1517849845537-4d257902454a?q=80&w=1024&h=1024&fit=crop&fm=jpg"

client.chat(stream: $stdout) do |prompt|
  prompt.system("You are a helpful biologist with expertise in animals who responds with the Latin names.")
  prompt.user do |message|
    message.text("What animals are in the attached photos?")
    message.url(CAT_URL, "image/jpeg")
    message.url(DOG_URL, "image/jpeg")
  end
end
The first photo is of a cat, *Felis Catus*.
The second photo is of a dog, *Canis Familiaris*.

Example #4: 💬 Chat w/ Tools

This example demonstrates using OmniAI with Google to ask for the weather. Two tools are provided: a "Geocode" tool that resolves a location to latitude and longitude, and a "Weather" tool that accepts a latitude, longitude, and unit (Celsius or Fahrenheit) and returns the weather. The LLM makes multiple tool-call requests, each of which is automatically answered with a tool-call response, before the final result is streamed in real time.

require 'omniai/google'

client = OmniAI::Google::Client.new

class WeatherTool < OmniAI::Tool
  description "Lookup the weather for a lat / lng."

  parameter :lat, :number, description: "The latitude of the location."
  parameter :lng, :number, description: "The longitude of the location."
  parameter :unit, :string, enum: %w[Celsius Fahrenheit], description: "The unit of measurement."
  required %i[lat lng]

  # @param lat [Float]
  # @param lng [Float]
  # @param unit [String] "Celsius" or "Fahrenheit"
  #
  # @return [String] e.g. "20° Celsius at lat=43.7 lng=-79.4"
  def execute(lat:, lng:, unit: "Celsius")
    puts "[weather] lat=#{lat} lng=#{lng} unit=#{unit}"
    "#{rand(20..50)}° #{unit} at lat=#{lat} lng=#{lng}"
  end
end

class GeocodeTool < OmniAI::Tool
  description "Lookup the latitude and longitude of a location."

  parameter :location, :string, description: "The location to geocode."
  required %i[location]

  # @param location [String] "Toronto, Canada"
  #
  # @return [Hash] { lat: Float, lng: Float, location: String }
  def execute(location:)
    puts "[geocode] location=#{location}"

    {
      lat: rand(-90.0..+90.0),
      lng: rand(-180.0..+180.0),
      location:,
    }
  end
end

tools = [
  WeatherTool.new,
  GeocodeTool.new,
]

client.chat(stream: $stdout, tools:) do |prompt|
  prompt.system "You are an expert in weather."
  prompt.user 'What is the weather in "London" in Celsius and "Madrid" in Fahrenheit?'
end
[geocode] location=London
[weather] lat=... lng=... unit=Celsius
[geocode] location=Madrid
[weather] lat=... lng=... unit=Fahrenheit

The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.

For a set of pre-built tools for interacting with browsers, databases, Docker, and more, try the OmniAI::Tools project.

Example #5: 💬 Chat w/ History

Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent-like conversation experience. A prompt can be used to track this back-and-forth conversation:

require "omniai/openai"

puts("Type 'exit' or 'quit' to leave.")

client = OmniAI::OpenAI::Client.new

conversation = OmniAI::Chat::Prompt.build do |prompt|
  prompt.system "You are a helpful assistant. Respond in both English and French."
end

loop do
  print "> "
  text = gets.chomp.strip
  next if text.empty?
  break if text.eql?("exit") || text.eql?("quit")

  conversation.user(text)
  response = client.chat(conversation, stream: $stdout)
  conversation.assistant(response.text)
end

Example #6: 💬 Chat w/ Schema

Requesting structured data from an LLM is possible by defining a schema, then passing the schema into the chat. The following example defines a structured schema using OmniAI::Schema to model a Contact. The result of the LLM call is then parsed using the schema to ensure all types are correct.

require "omniai/openai"

client = OmniAI::OpenAI::Client.new

format = OmniAI::Schema.format(name: "Contact", schema: OmniAI::Schema.object(
  description: "A contact with a name, relationship, and addresses.",
  properties: {
    name: OmniAI::Schema.string,
    relationship: OmniAI::Schema.string(enum: %w[friend family]),
    addresses: OmniAI::Schema.array(
      items: OmniAI::Schema.object(
        title: "Address",
        description: "An address with street, city, state, and zip code.",
        properties: {
          street: OmniAI::Schema.string,
          city: OmniAI::Schema.string,
          state: OmniAI::Schema.string,
          zip: OmniAI::Schema.string,
        },
        required: %i[street city state zip]
      )
    ),
  },
  required: %i[name]
))

response = client.chat(format:) do |prompt|
  prompt.user <<~TEXT
    Parse the following contact:

    NAME: George Harrison
    RELATIONSHIP: friend
    HOME: 123 Main St, Springfield, IL, 12345
    WORK: 456 Elm St, Springfield, IL, 12345
  TEXT
end

puts format.parse(response.text)
{
  name: "George Harrison",
  relationship: "friend",
  addresses: [
    { street: "123 Main St", city: "Springfield", state: "IL", zip: "12345" },
    { street: "456 Elm St", city: "Springfield", state: "IL", zip: "12345" },
  ]
}

Example #7: 🐚 CLI

The OmniAI gem also ships with a CLI to simplify quick tests.

# Chat

omniai chat "Who designed the Ruby programming language?"
omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"

# Text to Speech

omniai speak "Sally sells sea shells by the sea shore." > ./files/audio.wav

# Speech to Text

omniai transcribe "./files/audio.wav"

# Embed

omniai embed "What is the capital of France?"

Example #8: 🔈 Text-to-Speech

This example demonstrates using OmniAI with OpenAI to convert text to speech and save it to a file.

require 'omniai/openai'

client = OmniAI::OpenAI::Client.new

File.open(File.join(__dir__, 'audio.wav'), 'wb') do |file|
  client.speak('Sally sells seashells by the seashore.', format: OmniAI::Speak::Format::WAV) do |chunk|
    file << chunk
  end
end

Example #9: 🎤 Speech-to-Text

This example demonstrates using OmniAI with OpenAI to convert speech to text.

require 'omniai/openai'

client = OmniAI::OpenAI::Client.new

File.open(File.join(__dir__, 'audio.wav'), 'rb') do |file|
  transcription = client.transcribe(file)
  puts(transcription.text)
end

Example #10: 💻 Embeddings

This example demonstrates using OmniAI with Mistral to generate embeddings for a dataset. It defines a set of entries to embed.
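The original example code is not shown above. As a rough sketch of how embeddings are typically compared, the snippet below ranks entries against a query by cosine similarity. It is pure Ruby: the `cosine_similarity` helper and the toy two-dimensional vectors are hypothetical stand-ins for real provider output (e.g. the array of floats returned by `client.embed(text).embedding`).

```ruby
# Cosine similarity between two embedding vectors (arrays of Floats).
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |x| x * x })
  dot / (norm_a * norm_b)
end

# Toy 2-dimensional "embeddings" standing in for real provider output:
query = [1.0, 0.0]
entries = {
  "ruby"   => [0.9, 0.1],
  "python" => [0.1, 0.9],
}

# Rank entries by similarity to the query vector.
best, = entries.max_by { |_, vector| cosine_similarity(query, vector) }
puts best # => "ruby"
```

With real embeddings the flow is the same: embed each entry once, embed the query, then pick the entry whose vector scores highest.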

GitHub Stars: 251
Category: Development
Updated: 10h ago
Forks: 13

Languages

Ruby

Security Score

100/100

Audited on Mar 27, 2026

No findings