me.kpavlov.aimocks:ai-mocks-gemini


Mock server for the Google Vertex AI Gemini API, built with Mokksy

Source: Maven
Version: 0.5.1
Maintainers: 1
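
To pull this version into a Gradle build, the coordinates above map onto a dependency declaration along these lines (Gradle Kotlin DSL; a test-scoped configuration is assumed here, adjust it to your project):

// build.gradle.kts
dependencies {
  // group, artifact and version as published on Maven Central
  testImplementation("me.kpavlov.aimocks:ai-mocks-gemini:0.5.1")
}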

Mokksy and AI-Mocks


Mokksy and AI-Mocks are mock HTTP and LLM (Large Language Model) servers inspired by WireMock, with support for response streaming and Server-Sent Events (SSE). They are designed for building, testing, and mocking LLM responses during development.

Mokksy


Mokksy is a mock HTTP server built with Kotlin and Ktor. It addresses the limitations of WireMock by supporting true SSE and streaming responses, making it particularly useful for integration testing LLM clients.

Core Features

  • Flexibility to control the server response directly via the Ktor ApplicationCall object.
  • Built with Kotest Assertions.
  • Fluent modern Kotlin DSL API.
  • Support for simulating streamed responses and Server-Sent Events (SSE), with delays between chunks.
  • Support for simulating response delays.

Example Usages

Responding with Predefined Responses

// given
val expectedResponse =
  // language=json
  """
    {
        "response": "Pong"
    }
    """.trimIndent()

mokksy.get {
  path = beEqual("/ping")
  containsHeader("Foo", "bar")
} respondsWith {
  body = expectedResponse
}

// when
val result = client.get("/ping") {
  headers.append("Foo", "bar")
}

// then
assertThat(result.status).isEqualTo(HttpStatusCode.OK)
assertThat(result.bodyAsText()).isEqualTo(expectedResponse)

POST Request

// given
val id = Random.nextInt()
val expectedResponse =
  // language=json
  """
    {
        "id": "$id",
        "name": "thing-$id"
    }
    """.trimIndent()

mokksy.post {
  path = beEqual("/things")
  bodyContains("\"$id\"")
} respondsWith {
  body = expectedResponse
  httpStatus = HttpStatusCode.Created
  headers {
    // type-safe builder style
    append(HttpHeaders.Location, "/things/$id")
  }
  headers += "Foo" to "bar" // list style
}

// when
val result =
  client.post("/things") {
    headers.append("Content-Type", "application/json")
    setBody(
      // language=json
      """
            {
                "id": "$id"
            }
            """.trimIndent(),
    )
  }

// then
assertThat(result.status).isEqualTo(HttpStatusCode.Created)
assertThat(result.bodyAsText()).isEqualTo(expectedResponse)
assertThat(result.headers["Location"]).isEqualTo("/things/$id")
assertThat(result.headers["Foo"]).isEqualTo("bar")

Server-Sent Events (SSE) Response

Server-Sent Events (SSE) is a technology that lets a server push updates to a client over a single, long-lived HTTP connection, enabling real-time updates without the client having to continuously poll the server for new data.

mokksy.post {
  path = beEqual("/sse")
} respondsWithSseStream {
  flow =
    flow {
      delay(200.milliseconds)
      emit(
        ServerSentEvent(
          data = "One",
        ),
      )
      delay(50.milliseconds)
      emit(
        ServerSentEvent(
          data = "Two",
        ),
      )
    }
}

// when
val result = client.post("/sse")

// then
assertThat(result.status)
  .isEqualTo(HttpStatusCode.OK)
assertThat(result.contentType())
  .isEqualTo(ContentType.Text.EventStream.withCharsetIfNeeded(Charsets.UTF_8))
assertThat(result.bodyAsText())
  .isEqualTo("data: One\r\ndata: Two\r\n")

AI-Mocks

AI-Mocks is a set of specialized mock server implementations (e.g., mocking the OpenAI API) built on top of Mokksy.

It supports mocking the following AI services:

Feature Support

Providers covered: OpenAI, Anthropic, Gemini, Ollama, A2A
Core features: Chat Completions, Streaming, Embeddings, Moderation
Additional APIs: Responses (OpenAI), Generate (Ollama), and the full A2A Protocol (11 endpoints)
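
As a rough sketch of what this artifact is for, the generic Mokksy DSL shown above can stub the Gemini generateContent route. The path and the simplified response body below merely follow the shape of the public Gemini REST API and are illustrative; the dedicated ai-mocks-gemini DSL may offer a higher-level way to set up the same expectation, so check the project documentation for the exact API.

// given: a trimmed-down generateContent response (shape follows the public Gemini REST API)
val geminiResponse =
  // language=json
  """
    {
        "candidates": [
            {
                "content": {
                    "role": "model",
                    "parts": [{ "text": "Pong" }]
                }
            }
        ]
    }
    """.trimIndent()

// the model name in the path is an illustrative placeholder
mokksy.post {
  path = beEqual("/v1beta/models/gemini-2.0-flash:generateContent")
} respondsWith {
  body = geminiResponse
}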

Enjoying LLM integration testing? ❤️

Buy me a Coffee

How to build

To build the project locally:

gradle build

or using Make:

make

Contributing

I do welcome contributions! Please see the Contributing Guidelines for details.

Package last updated on 12 Oct 2025
