LLM API Reference¶
Auto-generated API documentation for the LLM module.
iso8583sim.llm¶
LLM module public interface.
llm
¶
LLM-powered features for ISO 8583 message handling.
This module provides AI-powered tools for explaining and generating ISO 8583 messages using various LLM providers.
Features:

- MessageExplainer: Explain messages in plain English
- MessageGenerator: Generate messages from natural language

Supported Providers:

- Anthropic (Claude)
- OpenAI (GPT)
- Google (Gemini)
- Ollama (local models)
Example

```python
from iso8583sim.llm import MessageExplainer, MessageGenerator

# Explain a message
explainer = MessageExplainer()  # Auto-detects provider
print(explainer.explain(message))

# Generate a message
generator = MessageGenerator(provider="anthropic")
message = generator.generate("$100 VISA purchase")
```

Installation

```shell
# Install with all LLM providers
pip install iso8583sim[llm]

# Or install specific providers
pip install iso8583sim[anthropic]
pip install iso8583sim[openai]
pip install iso8583sim[google]
pip install iso8583sim[ollama]
```
LLMError
¶
Bases: Exception
Base exception for LLM-related errors.
LLMProvider
¶
Bases: ABC
Abstract base class for LLM providers.
All LLM providers must implement this interface to be compatible with MessageExplainer and MessageGenerator.
Example

```python
class MyProvider(LLMProvider):
    def complete(self, prompt, system=None):
        return "response"

    @property
    def name(self):
        return "my-provider"
```
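Because the interface is abstract, any object that implements `complete` and `name` can be plugged into `MessageExplainer` or `MessageGenerator`. A self-contained sketch with a hypothetical offline `EchoProvider` (not part of the library, but handy for unit tests) illustrates the contract:

```python
from __future__ import annotations

from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Minimal stand-in for the interface described above (sketch, not the library's code)."""

    @abstractmethod
    def complete(self, prompt: str, system: str | None = None) -> str: ...

    @property
    @abstractmethod
    def name(self) -> str: ...


class EchoProvider(LLMProvider):
    """Hypothetical offline provider: returns the prompt instead of calling an API."""

    def complete(self, prompt: str, system: str | None = None) -> str:
        return f"[{self.name}] {prompt}"

    @property
    def name(self) -> str:
        return "echo"


provider = EchoProvider()
print(provider.complete("Explain ISO 8583"))  # [echo] Explain ISO 8583
```

A stub like this lets the explainer and generator be tested without network access or API keys.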
complete
abstractmethod
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to the LLM and return the response text.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The LLM's response text |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/base.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Default implementation wraps complete(). Providers can override for more detailed metadata.
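The wrapping behaviour can be sketched as follows; the dataclass mirrors the `LLMResponse` signature shown in this reference, while `StubProvider` is a hypothetical provider used only for illustration:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class LLMResponse:
    content: str
    model: str
    provider: str
    usage: dict[str, int] | None = None


class StubProvider:
    """Hypothetical provider used only to illustrate the default wrapper."""

    name = "stub"
    model = "stub-model"

    def complete(self, prompt, system=None):
        return "ok"

    def complete_with_metadata(self, prompt, system=None):
        # Default behaviour: delegate to complete() and report no token usage.
        return LLMResponse(
            content=self.complete(prompt, system),
            model=self.model,
            provider=self.name,
            usage=None,
        )


resp = StubProvider().complete_with_metadata("hello")
print(resp.content, resp.provider)  # ok stub
```

Providers that receive token counts from their API can override this method to populate `usage`.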
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and metadata |
Source code in iso8583sim/llm/base.py
LLMResponse
dataclass
¶
LLMResponse(content: str, model: str, provider: str, usage: dict[str, int] | None = None)
Response from an LLM provider.
ProviderNotAvailableError
¶
ProviderNotAvailableError(provider: str, package: str)
Bases: LLMError
Raised when a provider's dependencies are not installed.
Source code in iso8583sim/llm/base.py
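An exception of this shape can carry an actionable install hint. A sketch (the message wording is assumed, not the library's exact text):

```python
class LLMError(Exception):
    """Base exception for LLM-related errors (mirrors the hierarchy above)."""


class ProviderNotAvailableError(LLMError):
    """Raised when a provider's optional dependency is missing."""

    def __init__(self, provider: str, package: str):
        self.provider = provider
        self.package = package
        super().__init__(
            f"Provider '{provider}' requires the '{package}' package. "
            f"Install it with: pip install {package}"
        )


try:
    raise ProviderNotAvailableError("anthropic", "anthropic")
except LLMError as exc:  # catching the base class covers all LLM errors
    print(exc)
```

Catching `LLMError` rather than the concrete subclass keeps calling code robust as new error types are added.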
MessageExplainer
¶
MessageExplainer(provider: LLMProvider | str | None = None)
Explains ISO 8583 messages in plain English using an LLM.
This class uses an LLM provider to generate human-readable explanations of ISO 8583 messages, making them easier to understand for developers and analysts.
Example

```python
explainer = MessageExplainer()  # Auto-detects provider
message = parser.parse("0100...")
print(explainer.explain(message))
# "This is a VISA authorization request for $100.00..."

# Or explain a raw message string
print(explainer.explain("0100702406C120E09000..."))

# Use a specific provider
explainer = MessageExplainer(provider="anthropic")
```
Initialize the MessageExplainer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `provider` | `LLMProvider \| str \| None` | LLM provider to use: an `LLMProvider` instance (used directly), a provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`), or `None` to auto-detect an available provider | `None` |
Source code in iso8583sim/llm/explainer.py
explain
¶
explain(message: ISO8583Message | str, verbose: bool = False) -> str
Explain an ISO 8583 message in plain English.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `message` | `ISO8583Message \| str` | ISO8583Message object or raw message string to explain | *required* |
| `verbose` | `bool` | If True, include more technical details | `False` |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the message |
Source code in iso8583sim/llm/explainer.py
explain_field
¶
explain_field(field_number: int, value: str) -> str
Explain a specific ISO 8583 field value.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `field_number` | `int` | The field number (e.g., 2, 39, 55) | *required* |
| `value` | `str` | The field value to explain | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the field and its value |
Source code in iso8583sim/llm/explainer.py
explain_error
¶
explain_error(error: str, message: ISO8583Message | str) -> str
Explain a validation or parsing error in context.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `error` | `str` | The error message to explain | *required* |
| `message` | `ISO8583Message \| str` | The message that caused the error | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the error and how to fix it |
Source code in iso8583sim/llm/explainer.py
explain_response_code
¶
explain_response_code(code: str) -> str
Explain an ISO 8583 response code.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `code` | `str` | The response code (e.g., "00", "51", "05") | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the response code |
Source code in iso8583sim/llm/explainer.py
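The codes cited above have well-known meanings in ISO 8583 (field 39), so a small lookup table makes a useful offline fallback when no LLM is configured. A sketch covering only the codes mentioned in this reference:

```python
# Standard ISO 8583 response codes (field 39) cited in the docstring above.
RESPONSE_CODES: dict = {
    "00": "Approved",
    "05": "Do not honor",
    "51": "Insufficient funds",
}


def describe_response_code(code: str) -> str:
    """Return a short human-readable meaning for a field 39 value."""
    return RESPONSE_CODES.get(code, f"Unknown response code '{code}'")


print(describe_response_code("00"))  # Approved
print(describe_response_code("51"))  # Insufficient funds
```

The LLM-backed `explain_response_code` adds context beyond such a static mapping, e.g. what a cardholder or merchant should do next.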
MessageGenerator
¶
MessageGenerator(provider: LLMProvider | str | None = None)
Generates ISO 8583 messages from natural language descriptions.
This class uses an LLM to interpret natural language descriptions and generate valid ISO 8583 messages.
Example

```python
generator = MessageGenerator()
message = generator.generate("$100 VISA purchase at a gas station")
print(message.mti)        # "0100"
print(message.fields[4])  # "000000010000"

# Generate without validation
message = generator.generate("refund $50", validate=False)
```
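The example output above shows how field 4 encodes $100 as `000000010000`: twelve numeric digits with two implied decimal places. A sketch of that encoding (the helper name is ours, not the library's):

```python
def format_amount(dollars: float) -> str:
    """Encode a dollar amount as ISO 8583 field 4: 12 digits, implied 2 decimals."""
    return f"{round(dollars * 100):012d}"


print(format_amount(100))    # 000000010000
print(format_amount(50.25))  # 000000005025
```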
Initialize the MessageGenerator.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `provider` | `LLMProvider \| str \| None` | LLM provider to use: an `LLMProvider` instance (used directly), a provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`), or `None` to auto-detect an available provider | `None` |
Source code in iso8583sim/llm/generator.py
generate
¶
generate(description: str, validate: bool = True) -> ISO8583Message
Generate an ISO 8583 message from a natural language description.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `description` | `str` | Natural language description of the desired message (e.g., "$100 VISA purchase at a gas station in NYC") | *required* |
| `validate` | `bool` | Whether to validate the generated message | `True` |

Returns:

| Type | Description |
|---|---|
| `ISO8583Message` | Valid ISO8583Message object |

Raises:

| Type | Description |
|---|---|
| `GenerationError` | If the message cannot be generated or validated |
Source code in iso8583sim/llm/generator.py
suggest_fields
¶
suggest_fields(partial_message: ISO8583Message) -> dict[int, str]
Suggest missing fields for a partial message.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `partial_message` | `ISO8583Message` | Partial ISO8583Message with some fields populated | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[int, str]` | Dictionary of suggested field values |
Source code in iso8583sim/llm/generator.py
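One way to consume the returned dictionary is to merge the suggestions into the partial message without overwriting fields that are already set. A sketch on plain dicts (the real `ISO8583Message` API may differ):

```python
def apply_suggestions(fields: dict, suggestions: dict) -> dict:
    """Fill gaps with suggested values; fields already present always win."""
    merged = dict(suggestions)
    merged.update(fields)
    return merged


partial = {2: "4111111111111111"}  # PAN already set by the caller
suggested = {2: "4000000000000002", 4: "000000010000"}
print(apply_suggestions(partial, suggested))
# {2: '4111111111111111', 4: '000000010000'}
```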
get_provider
¶
get_provider(name: str | None = None, **kwargs) -> LLMProvider
Get an LLM provider instance.
If name is provided, creates that specific provider. If name is None, auto-detects the first available provider.
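The selection logic can be sketched with a factory registry. Everything below is an assumption used for illustration (registry contents, fallback order, and error messages are not the library's); only the try-each-in-order shape is the point:

```python
from __future__ import annotations

from typing import Callable


def make_registry() -> dict:
    """Hypothetical registry: name -> zero-argument factory that raises if unavailable."""

    def unavailable() -> str:
        raise RuntimeError("not configured")  # e.g. missing API key

    return {
        "anthropic": unavailable,
        "ollama": lambda: "ollama",  # local server needs no key
    }


def get_provider(name: str | None = None) -> str:
    registry = make_registry()
    if name is not None:
        return registry[name]()  # explicit choice: let errors propagate
    for factory in registry.values():  # auto-detect: first one that works
        try:
            return factory()
        except Exception:
            continue
    raise RuntimeError("No LLM provider is available or configured")


print(get_provider())  # ollama
```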
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str \| None` | Provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`) or `None` for auto-detection | `None` |
| `**kwargs` |  | Additional arguments passed to the provider constructor | `{}` |

Returns:

| Type | Description |
|---|---|
| `LLMProvider` | An initialized LLMProvider instance |

Raises:

| Type | Description |
|---|---|
| `ProviderConfigError` | If no provider is available or configured |
Example

```python
provider = get_provider()             # Auto-detect
provider = get_provider("anthropic")  # Specific provider
provider = get_provider("openai", model="gpt-4-turbo")
```
Source code in iso8583sim/llm/providers/__init__.py
list_available_providers
¶
list_available_providers() -> list[str]
List all available and configured providers.
Returns:

| Type | Description |
|---|---|
| `list[str]` | List of provider names that are installed and configured |
Source code in iso8583sim/llm/providers/__init__.py
list_installed_providers
¶
list_installed_providers() -> list[str]
List all installed providers (may not be configured).
Returns:

| Type | Description |
|---|---|
| `list[str]` | List of provider names that have their packages installed |
Source code in iso8583sim/llm/providers/__init__.py
iso8583sim.llm.base¶
Base classes and exceptions.
base
¶
Base classes for LLM providers.
This module defines the abstract base class for LLM providers, allowing multiple backend implementations (Anthropic, OpenAI, Google, Ollama).
LLMResponse
dataclass
¶
LLMResponse(content: str, model: str, provider: str, usage: dict[str, int] | None = None)
Response from an LLM provider.
LLMError
¶
Bases: Exception
Base exception for LLM-related errors.
ProviderNotAvailableError
¶
ProviderNotAvailableError(provider: str, package: str)
Bases: LLMError
Raised when a provider's dependencies are not installed.
Source code in iso8583sim/llm/base.py
LLMProvider
¶
Bases: ABC
Abstract base class for LLM providers.
All LLM providers must implement this interface to be compatible with MessageExplainer and MessageGenerator.
Example

```python
class MyProvider(LLMProvider):
    def complete(self, prompt, system=None):
        return "response"

    @property
    def name(self):
        return "my-provider"
```
complete
abstractmethod
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to the LLM and return the response text.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The LLM's response text |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/base.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Default implementation wraps complete(). Providers can override for more detailed metadata.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and metadata |
Source code in iso8583sim/llm/base.py
iso8583sim.llm.explainer¶
Message explanation using LLMs.
explainer
¶
Message Explainer using LLM to provide human-readable explanations.
MessageExplainer
¶
MessageExplainer(provider: LLMProvider | str | None = None)
Explains ISO 8583 messages in plain English using an LLM.
This class uses an LLM provider to generate human-readable explanations of ISO 8583 messages, making them easier to understand for developers and analysts.
Example

```python
explainer = MessageExplainer()  # Auto-detects provider
message = parser.parse("0100...")
print(explainer.explain(message))
# "This is a VISA authorization request for $100.00..."

# Or explain a raw message string
print(explainer.explain("0100702406C120E09000..."))

# Use a specific provider
explainer = MessageExplainer(provider="anthropic")
```
Initialize the MessageExplainer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `provider` | `LLMProvider \| str \| None` | LLM provider to use: an `LLMProvider` instance (used directly), a provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`), or `None` to auto-detect an available provider | `None` |
Source code in iso8583sim/llm/explainer.py
explain
¶
explain(message: ISO8583Message | str, verbose: bool = False) -> str
Explain an ISO 8583 message in plain English.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `message` | `ISO8583Message \| str` | ISO8583Message object or raw message string to explain | *required* |
| `verbose` | `bool` | If True, include more technical details | `False` |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the message |
Source code in iso8583sim/llm/explainer.py
explain_field
¶
explain_field(field_number: int, value: str) -> str
Explain a specific ISO 8583 field value.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `field_number` | `int` | The field number (e.g., 2, 39, 55) | *required* |
| `value` | `str` | The field value to explain | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the field and its value |
Source code in iso8583sim/llm/explainer.py
explain_error
¶
explain_error(error: str, message: ISO8583Message | str) -> str
Explain a validation or parsing error in context.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `error` | `str` | The error message to explain | *required* |
| `message` | `ISO8583Message \| str` | The message that caused the error | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the error and how to fix it |
Source code in iso8583sim/llm/explainer.py
explain_response_code
¶
explain_response_code(code: str) -> str
Explain an ISO 8583 response code.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `code` | `str` | The response code (e.g., "00", "51", "05") | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | Human-readable explanation of the response code |
Source code in iso8583sim/llm/explainer.py
iso8583sim.llm.generator¶
Message generation from natural language.
generator
¶
Message Generator using LLM to create ISO 8583 messages from natural language.
MessageGenerator
¶
MessageGenerator(provider: LLMProvider | str | None = None)
Generates ISO 8583 messages from natural language descriptions.
This class uses an LLM to interpret natural language descriptions and generate valid ISO 8583 messages.
Example

```python
generator = MessageGenerator()
message = generator.generate("$100 VISA purchase at a gas station")
print(message.mti)        # "0100"
print(message.fields[4])  # "000000010000"

# Generate without validation
message = generator.generate("refund $50", validate=False)
```
Initialize the MessageGenerator.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `provider` | `LLMProvider \| str \| None` | LLM provider to use: an `LLMProvider` instance (used directly), a provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`), or `None` to auto-detect an available provider | `None` |
Source code in iso8583sim/llm/generator.py
generate
¶
generate(description: str, validate: bool = True) -> ISO8583Message
Generate an ISO 8583 message from a natural language description.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `description` | `str` | Natural language description of the desired message (e.g., "$100 VISA purchase at a gas station in NYC") | *required* |
| `validate` | `bool` | Whether to validate the generated message | `True` |

Returns:

| Type | Description |
|---|---|
| `ISO8583Message` | Valid ISO8583Message object |

Raises:

| Type | Description |
|---|---|
| `GenerationError` | If the message cannot be generated or validated |
Source code in iso8583sim/llm/generator.py
suggest_fields
¶
suggest_fields(partial_message: ISO8583Message) -> dict[int, str]
Suggest missing fields for a partial message.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `partial_message` | `ISO8583Message` | Partial ISO8583Message with some fields populated | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[int, str]` | Dictionary of suggested field values |
Source code in iso8583sim/llm/generator.py
iso8583sim.llm.providers¶
Provider factory and utilities.
providers
¶
LLM provider factory and auto-detection.
This module provides a factory function to create LLM providers and auto-detect available providers based on installed packages and configured API keys.
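The "installed package plus configured API key" check can be sketched with the standard library. The helper name is ours, and the package/env-var pairing is an assumption drawn from the provider docs below:

```python
from __future__ import annotations

import importlib.util
import os


def provider_available(package: str, env_var: str | None) -> bool:
    """True if the provider's package is importable and its API key (if any) is set."""
    if importlib.util.find_spec(package) is None:
        return False
    return env_var is None or bool(os.environ.get(env_var))


# A stdlib module is always importable; a made-up package never is.
print(provider_available("json", None))                  # True
print(provider_available("surely_not_installed", None))  # False
```

For real providers the pairs would look like `("anthropic", "ANTHROPIC_API_KEY")` or `("ollama", None)`, since a local Ollama server needs no key.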
get_provider
¶
get_provider(name: str | None = None, **kwargs) -> LLMProvider
Get an LLM provider instance.
If name is provided, creates that specific provider. If name is None, auto-detects the first available provider.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str \| None` | Provider name (`'anthropic'`, `'openai'`, `'google'`, `'ollama'`) or `None` for auto-detection | `None` |
| `**kwargs` |  | Additional arguments passed to the provider constructor | `{}` |

Returns:

| Type | Description |
|---|---|
| `LLMProvider` | An initialized LLMProvider instance |

Raises:

| Type | Description |
|---|---|
| `ProviderConfigError` | If no provider is available or configured |
Example

```python
provider = get_provider()             # Auto-detect
provider = get_provider("anthropic")  # Specific provider
provider = get_provider("openai", model="gpt-4-turbo")
```
Source code in iso8583sim/llm/providers/__init__.py
list_available_providers
¶
list_available_providers() -> list[str]
List all available and configured providers.
Returns:

| Type | Description |
|---|---|
| `list[str]` | List of provider names that are installed and configured |
Source code in iso8583sim/llm/providers/__init__.py
list_installed_providers
¶
list_installed_providers() -> list[str]
List all installed providers (may not be configured).
Returns:

| Type | Description |
|---|---|
| `list[str]` | List of provider names that have their packages installed |
Source code in iso8583sim/llm/providers/__init__.py
iso8583sim.llm.providers.anthropic¶
Anthropic (Claude) provider.
anthropic
¶
Anthropic (Claude) LLM provider implementation.
AnthropicProvider
¶
AnthropicProvider(api_key: str | None = None, model: str | None = None, max_tokens: int | None = None)
Bases: LLMProvider
LLM provider using Anthropic's Claude API.
Example

```python
provider = AnthropicProvider()  # Uses ANTHROPIC_API_KEY env var
response = provider.complete("Explain ISO 8583")
print(response)

# Or with explicit API key
provider = AnthropicProvider(api_key="sk-ant-...")
```
Initialize the Anthropic provider.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `api_key` | `str \| None` | Anthropic API key. If not provided, uses the `ANTHROPIC_API_KEY` env var. | `None` |
| `model` | `str \| None` | Model to use. Defaults to `claude-sonnet-4-20250514`. | `None` |
| `max_tokens` | `int \| None` | Maximum tokens in response. Defaults to 4096. | `None` |

Raises:

| Type | Description |
|---|---|
| `ProviderNotAvailableError` | If the `anthropic` package is not installed. |
| `ProviderConfigError` | If no API key is available. |
Source code in iso8583sim/llm/providers/anthropic.py
complete
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to Claude and return the response.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The response text from Claude |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/providers/anthropic.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and usage metadata |
Source code in iso8583sim/llm/providers/anthropic.py
is_available
¶
is_available() -> bool
Check if Anthropic provider is available.
Returns:

| Type | Description |
|---|---|
| `bool` | True if the `anthropic` package is installed and an API key is configured. |
Source code in iso8583sim/llm/providers/anthropic.py
iso8583sim.llm.providers.openai¶
OpenAI (GPT) provider.
openai
¶
OpenAI (GPT) LLM provider implementation.
OpenAIProvider
¶
OpenAIProvider(api_key: str | None = None, model: str | None = None, max_tokens: int | None = None)
Bases: LLMProvider
LLM provider using OpenAI's GPT API.
Example

```python
provider = OpenAIProvider()  # Uses OPENAI_API_KEY env var
response = provider.complete("Explain ISO 8583")
print(response)

# Or with explicit API key
provider = OpenAIProvider(api_key="sk-...")
```
Initialize the OpenAI provider.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `api_key` | `str \| None` | OpenAI API key. If not provided, uses the `OPENAI_API_KEY` env var. | `None` |
| `model` | `str \| None` | Model to use. Defaults to `gpt-4o`. | `None` |
| `max_tokens` | `int \| None` | Maximum tokens in response. Defaults to 4096. | `None` |

Raises:

| Type | Description |
|---|---|
| `ProviderNotAvailableError` | If the `openai` package is not installed. |
| `ProviderConfigError` | If no API key is available. |
Source code in iso8583sim/llm/providers/openai.py
complete
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to GPT and return the response.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The response text from GPT |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/providers/openai.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and usage metadata |
Source code in iso8583sim/llm/providers/openai.py
is_available
¶
is_available() -> bool
Check if OpenAI provider is available.
Returns:

| Type | Description |
|---|---|
| `bool` | True if the `openai` package is installed and an API key is configured. |
Source code in iso8583sim/llm/providers/openai.py
iso8583sim.llm.providers.google¶
Google (Gemini) provider.
google
¶
Google (Gemini) LLM provider implementation.
GoogleProvider
¶
GoogleProvider(api_key: str | None = None, model: str | None = None)
Bases: LLMProvider
LLM provider using Google's Gemini API.
Example

```python
provider = GoogleProvider()  # Uses GOOGLE_API_KEY env var
response = provider.complete("Explain ISO 8583")
print(response)

# Or with explicit API key
provider = GoogleProvider(api_key="...")
```
Initialize the Google provider.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `api_key` | `str \| None` | Google API key. If not provided, uses the `GOOGLE_API_KEY` env var. | `None` |
| `model` | `str \| None` | Model to use. Defaults to `gemini-1.5-flash`. | `None` |

Raises:

| Type | Description |
|---|---|
| `ProviderNotAvailableError` | If the `google-generativeai` package is not installed. |
| `ProviderConfigError` | If no API key is available. |
Source code in iso8583sim/llm/providers/google.py
complete
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to Gemini and return the response.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The response text from Gemini |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/providers/google.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and metadata |
Source code in iso8583sim/llm/providers/google.py
is_available
¶
is_available() -> bool
Check if Google provider is available.
Returns:

| Type | Description |
|---|---|
| `bool` | True if the `google-generativeai` package is installed and an API key is configured. |
Source code in iso8583sim/llm/providers/google.py
iso8583sim.llm.providers.ollama¶
Ollama (local models) provider.
ollama
¶
Ollama (local) LLM provider implementation.
OllamaProvider
¶
OllamaProvider(model: str | None = None, host: str | None = None)
Bases: LLMProvider
LLM provider using local Ollama server.
Example

```python
provider = OllamaProvider()  # Uses llama3.2 on localhost
response = provider.complete("Explain ISO 8583")
print(response)

# Or with custom model and host
provider = OllamaProvider(model="mistral", host="http://192.168.1.100:11434")
```
Initialize the Ollama provider.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `str \| None` | Model to use. Defaults to `llama3.2`. | `None` |
| `host` | `str \| None` | Ollama server URL. Defaults to `http://localhost:11434`. | `None` |

Raises:

| Type | Description |
|---|---|
| `ProviderNotAvailableError` | If the `ollama` package is not installed. |
Source code in iso8583sim/llm/providers/ollama.py
complete
¶
complete(prompt: str, system: str | None = None) -> str
Send a prompt to Ollama and return the response.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The response text from Ollama |

Raises:

| Type | Description |
|---|---|
| `LLMError` | If the API call fails |
Source code in iso8583sim/llm/providers/ollama.py
complete_with_metadata
¶
complete_with_metadata(prompt: str, system: str | None = None) -> LLMResponse
Send a prompt and return response with metadata.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The user prompt to send | *required* |
| `system` | `str \| None` | Optional system prompt for context | `None` |

Returns:

| Type | Description |
|---|---|
| `LLMResponse` | LLMResponse with content and metadata |
Source code in iso8583sim/llm/providers/ollama.py
is_available
¶
is_available() -> bool
Check if Ollama provider is available.
Returns:

| Type | Description |
|---|---|
| `bool` | True if the `ollama` package is installed. Note: does not check whether the server is running. |
Source code in iso8583sim/llm/providers/ollama.py
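Since `is_available` does not verify that the Ollama server is actually running, a quick TCP reachability probe can close that gap before sending prompts. A sketch (not part of the library; the default port matches the `host` default above):

```python
import socket
from urllib.parse import urlparse


def server_reachable(host: str = "http://localhost:11434", timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to the Ollama host/port succeeds."""
    parsed = urlparse(host)
    try:
        with socket.create_connection((parsed.hostname, parsed.port or 11434), timeout=timeout):
            return True
    except OSError:
        return False


if server_reachable():
    print("Ollama server is up")
else:
    print("Ollama server is not reachable")
```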