PER.22.08.01.RFC1.7 - Monolith for AI-Assisted Investment Analysis
Document ID: PER.22.08.01.RFC1.7
Status: DRAFT
Date: 2025-06-18
Abstract
This document specifies the architecture for an AI-assisted investment analysis system. The core of the system is a unified Elixir monolith built with the Phoenix Framework and the Ash Framework. This monolith internally manages all workloads, including specialized AI/ML tasks powered by the ash_ai library. The system's Thesis Generation System (TGS) is implemented as a crew of stateful, autonomous AI agents orchestrated within the monolith. Each agent is an LLM-driven entity given a specific persona and a set of specialized Elixir-based "tools" to perform its function. Long-running research processes are managed as reliable, asynchronous background jobs using Oban, eliminating the need for external services or a separate event bus for core operations.
Table of Contents
- Abstract
- 1. Introduction
- 2. Terminology
- 3. System Architecture
- 4. Trading Platform & Brokerage Interaction
- 5. Infrastructure Specifications
- 6. Data Source Requirements
- 7. Data Processing Architecture
- 8. Investment Strategy & AI Model Implementation
- 9. Risk Management Protocol (User Guidance)
- 10. Security Considerations
- 11. Cost Analysis
- 12. Performance Metrics
- 13. Core Technologies and Libraries
- 14. References
- Appendix A: Thesis Generation System (TGS) Implementation Details
1. Introduction
1.1. Purpose
This RFC outlines the design for an AI-assisted investment analysis system built as a unified Elixir monolith. This approach simplifies the technology stack and operational overhead by leveraging the strengths of the BEAM, the Phoenix and Ash frameworks for core application logic, and the ash_ai and Oban libraries for sophisticated, asynchronous AI workloads. The specification details a Thesis Generation System (TGS) based on a crew of tool-equipped AI agents capable of performing deep, iterative, and verifiable research.
1.2. Guiding Principles
- Unified Monolith: A single, cohesive codebase simplifies development, testing, and deployment.
- Asynchronous Job-Based Communication: Long-running or complex tasks MUST be executed in background jobs (via Oban) to ensure the system remains responsive and reliable.
- Framework Power: Accelerate development with Phoenix for application structure, Ash for declarative resource modeling, ash_ai for AI agent implementation, and Oban for robust job processing.
- Tool-Based AI: AI agents SHALL be implemented as LLM-driven entities that are given, and can reason about, a discrete set of specialized tools (Elixir functions) to accomplish their goals.
- Deep Research & Verifiability: The TGS MUST move beyond simple data retrieval to a process of analysis, synthesis, and conflict resolution, with all claims traceable to their sources.
- Human-in-the-Loop: The user retains final investment decision-making authority.
1.3. Scope
This specification covers:
- The architecture of the unified Elixir/Phoenix monolith.
- The internal multi-agent architecture of the TGS, powered by ash_ai.
- Asynchronous workflow orchestration using Oban.
- Infrastructure requirements for the monolithic deployment.
Direct automated trade execution is out of scope.
2. Terminology
- Elixir Monolith: The primary Elixir application housing all logical services (DIS, SGS, TGS, TRS, NAS, CSMS).
- ash_ai: An Elixir library and Ash Framework extension for building AI features, including creating agents and defining tools for them to use.
- Oban: A robust background job processing library for Elixir, used to run the TGS workflow asynchronously.
- AI Agent: An LLM-driven entity defined within the application, given a specific persona (e.g., "You are a financial researcher") and a set of tools to achieve a goal.
- Tool: A specific Elixir function with a defined purpose, name, and argument schema that is exposed to an AI Agent to interact with the system or external APIs.
- Thesis Generation System (TGS): A logical service within the monolith that orchestrates a crew of AI agents to perform deep research and generate an investment thesis.
3. System Architecture
3.1. Overall Application Structure
The high-level architecture is a unified monolith. The complex TGS workflow is encapsulated within an asynchronous Oban job, which orchestrates the AI agent crew. All communication is handled via internal function calls and job enqueuing.
graph TD
    subgraph "Elixir Monolith"
        direction LR
        subgraph CoreServices["Core Services"]
            direction TB
            SGS["Signal Generation Service"]
            TRS["Thesis Review Service"]
            Other_Modules["DIS, NAS, CSMS"]
        end
        subgraph "TGS (Thesis Generation System)"
            direction TB
            Orchestrator["Oban Job Orchestrator"]
            AgentCrew["AI Agent Crew (ash_ai)"]
            Tools["Elixir-based Tools (API Clients, DB Access)"]
        end
        SGS -- "Enqueues Job" --> Orchestrator
        Orchestrator -- "Manages & Invokes" --> AgentCrew
        AgentCrew -- "Uses" --> Tools
        Tools -- "HTTP/DB Calls" --> External
        AgentCrew -- "Produces Thesis" --> TRS
    end
    subgraph External["External"]
        direction TB
        db["PostgreSQL Database"]
        ext_data["Data Sources & LLMs"]
    end
    CoreServices -- "DB Connection" --> db
    Tools -- "DB Access" --> db
3.2. Core Components
All logical services are implemented as modules within the Elixir monolith:
- Data Ingestion Service (DIS): Modules for polling basic market data.
- Signal Generation Service (SGS): Modules that perform technical analysis, identify investment candidates, and enqueue a TGS job in Oban.
- Thesis Generation System (TGS): An internal service, triggered by an Oban job, that manages a crew of
ash_ai-powered agents to conduct deep research. This service is detailed in Appendix A. - Thesis Review Service (TRS): Modules that receive the generated thesis, perform a critical review, and decide whether to notify the user.
- Notification and Action Service (NAS): Modules for formatting and sending user notifications.
- Configuration and State Management (CSMS): Managed via Ash resources.
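To make the CSMS item above concrete, the listing below sketches a single configuration setting modeled as an Ash resource. It assumes Ash 3.x with AshPostgres and a MyApp.CSMS domain; the module, table, and attribute names are illustrative rather than normative.
Listing (illustrative): A CSMS Setting Resource
# lib/my_app/csms/setting.ex (sketch; assumes Ash 3.x and a MyApp.CSMS domain)
defmodule MyApp.CSMS.Setting do
  use Ash.Resource,
    domain: MyApp.CSMS,
    data_layer: AshPostgres.DataLayer

  postgres do
    table "csms_settings"
    repo MyApp.Repo
  end

  attributes do
    uuid_primary_key :id
    # Namespaced configuration key, e.g. "sgs.scan_interval_minutes"
    attribute :key, :string, allow_nil?: false
    # Arbitrary configuration payload, stored as JSONB
    attribute :value, :map
    create_timestamp :inserted_at
    update_timestamp :updated_at
  end

  actions do
    defaults [:read, :destroy, create: :*, update: :*]
  end
end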
3.3. Communication and Data Flow
Communication is entirely internal to the monolith. There are no external network event buses between services.
- The workflow SHALL be initiated by the SGS enqueuing a job in Oban (see the sketch below).
- The Oban worker SHALL orchestrate the TGS agent crew.
- Data SHALL be passed between agents as standard Elixir terms (structs, maps, etc.).
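The enqueue step can be as small as the following sketch. The candidate map fields shown here (signal, score) are illustrative; MyApp.TGS.Job.new/1 and Oban.insert/1 are the standard Oban worker and insertion calls used by the worker in Appendix A.5.
# In the SGS, once a candidate has been identified (sketch; field names are illustrative)
candidate = %{topic: "AAPL", signal: "golden_cross", score: 0.82}

candidate
|> MyApp.TGS.Job.new()
|> Oban.insert()

# Note: Oban serializes job args to JSON, so the worker receives string keys
# (hence the %{"topic" => topic} pattern match in Appendix A.5).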
7. Data Processing Architecture
7.2. Asynchronous Job Workflow
The Oban job workflow defines the interaction between the services.
- SGS (Elixir) identifies a candidate and enqueues a TGS.Job with the candidate's details (e.g., Oban.insert(TGS.Job.new(%{topic: "AAPL"}))).
- An Oban worker picks up the job, starting the TGS process.
- The job invokes the TGS SupervisorAgent, which dynamically plans and delegates tasks to its crew of specialist agents.
- The agent crew collaborates, using their tools to conduct research and build the thesis.
- The job concludes by passing the final, cited report to the TRS (Elixir) for review.
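For this workflow to run, the :tgs_research queue used by the worker (Appendix A.5) must be declared in the Oban configuration. A minimal sketch, assuming an application named :my_app and the standard Oban setup:
# config/config.exs (sketch)
config :my_app, Oban,
  repo: MyApp.Repo,
  # Low concurrency: each job drives many long-running LLM calls
  queues: [tgs_research: 2],
  # Prune completed jobs after 7 days to keep the oban_jobs table small
  plugins: [{Oban.Plugins.Pruner, max_age: 60 * 60 * 24 * 7}]

# Oban itself is started in the application supervision tree:
# children = [MyApp.Repo, {Oban, Application.fetch_env!(:my_app, Oban)}, ...]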
10. Security Considerations
- API Key Management: All API keys for LLMs and data sources MUST be managed via environment variables, a runtime configuration library (e.g., Vapor), or an encrypted secrets store, and MUST NOT be committed to the codebase.
- Oban Dashboard Security: If the Oban Web UI is used in production, it MUST be secured behind an authentication and authorization layer (e.g., Plug.BasicAuth or an existing admin pipeline; see the router sketch after this list).
- Tool Safety: Tools provided to AI agents MUST be designed with safety in mind. Destructive operations (e.g., writing/deleting data) SHOULD require explicit user confirmation or be heavily restricted.
- Ethical Operation: Agents MUST be designed to operate ethically, including flagging biases, respecting website terms of service via their scraping tools, and ensuring rigorous citation.
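As a sketch of the Oban Dashboard Security item, the router fragment below protects a mounted Oban Web dashboard with HTTP Basic Auth via Plug.BasicAuth. It assumes Oban Web is installed, a standard Phoenix :browser pipeline exists, and credentials are provided through application config; a real deployment could reuse an existing admin authentication pipeline instead.
# In lib/my_app_web/router.ex (sketch)
import Plug.BasicAuth
import Oban.Web.Router

pipeline :oban_admins do
  # Credentials set in config, e.g. config :my_app, :oban_dashboard_auth, username: ..., password: ...
  plug :basic_auth, Application.compile_env(:my_app, :oban_dashboard_auth)
end

scope "/admin" do
  pipe_through [:browser, :oban_admins]

  # Mounts the Oban Web dashboard at /admin/oban
  oban_dashboard "/oban"
end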
11. Cost Analysis
- Monolith Infrastructure: A single container or PaaS instance (2-4 vCPUs, 2-4 GB RAM) is sufficient. ~$15-50/month.
- LLM/Data APIs: This remains the primary variable cost. For ~5 comprehensive analyses per month, this could be ~$20-50/month.
- Managed Database: Managed PostgreSQL instance. ~$15-30/month.
- Total Estimated Monthly Cost: $50 - $130 / month. (Note: NATS server cost is eliminated).
13. Core Technologies and Libraries
- Elixir/OTP: Latest stable version.
- Core Frameworks: phoenix, ash, ash_postgres.
- AI & Job Processing: ash_ai, oban.
- HTTP Client: req.
- Data Parsing: jason, floki (for HTML).
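A corresponding mix.exs dependency block might look like the following; the version requirements are illustrative only and SHOULD be pinned to whatever is current when the project is bootstrapped.
# mix.exs (sketch; version requirements are illustrative)
defp deps do
  [
    {:phoenix, "~> 1.7"},
    {:ash, "~> 3.0"},
    {:ash_postgres, "~> 2.0"},
    {:ash_ai, "~> 0.1"},
    {:oban, "~> 2.17"},
    {:req, "~> 0.5"},
    {:jason, "~> 1.4"},
    {:floki, "~> 0.36"}
  ]
end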
14. References
- [AshAI] AshAI Documentation https://ash-hq.org/docs/ash_ai
- [Oban] Oban Documentation https://getoban.pro/
Appendix A: Thesis Generation System (TGS) Implementation Details
A.1 Introduction
This appendix specifies the implementation of the TGS as an autonomous, tool-based multi-agent system powered by ash_ai. This architecture moves beyond simple prompt chaining to a more robust model where AI agents are given specific capabilities (tools) and can reason about how and when to use them. The entire system is orchestrated by a master Oban job within the monolith.
A.2 Architecture Overview: The Agent and the Tool
The core concept is the Tool-Using Agent. An agent is an LLM with a persistent persona and access to a predefined set of Elixir functions, known as "tools." The orchestration is managed by a top-level SupervisorAgent that delegates tasks to specialist agents in its crew.
graph TD
    A[Oban Job Starts] --> B(Invoke SupervisorAgent);
    B -- "Goal: 'Analyze X'" --> C{"SupervisorAgent Plans & Delegates"};
    C -- "Tool Call: delegate_task(:researcher, 'Find sources for X')" --> D{ResearchAgent};
    D -- "Tool Call: web_search('financial news for X')" --> E[Execute Tool: web_search];
    E --> F[API/DB];
    F --> E;
    E -- "Result: List of URLs" --> D;
    D -- "Result" --> C;
    C -- "Tool Call: delegate_task(:analyzer, 'Extract facts from URLs')" --> G{AnalysisAgent};
    G -- "Tool Calls: extract_facts(url)" --> H[Execute Tool: extract_facts];
    H -- "Result: Knowledge Graph" --> G;
    G -- "Result" --> C;
    C --> I{SynthesisAgent};
    I -- "Tool Call: write_report(graph)" --> J[Execute Tool: write_report];
    J -- "Result: Markdown Report" --> I;
    I -- "Final Report" --> A;
    A --> K[Job Complete];
A.3 The Agent Crew and Their Tools
The TGS is composed of AI agents defined using ash_ai. Each is given a persona and a list of available tools.
| Agent Name | Persona / Role | Key Tools (Elixir Functions) |
| --- | --- | --- |
| SupervisorAgent | The Orchestrator | delegate_task(agent_name, task_description) |
| ResearchAgent | Information Gatherer | web_search(query), scrape_url_content(url) |
| AnalysisAgent | The Thinker | extract_claims_from_text(text, source), build_knowledge_graph(claims) |
| SynthesisAgent | The Writer | write_report_from_graph(graph) |
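The SupervisorAgent's delegate_task tool is what turns this table into a working crew. A possible sketch is shown below; it relies on the hypothetical tool DSL specified in A.4 and the AshAi.Chat.chat/3 call used in A.5, so its exact shape will follow whatever those end up being.
# lib/my_app/tgs/tools/delegation.ex (sketch; depends on the hypothetical DSL from A.4/A.5)
defmodule MyApp.TGS.Tools.Delegation do
  use AshAi.Resource.Tool

  tool :delegate_task do
    description "Delegates a task to a named specialist agent and returns its result."
    argument :agent_name, :atom, "The specialist to use: :researcher, :analyzer, or :synthesis."
    argument :task_description, :string, "A precise description of the task to perform."

    run fn %{agent_name: agent_name, task_description: task}, _context ->
      # Start a nested chat with the specialist; its own tool-use loop runs to
      # completion and the final message content is handed back to the Supervisor.
      case AshAi.Chat.chat(MyApp.TGS.Agents, agent_name, task) do
        {:ok, %{content: result}} -> {:ok, result}
        {:error, reason} -> {:error, reason}
      end
    end
  end
end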
A.4 Defining an Agent and Its Tools
Agents and tools are defined using a DSL provided by ash_ai within an Ash resource. This makes them declarative and integrates them tightly with the rest of the application.
Listing: Defining the ResearchAgent and its Tools
# lib/my_app/tgs/agents.ex
defmodule MyApp.TGS.Agents do
  use Ash.Resource,
    extensions: [AshAi.Resource.Agent]

  # This resource is a registry for our agents
  agents do
    # Define the ResearchAgent
    agent :researcher do
      description "A financial research assistant that finds and scrapes web content."

      # The tools this agent is allowed to use, both defined in MyApp.TGS.Tools.Web
      tools [{MyApp.TGS.Tools.Web, :web_search}, {MyApp.TGS.Tools.Web, :scrape_url_content}]
    end

    # ... other agents (:supervisor, :analyzer, :synthesis) are defined similarly
  end
end

# lib/my_app/tgs/tools/web.ex
defmodule MyApp.TGS.Tools.Web do
  # This module contains functions that are exposed as "tools"
  use AshAi.Resource.Tool

  # Define the web_search tool
  tool :web_search do
    description "Searches the web for a given query using a financial news search engine."

    # Define the arguments the LLM must provide to call the tool
    argument :query, :string, "The search query to execute."

    # The Elixir code that runs when the tool is called
    run fn %{query: query}, _context ->
      # This calls the actual implementation, e.g., the Tavily API client
      MyApp.TGS.Researcher.find_sources(query)
    end
  end

  # Define the scrape_url_content tool
  tool :scrape_url_content do
    description "Scrapes the full text content from a given URL."
    argument :url, :string, "The URL to scrape."

    run fn %{url: url}, _context ->
      # Call a scraping library like Floki
      MyApp.TGS.Scraper.scrape(url)
    end
  end
end
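The run functions above delegate to plain Elixir modules. The following is a minimal sketch of those two modules using Req and Floki; the Tavily endpoint, request shape, and CSS selector are assumptions, not part of this specification.
# lib/my_app/tgs/researcher.ex (sketch)
defmodule MyApp.TGS.Researcher do
  # Assumed search endpoint; adapt to the chosen search provider
  @search_url "https://api.tavily.com/search"

  def find_sources(query) do
    response =
      Req.post!(@search_url,
        json: %{api_key: System.fetch_env!("TAVILY_API_KEY"), query: query, max_results: 5}
      )

    # Return a compact list of {title, url} pairs for the agent to reason over
    {:ok, Enum.map(response.body["results"] || [], &{&1["title"], &1["url"]})}
  end
end

# lib/my_app/tgs/scraper.ex (sketch)
defmodule MyApp.TGS.Scraper do
  def scrape(url) do
    html = Req.get!(url).body

    text =
      html
      |> Floki.parse_document!()
      # Crude content extraction; a production scraper needs more care
      |> Floki.find("article, p")
      |> Floki.text(sep: " ")

    {:ok, text}
  end
end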
A.5 Orchestration with Oban and the Supervisor
The Oban job's primary responsibility is to start the process by invoking the SupervisorAgent and to handle the final result. The supervisor AI itself then drives the workflow by deciding which specialist to delegate to next.
Listing: The Orchestrating Oban Job
# lib/my_app/tgs/job.ex
defmodule MyApp.TGS.Job do
  use Oban.Worker, queue: :tgs_research

  alias MyApp.TGS.Agents
  alias AshAi.Chat

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"topic" => topic}}) do
    # 1. Define the initial goal for the Supervisor
    initial_goal = "Generate a comprehensive investment thesis for #{topic}."

    # 2. Start a chat session with the SupervisorAgent.
    # The Supervisor AI will then use its `delegate_task` tool to call other agents.
    # AshAI handles the complex loop of the AI picking a tool, the system executing it,
    # and the result being fed back to the AI.
    case Chat.chat(Agents, :supervisor, initial_goal) do
      {:ok, %{content: final_report}} ->
        # The Supervisor returns the final report once all delegated tasks are complete
        MyApp.TRS.review_thesis(final_report)
        :ok

      {:error, reason} ->
        # The entire agentic workflow failed
        {:error, reason}
    end
  end
end
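One practical refinement, not specified above and therefore only a suggestion: LLM-driven research jobs are slow and costly, so it is worth bounding retries and de-duplicating identical topics. Oban supports both directly through worker options, e.g. by replacing the use line in MyApp.TGS.Job:
# Sketch: tightened worker options for expensive LLM-driven jobs (values are illustrative)
use Oban.Worker,
  queue: :tgs_research,
  # Avoid endlessly retrying a workflow that fails deterministically
  max_attempts: 3,
  # Skip duplicate jobs for the same topic within 24 hours
  unique: [period: 86_400, keys: [:topic]]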