Overview
The app turns your question into a structured mind map in one of two ways: a fast path (AI only, no web search) or a research path (orchestrator → web search → synthesis). Both paths end with a markdown outline that the frontend renders as an interactive map.
Pipeline
When "Research the Internet" is off, a single API call generates the outline. When it is on, three steps run in sequence:
Orchestrator (sub-questions) → Research (DDG + Wiki) → Synthesis (markdown outline)
The frontend then parses the markdown into a tree and draws the mind map (zoom, expand/collapse, outline view, search, export).
Fast path (Research off)
- You enter a question and click Generate mind map.
- The app calls POST /api/brainstorm with your question, depth level, breadth level, and research_web: false.
- The server uses the direct outline function: one AI (OpenAI) call builds a hierarchical markdown outline (headings #, ##, ###, …) from the question. No web search.
- The response returns outline_markdown and an empty sources list.
- The frontend parses the markdown into a tree and renders the mind map.
This path is faster and uses fewer tokens because there is no orchestrator or research step.
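The fast-path call can be sketched as below. Only the endpoint, the research_web flag, and the outline_markdown response field are named in this document; the depth/breadth field names and defaults are assumptions for illustration.

```typescript
// Hypothetical request/response shapes for the fast path.
interface BrainstormRequest {
  question: string;
  depth: number;         // assumed field name for "depth level"
  breadth: number;       // assumed field name for "breadth level"
  research_web: boolean; // false on the fast path
}

interface BrainstormResponse {
  outline_markdown: string;
  sources: unknown[]; // empty list on the fast path
}

function buildFastPathRequest(question: string, depth = 4, breadth = 4): BrainstormRequest {
  return { question, depth, breadth, research_web: false };
}

async function generateOutline(req: BrainstormRequest): Promise<BrainstormResponse> {
  const res = await fetch("/api/brainstorm", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`brainstorm failed: ${res.status}`);
  return res.json();
}
```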
Research path (Research on)
When you check Research the Internet, the app runs three stages:
1. Orchestrator
The app calls POST /api/brainstorm/plan with your question and Brainstorm Depth Level. The orchestrator (OpenAI) turns your question into N sub-questions (N = depth level, e.g. 4 by default, or up to 10 with Deep Internet Research). These sub-questions are used to search the web so the AI gets focused results per angle.
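A sketch of the planning call, assuming a JSON body of { question, depth } and a response containing a sub_questions array; those field names, and the exact clamping of N (4 by default, up to 10 with Deep Internet Research), are guesses based on the numbers above.

```typescript
// Assumption: without Deep Internet Research the count is capped at 4,
// with it the cap is 10, and at least one sub-question is always requested.
function clampSubQuestionCount(depthLevel: number, deepResearch: boolean): number {
  const max = deepResearch ? 10 : 4;
  return Math.max(1, Math.min(depthLevel, max));
}

async function planSubQuestions(
  question: string,
  depthLevel: number,
  deepResearch = false,
): Promise<string[]> {
  const res = await fetch("/api/brainstorm/plan", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question, depth: clampSubQuestionCount(depthLevel, deepResearch) }),
  });
  const data = await res.json();
  return data.sub_questions; // assumed response field name
}
```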
2. Research
The frontend (your browser) performs the search so the server never hits DuckDuckGo (reducing block risk). For each sub-query:
- DuckDuckGo: The app fetches the search results page via a CORS proxy, parses the HTML, and extracts title, URL, and snippet for each result. By default up to 4 sub-queries × 5 results each; with Deep Internet Research, up to 10 sub-queries × 12 results each. A short delay is used between requests to be gentle on the proxy.
- Wikipedia: For the main question, the app fetches one Wikipedia summary (intro text) via the public API (no key). This is merged with the DuckDuckGo results so the synthesis gets both search snippets and an encyclopedic overview.
All results are sent to the synthesis API as raw_results (grouped by sub-query).
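Merging the two sources into the raw_results payload might look like the sketch below. The grouped-by-sub-query shape and the idea of attaching the Wikipedia intro under the main question are assumptions; only the Wikipedia REST summary endpoint is a real public API.

```typescript
interface SearchResult { title: string; url: string; snippet: string; }

// Assumed payload shape: results keyed by the sub-query they answer.
type RawResults = Record<string, SearchResult[]>;

function mergeRawResults(
  ddgBySubQuery: RawResults,
  wikiSummary: SearchResult | null,
  mainQuestion: string,
): RawResults {
  const merged: RawResults = { ...ddgBySubQuery };
  if (wikiSummary) {
    // Attach the encyclopedic overview under the main question's group.
    merged[mainQuestion] = [...(merged[mainQuestion] ?? []), wikiSummary];
  }
  return merged;
}

// Wikipedia's public REST summary endpoint (no API key required).
function wikiSummaryUrl(title: string): string {
  return `https://en.wikipedia.org/api/rest_v1/page/summary/${encodeURIComponent(title)}`;
}
```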
3. Synthesis
The app calls POST /api/brainstorm/synthesize with your question, Brainstorm Breadth Level, and raw_results. The synthesis agent (OpenAI) reads the sub-questions and the web results (titles and snippets) and produces a single markdown outline (headings). Breadth level controls how many heading levels (1–30). The response includes outline_markdown and a sources list (title, URL, snippet) for the left-hand panel.
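Building the synthesis request body could look like the sketch below, where only the endpoint, raw_results, and the 1–30 breadth range come from this document; the breadth field name is an assumption.

```typescript
// Breadth level is documented as 1-30; clamp anything outside that range.
function clampBreadth(level: number): number {
  return Math.max(1, Math.min(level, 30));
}

// Assumed body shape for POST /api/brainstorm/synthesize.
function buildSynthesisBody(
  question: string,
  breadthLevel: number,
  rawResults: Record<string, unknown[]>,
) {
  return { question, breadth: clampBreadth(breadthLevel), raw_results: rawResults };
}
```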
What the frontend does
- Parse: The markdown outline is parsed into a tree (each heading becomes a node).
- Mind map: A custom SVG renders the tree left-to-right with curved links. Each node has an expand/collapse circle; links to children start from the right edge of that circle.
- Views: You can switch to Outline (text list) or Concept graph (force-directed network of the same nodes and links).
- Search: A search box filters and highlights matching nodes in the map and outline, and filters the Sources list.
- Export: PNG, PDF, HTML, or copy the raw markdown.
- Research Paper tab: Preview uses PaperPreview (react-markdown): section numbers for all heading levels (h1–h6), a Table of contents at the top with clickable links that scroll to each section, and body content indented 1rem to the right of each heading. Markdown tab is a raw textarea. Paper is generated via POST /api/brainstorm/paper with include_toc: false; the app always shows its own TOC in Preview.
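The parse step above (heading-only markdown into a tree of nodes) can be sketched as a small stack-based walk. This is a minimal illustration; the app's real parser may also carry body text, IDs, and other node metadata.

```typescript
interface MapNode { title: string; level: number; children: MapNode[]; }

// Parse a markdown outline where each heading (#..######) becomes a node.
// Non-heading lines are ignored in this sketch.
function parseOutline(markdown: string): MapNode {
  const root: MapNode = { title: "root", level: 0, children: [] };
  const stack: MapNode[] = [root];
  for (const line of markdown.split("\n")) {
    const m = /^(#{1,6})\s+(.*)$/.exec(line.trim());
    if (!m) continue;
    const node: MapNode = { title: m[2], level: m[1].length, children: [] };
    // Pop back to the nearest shallower heading, then attach.
    while (stack[stack.length - 1].level >= node.level) stack.pop();
    stack[stack.length - 1].children.push(node);
    stack.push(node);
  }
  return root;
}
```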
Session on load: The app first tries to restore the last session from the browser (localStorage): question, outline, sources, and paper. If none exists (first visit), it fetches the default question's cached mind map and paper from the server (GET /api/brainstorm/seed) for a quick first load with no API calls. If that fails, it falls back to a minimal built-in seed. The default question is "Opportunities and challenges of applying AI to business?"
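That three-step fallback (localStorage, then server seed, then built-in seed) can be sketched as below. The storage key, session shape, and seed response shape are assumptions; only the restore order and the seed endpoint come from this document.

```typescript
interface Session { question: string; outline: string; sources: unknown[]; paper: string; }

interface StringStore { getItem(key: string): string | null; }

const SESSION_KEY = "brainstorm:lastSession"; // hypothetical storage key

// Pure helper: turn a cached localStorage value into a Session, or null.
function parseCachedSession(raw: string | null): Session | null {
  if (!raw) return null;
  try {
    return JSON.parse(raw) as Session;
  } catch {
    return null; // corrupt cache falls through to the next source
  }
}

async function restoreSession(
  storage: StringStore,
  fetchSeed: () => Promise<Session>, // GET /api/brainstorm/seed in the real app
  builtinSeed: Session,
): Promise<Session> {
  const cached = parseCachedSession(storage.getItem(SESSION_KEY));
  if (cached) return cached;
  try {
    return await fetchSeed();
  } catch {
    return builtinSeed; // minimal built-in fallback
  }
}
```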
Mind map and research paper
When you click Generate mind map, the app builds the outline first (fast or research path), then automatically calls the paper API with that outline. The paper is generated from the same outline in one flow, so the structure matches the mind map. Both are stored in the browser. The Detail Mind Map tab shows the full paper as a mind map (headings and content as nodes).
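The outline-then-paper chaining might be wired as below; include_toc: false and the /api/brainstorm/paper endpoint are from this document, while the other field names are assumptions.

```typescript
// Assumed body shape for the paper call; include_toc: false is per the docs.
function buildPaperBody(question: string, outlineMarkdown: string) {
  return { question, outline_markdown: outlineMarkdown, include_toc: false };
}

// Chain: build the outline first, then feed the same outline to the paper
// endpoint so the paper's structure matches the mind map.
async function generateMapAndPaper(question: string, outlineMarkdown: string) {
  const res = await fetch("/api/brainstorm/paper", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildPaperBody(question, outlineMarkdown)),
  });
  if (!res.ok) throw new Error(`paper generation failed: ${res.status}`);
  return res.json();
}
```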
Summary
Fast path: Question → one AI call → outline → mind map. Research path: Question → orchestrator (sub-questions) → browser (DuckDuckGo + Wikipedia) → synthesis (outline from results) → mind map + sources. The frontend turns the final markdown into the interactive map. The outline is then expanded into a full paper automatically; both are saved. Detail Mind Map tab renders the paper as a navigable mind map.