Stabilize Lead Engine calendar logic (v1.4) and integrate GTM Architect, B2B Assistant, and Transcription Tool into Docker stack [30388f42]
107
RELOCATION.md
@@ -13,7 +13,9 @@ Diese Ports müssen auf der Firewall für den eingehenden Verkehr zur VM `10.10.

| **2222** | `gitea` | Intranet | Gitea Git via SSH. |
| **8003** | `connector-so` | **Public** | SuperOffice webhook receiver (SSL required!). |
| **5678** | `n8n` | **Public** | Automation webhooks. |
| **8004** | `lead-engine` | **Public** | Lead Engine API (for booking links). |
| **8094** | `gtm-architect` | Intranet | GTM Architect direct. |
| **8092** | `b2b-marketing` | Intranet | B2B Marketing Assistant direct. |
| **8001** | `transcription` | Intranet | Transcription Tool direct (via 8090). |

---
@@ -22,61 +24,82 @@ Diese Ports müssen auf der Firewall für den eingehenden Verkehr zur VM `10.10.

* **DNS resolver:** Configured in Nginx (`resolver 127.0.0.11`).
* **WebSockets:** The gateway supports `Upgrade` headers (critical for Streamlit/Lead Engine).
* **Echo prevention:** The connector (`worker.py`) identifies itself dynamically. No manual ID entries in `.env` are needed as long as `SO_CLIENT_ID` matches.
* **Routing:**
  * `/ce/` -> `company-explorer:8000`
  * `/lead/` -> `lead-engine:8501` (UI)
  * `/feedback/` -> `lead-engine:8004` (API)
  * `/gtm/` -> `gtm-architect:3005` (API/frontend)
  * `/b2b/` -> `b2b-marketing-assistant:3002` (API/frontend)
  * `/tr/` -> `transcription-tool:8001` (API/frontend) -> **Caution:** requires an explicit `rewrite` in Nginx!

---
# ⚠️ Critical Lessons (Update 08.03.2026)

The migration absolutely must follow these points to avoid another "total outage":

### 1. Database Schema & Volumes
**Problem:** Old `.db` files (backups) often lack columns that the current code expects.
**Fix:** After starting the containers on the new VM, the migration script **must** be run:
```bash
docker exec -it company-explorer python /app/fix_missing_columns.py
```
*This repairs the tables for companies, industries, and contacts (incl. unsubscribe tokens).*
**Rule:** Databases are **NEVER** mounted directly onto a host path anymore.
**Reason:** Permission errors and SQLite locks ("Database is locked") on network file systems (Synology/NFS).
**Approach:** Use named volumes (`explorer_db_data`, `connector_db_data`, `lead_engine_data`, `gtm_architect_data`, `b2b_marketing_data`, `transcription_uploads`).

**Data injection (the "rescue" command):** To get existing data into the new volumes:
```bash
# Example for the Company Explorer
docker cp ./my_local_backup.db company-explorer:/data/companies_v3_fixed_2.db
```
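The kind of repair `fix_missing_columns.py` performs can be sketched with `sqlite3` and `PRAGMA table_info` — a minimal illustration only; the actual table and column names in the project are assumptions here:

```python
import sqlite3

# Columns the current code expects per table. The names below are
# illustrative assumptions, not the real schema.
EXPECTED = {
    "contacts": {"unsubscribe_token": "TEXT"},
}

def add_missing_columns(conn: sqlite3.Connection) -> list[str]:
    """Add any expected column that an old backup .db is missing."""
    added = []
    for table, cols in EXPECTED.items():
        # PRAGMA table_info returns (cid, name, type, notnull, dflt, pk)
        existing = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
        for name, sqltype in cols.items():
            if name not in existing:
                conn.execute(f"ALTER TABLE {table} ADD COLUMN {name} {sqltype}")
                added.append(f"{table}.{name}")
    conn.commit()
    return added

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, email TEXT)")
    print(add_missing_columns(conn))  # -> ['contacts.unsubscribe_token']
```

Running it a second time is a no-op, which is why it is safe to execute after every restore.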
### 2. Lead Engine: Calendar Logic (v1.4)
* **Grid:** The system only offers appointments on a **15-minute grid** (:00, :15, :30, :45).
* **Spacing:** At least **3 hours** lie between two proposed slots.
* **AppOnly workaround:** The appointment is created in the calendar of `info@robo-planet.de`, and the employee (`e.melcer@`) is added as an attendee.
* **Streamlit proxy:** Streamlit (the Lead Engine UI) loses its connection under sub-paths (`/lead/`). The Dockerfile must pass `--server.baseUrlPath=/lead`, and Nginx must set `proxy_http_version 1.1` plus the `Upgrade` headers.
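The 15-minute grid and 3-hour spacing described above can be sketched in a few lines of Python (a simplified illustration, not the production scheduler):

```python
from datetime import datetime, timedelta

def snap_to_grid(dt: datetime) -> datetime:
    """Round up to the next 15-minute boundary (:00, :15, :30, :45)."""
    dt = dt.replace(second=0, microsecond=0)
    remainder = dt.minute % 15
    if remainder:
        dt += timedelta(minutes=15 - remainder)
    return dt

def pick_two_slots(candidates: list[datetime]) -> list[datetime]:
    """Return the first free slot plus the next one at least 3 hours later."""
    picked: list[datetime] = []
    for slot in candidates:
        if not picked:
            picked.append(slot)
        elif slot >= picked[0] + timedelta(hours=3):
            picked.append(slot)
            break
    return picked

print(snap_to_grid(datetime(2026, 3, 9, 10, 7)))  # -> 2026-03-09 10:15:00
```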
### 3. GTM Architect & B2B Assistant: Standalone Operation
* **Architecture:** Both apps use the "self-contained image" pattern. Code, frontend builds (`dist/`), and `node_modules` are baked into the image.
* **GTM port:** 3005 internal.
* **B2B port:** 3002 internal.
* **DB dependency:** The B2B Assistant strictly requires the file `market_db_manager.py` (copied from the repo root at build time).

### 4. Transcription Tool: FFmpeg & Routing
* **FFmpeg:** Must be present in the image (the build takes about 15 min on the Synology).
* **Paths:** The tool needs a `tsconfig.json` in the `frontend/` folder for the TypeScript build.
* **Nginx:** The `/tr/` path must be explicitly rewritten: `rewrite ^/tr/(.*) /$1 break;`.
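What that `rewrite` does — stripping the `/tr/` prefix before the request is proxied upstream — can be illustrated with the equivalent regex in Python:

```python
import re

def strip_tr_prefix(uri: str) -> str:
    """Mimics Nginx `rewrite ^/tr/(.*) /$1 break;` on a request path."""
    return re.sub(r"^/tr/(.*)", r"/\1", uri)

print(strip_tr_prefix("/tr/api/upload"))  # -> /api/upload
```

Paths that do not start with `/tr/` pass through unchanged, which matches the `break` semantics of only rewriting on a match.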
---

### 📂 Docker Volume Migration (the "Plug & Play" way)

To migrate the data (companies, leads, projects, audio files) without loss, the named volumes must be backed up.

**On the Synology (source):**
```bash
# Back up all critical volumes into archives
docker run --rm -v explorer_db_data:/data -v $(pwd):/backup alpine tar czf /backup/explorer_data.tar.gz -C /data .
docker run --rm -v lead_engine_data:/data -v $(pwd):/backup alpine tar czf /backup/lead_data.tar.gz -C /data .
docker run --rm -v gtm_architect_data:/data -v $(pwd):/backup alpine tar czf /backup/gtm_data.tar.gz -C /data .
docker run --rm -v b2b_marketing_data:/data -v $(pwd):/backup alpine tar czf /backup/b2b_data.tar.gz -C /data .
docker run --rm -v transcription_uploads:/data -v $(pwd):/backup alpine tar czf /backup/tr_uploads.tar.gz -C /data .
```

**On the Ubuntu VM (target):**
1. Create the volumes: `docker volume create explorer_db_data` (etc.)
2. Restore the data:
```bash
docker run --rm -v explorer_db_data:/data -v $(pwd):/backup alpine sh -c "cd /data && tar xzf /backup/explorer_data.tar.gz"
```
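The archive layout those one-liners produce (`-C /data .`, i.e. paths relative to the volume root) can be sanity-checked locally with Python's `tarfile` module. This sketch only mimics the pack/unpack cycle on ordinary directories; it does not touch Docker:

```python
import tarfile
import tempfile
from pathlib import Path

def roundtrip(src: Path, archive: Path, dst: Path) -> list[str]:
    """Pack src like `tar czf archive -C src .`, then unpack into dst."""
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src, arcname=".")  # store paths relative to the volume root
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dst)
    return sorted(p.name for p in dst.iterdir() if p.is_file())

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp, "data"); src.mkdir()
        (src / "companies.db").write_bytes(b"sqlite...")
        dst = Path(tmp, "restore"); dst.mkdir()
        print(roundtrip(src, Path(tmp, "vol.tar.gz"), dst))
```

Because the entries are relative, extraction into a freshly created volume lands the files at the volume root — which is exactly what the containers expect.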
---

### **Binding Migration Plan**

**Phase 1: Preparation**
1. [ ] **`git push`** on the Synology (latest state incl. the GTM integration).
2. [ ] **Back up the `.env` file** (check it for completeness!).
3. [ ] **Back up the volumes** (see above: create the `tar.gz` archives).

**Phase 2: Deployment on `docker1`**
1. [ ] Clone the repo: `git clone ... /opt/gtm-engine`
2. [ ] Copy `.env`.
3. [ ] **Restore the volumes** (BEFORE running `docker compose up`).
4. [ ] Start: `docker compose up -d --build`
5. [ ] **Schema check:** `docker exec -it company-explorer python /app/fix_missing_columns.py`

**Phase 3: Verification**
1. [ ] Check calendar read access: `docker exec lead-engine python /app/trading_twins/test_calendar_logic.py`
2. [ ] Check the GTM Architect: `https://10.10.81.2:8090/gtm/`
@@ -36,15 +36,20 @@ RUN pip install --no-cache-dir -r requirements.txt

# Copy the Node.js server and its production dependencies manifest
COPY b2b-marketing-assistant/server.cjs .
COPY b2b-marketing-assistant/package.json .
COPY helpers.py .
COPY config.py .
COPY market_db_manager.py .

# Install dependencies for the Node.js server
RUN npm install
RUN npm install express cors

# Copy the built React app from the builder stage
COPY --from=frontend-builder /app/dist ./dist

# Copy the main Python orchestrator script
COPY b2b-marketing-assistant/b2b_marketing_orchestrator.py .
COPY b2b-marketing-assistant/services ./services

# Expose the port the Node.js server will run on
EXPOSE 3002
@@ -18,3 +18,20 @@ View your app in AI Studio: https://ai.studio/apps/drive/1ZPnGbhaEnyhIyqs2rYhcPX

2. Set the `GEMINI_API_KEY` in the central `.env` file in the project's root directory.
3. Run the app:
   `npm run dev`

## Docker Deployment (Plug & Play)

The **B2B Marketing Assistant** is integrated into the central `docker-compose.yml`.

### Start Service
```bash
# Build and start
docker-compose up -d --build b2b-marketing-assistant
```

### Details
* **External Port:** `8092`
* **Subpath:** `/b2b/`
* **Persistence:** Project data is stored in the `b2b_marketing_data` Docker volume.
* **Base URL:** The frontend is served under the `/b2b/` prefix via Nginx.
@@ -23,6 +23,12 @@ services:
        condition: service_healthy
      lead-engine:
        condition: service_started
      gtm-architect:
        condition: service_started
      b2b-marketing-assistant:
        condition: service_started
      transcription-tool:
        condition: service_started

  # --- DASHBOARD ---
  dashboard:

@@ -33,6 +39,52 @@ services:
      - ./dashboard:/usr/share/nginx/html:ro

  # --- APPS ---
  transcription-tool:
    build:
      context: ./transcription-tool
      dockerfile: Dockerfile
    container_name: transcription-tool
    restart: unless-stopped
    ports:
      - "8001:8001"
    environment:
      GEMINI_API_KEY: "${GEMINI_API_KEY}"
      UPLOAD_DIR: "/app/uploads"
    volumes:
      - transcription_uploads:/app/uploads
      - ./Log_from_docker:/app/logs_debug

  b2b-marketing-assistant:
    build:
      context: .
      dockerfile: b2b-marketing-assistant/Dockerfile
    container_name: b2b-marketing-assistant
    restart: unless-stopped
    ports:
      - "8092:3002"
    environment:
      GEMINI_API_KEY: "${GEMINI_API_KEY}"
      PYTHONUNBUFFERED: "1"
    volumes:
      - b2b_marketing_data:/data
      - ./Log_from_docker:/app/logs_debug

  gtm-architect:
    build:
      context: .
      dockerfile: gtm-architect/Dockerfile
    container_name: gtm-architect
    restart: unless-stopped
    ports:
      - "8094:80"
    environment:
      GEMINI_API_KEY: "${GEMINI_API_KEY}"
      VITE_API_BASE_URL: "/gtm/api"
      GTM_DB_PATH: "/data/gtm_projects.db"
    volumes:
      - ./Log_from_docker:/app/logs_debug
      - gtm_architect_data:/data

  company-explorer:
    build:
      context: ./company-explorer

@@ -144,3 +196,6 @@ volumes:
  connector_db_data: {}
  explorer_db_data: {}
  lead_engine_data: {}
  gtm_architect_data: {}
  b2b_marketing_data: {}
  transcription_uploads: {}
@@ -59,7 +59,27 @@ Der **Meeting Assistant** ist eine leistungsstarke Suite zur Transkription und B

---

## 4. Docker Deployment (Plug & Play)

The **Meeting Assistant** is fully integrated into the central `docker-compose.yml`.

### Getting Started
```bash
# Build & start
docker-compose up -d --build transcription-tool

# Watch the logs
docker logs -f transcription-tool
```

### Configuration
* **Port:** Internally `8001`.
* **Persistence:** Audio uploads are stored in the named volume `transcription_uploads` (`/app/uploads` inside the container).
* **Routing:** The tool runs under the `/tr/` path. Nginx must strip the prefix: `rewrite ^/tr/(.*) /$1 break;`.

---

## 5. Roadmap

* **v0.7: Search:** Global search across all transcripts.
* **v0.8: Q&A against the meeting:** Lets you ask questions directly about the transcript ("What was decided on topic X?").
@@ -40,10 +40,10 @@ COPY gtm-architect/gtm_db_manager.py .

# Install Python and Node.js dependencies
RUN pip install --no-cache-dir -r requirements.txt
RUN npm install --force

# Expose the port the server will run on
EXPOSE 3005

# Command to run the server
CMD ["node", "server.cjs"]
@@ -18,3 +18,20 @@ View your app in AI Studio: https://ai.studio/apps/drive/1bvzSOz-NYMzDph6718RuAy

2. Set the `GEMINI_API_KEY` in [.env.local](.env.local) to your Gemini API key
3. Run the app:
   `npm run dev`

## Docker Deployment (Plug & Play)

The **GTM Architect** is fully integrated into the project's `docker-compose.yml`.

### Start Service
```bash
# Build and start
docker-compose up -d --build gtm-architect
```

### Technical Specs
* **External Port:** `8094`
* **Subpath:** `/gtm/`
* **Persistence:** Data is stored in the `gtm_architect_data` Docker volume.
* **Self-Contained:** The image includes the built frontend and all Node.js/Python dependencies.
27
gtm-architect/package-lock.json
generated
@@ -8,8 +8,9 @@
    "name": "roboplanet-gtm-architect",
    "version": "0.0.0",
    "dependencies": {
      "cors": "^2.8.6",
      "dotenv": "^17.3.1",
      "express": "^4.22.1",
      "lucide-react": "^0.562.0",
      "react": "^19.2.3",
      "react-dom": "^19.2.3",

@@ -1575,9 +1576,9 @@
      "license": "MIT"
    },
    "node_modules/cors": {
      "version": "2.8.6",
      "resolved": "https://registry.npmjs.org/cors/-/cors-2.8.6.tgz",
      "integrity": "sha512-tJtZBBHA6vjIAaF6EnIaq6laBBP9aq/Y3ouVJjEfoHbRBcHBAHYcMh/w8LDrk2PvIMMq8gmopa5D4V8RmbrxGw==",
      "license": "MIT",
      "dependencies": {
        "object-assign": "^4",

@@ -1585,6 +1586,10 @@
      },
      "engines": {
        "node": ">= 0.10"
      },
      "funding": {
        "type": "opencollective",
        "url": "https://opencollective.com/express"
      }
    },
    "node_modules/csstype": {

@@ -1665,6 +1670,18 @@
        "url": "https://github.com/sponsors/wooorm"
      }
    },
    "node_modules/dotenv": {
      "version": "17.3.1",
      "resolved": "https://registry.npmjs.org/dotenv/-/dotenv-17.3.1.tgz",
      "integrity": "sha512-IO8C/dzEb6O3F9/twg6ZLXz164a2fhTnEWb95H23Dm4OuN+92NmEAlTrupP9VW6Jm3sO26tQlqyvyi4CsnY9GA==",
      "license": "BSD-2-Clause",
      "engines": {
        "node": ">=12"
      },
      "funding": {
        "url": "https://dotenvx.com"
      }
    },
    "node_modules/dunder-proto": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
@@ -9,13 +9,14 @@
    "preview": "vite preview"
  },
  "dependencies": {
    "cors": "^2.8.6",
    "dotenv": "^17.3.1",
    "express": "^4.22.1",
    "lucide-react": "^0.562.0",
    "react": "^19.2.3",
    "react-dom": "^19.2.3",
    "react-markdown": "^10.1.0",
    "remark-gfm": "^4.0.0"
  },
  "devDependencies": {
    "@types/node": "^22.14.0",
@@ -11,7 +11,7 @@ const port = 3005;

// --- DATABASE INITIALIZATION ---
// Initialize the SQLite database on startup to ensure the 'gtm_projects' table exists.
const dbScript = path.join(__dirname, 'gtm_db_manager.py'); // CORRECTED PATH
console.log(`[Init] Initializing database via ${dbScript}...`);
const initProcess = spawn('python3', [dbScript, 'init']);
@@ -1,49 +1,38 @@

# Lead Engine: Multi-Source Automation v1.4 [31988f42]

## 🚀 Overview
The **Lead Engine** is a specialized module for autonomously processing B2B inquiries. It acts as a bridge between the e-mail inbox and the **Company Explorer**, generating highly personalized, "human expert level" draft replies within minutes.

## 🛠 Main Features

### 1. Intelligent E-Mail Ingest
* **Multi-source:** Monitors the `info@robo-planet.de` mailbox via the **Microsoft Graph API**.
* **Filter & routing:** Distinguishes inquiries from **TradingTwins** and the **contact form**.
* **Parsing:** Specialized HTML parsers extract structured data (company, contact, requirements).

### 2. Contact Research (LinkedIn Lookup)
* **Automation:** Uses **SerpAPI** and **Gemini 2.0 Flash** to look up the contact's professional role.
* **Result:** Identifies roles (e.g. "CFO") to adapt the tone of the reply.

### 3. Company Explorer Sync & Monitoring
* **Integration:** Automatically creates accounts and contacts in the CE.
* **Monitor:** A background process (`monitor.py`) tracks the analysis status.
* **Data pull:** Pulls the industry and dossier into the local lead database.

### 4. Expert Response Generator
* **AI engine:** Gemini 2.0 Flash drafts the reply e-mails.
* **Context:** Combines lead data + CE data + matrix arguments (pains/gains).

### 5. Trading Twins Autopilot (PRODUCTION v2.1)
The fully automated "zero touch" workflow for Trading Twins inquiries.

* **Human-in-the-loop:** Elizabeta Melcer receives a Teams message ("Approve/Deny").
* **Feedback server:** An integrated FastAPI server (port 8004) processes the clicks.
* **Direct Calendar Booking (micro-service):**
  * **Logic:** Checks `e.melcer`'s calendar for **real availability**.
  * **Grid:** Appointments start only on the **15-minute grid** (:00, :15, :30, :45).
  * **Spacing:** Offers two slots with roughly **3 hours** between them.
  * **Booking:** Clicking a link makes the server create an Outlook appointment from `info@`, with `e.melcer` as an attendee.
## 🏗 Architecture

@@ -52,63 +41,55 @@ Der vollautomatische "Zero Touch" Workflow für Trading Twins Anfragen.
```
├── app.py                     # Streamlit web interface
├── trading_twins_ingest.py    # E-mail importer (Graph API)
├── monitor.py                 # Monitor + trigger for the orchestrator
├── trading_twins/             # Autopilot module
│   ├── manager.py             # Orchestrator, FastAPI, Graph API logic
│   ├── test_calendar_logic.py # Internal test for calendar access
│   └── signature.html         # HTML signature
└── db.py                      # Local SQLite lead database
```
## 🚨 Lessons Learned & Critical Fixes

### 1. Microsoft Graph API: Calendar Access
* **Problem:** `debug_calendar.py` often failed with `Invalid parameter`.
* **Cause:** URL encoding of timestamps (`+` became a space) and microseconds (7 digits instead of 6).
* **Fix:** Use `requests(params=...)` and truncate the microseconds.
* **Endpoint:** `/users/{email}/calendar/getSchedule` (POST) is more robust than `/calendarView` (GET).
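The truncation fix from this lesson can be shown in isolation (a standalone sketch of the same idea used in `manager.py`; the helper name here is made up):

```python
from datetime import datetime

def parse_graph_timestamp(ts: str) -> datetime:
    """Parse a Graph timestamp, truncating 7-digit fractions to 6 digits
    so that datetime.fromisoformat accepts it."""
    ts = ts.replace("Z", "+00:00")
    if "." in ts:
        main, _, frac = ts.partition(".")
        tz = ""
        # Keep a trailing UTC offset (e.g. +00:00) out of the fraction
        for sep in ("+", "-"):
            if sep in frac:
                frac, _, tzpart = frac.partition(sep)
                tz = sep + tzpart
                break
        ts = f"{main}.{frac[:6]}{tz}"
    return datetime.fromisoformat(ts)

print(parse_graph_timestamp("2026-03-09T17:00:00.0000000"))  # -> 2026-03-09 17:00:00
```

Graph emits seven fractional digits (`.0000000`), one more than Python's parser tolerates, which is the whole reason this shim exists.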
### 2. Exchange AppOnly AccessPolicy (Booking Workaround)
* **Problem:** `Calendars.ReadWrite` often does not allow an app to create appointments in *other* users' calendars (`e.melcer@`): `403 Forbidden: Blocked by tenant configured AppOnly AccessPolicy settings`.
* **Fix:** The appointment is created in the service account's **own calendar** (`info@`), and the employee (`e.melcer@`) is added as an **attendee**. That bypasses the policy.

### 3. Docker Environment Variables
* **Problem:** Scripts inside the container could not find credentials even though they were in `.env`.
* **Fix:** Standalone scripts (`test_*.py`) need an explicit `load_dotenv`. In the main process (`manager.py`), `os.getenv` is enough as long as Docker Compose passes the variables through correctly.
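The real scripts use `python-dotenv`; as an illustration of what `load_dotenv` does before `os.getenv` can see the values, here is a minimal stdlib-only sketch (not a replacement for the library — it skips quoting, interpolation, and escaping):

```python
import os

def load_env_file(path: str, override: bool = True) -> dict[str, str]:
    """Minimal .env loader: KEY=VALUE lines, '#' comment lines ignored."""
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key, value = key.strip(), value.strip()
            if override or key not in os.environ:
                os.environ[key] = value
            loaded[key] = value
    return loaded
```

The `override=True` default mirrors the `load_dotenv(..., override=True)` call in `manager.py`: values from the file win over anything already in the environment.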
## 🚀 Getting Started (Docker)

The Lead Engine is integrated as a service in the central `docker-compose.yml`.

```bash
# Restart the service
docker-compose up -d --build --force-recreate lead-engine

# Manual test (internal)
docker exec lead-engine python /app/trading_twins/test_calendar_logic.py
```

**Access:** `https://floke-ai.duckdns.org/lead/` (password-protected)
**Feedback API:** `https://floke-ai.duckdns.org/feedback/` (public)

## 📝 Credentials (.env)

The following variables are strictly required in the central `.env`:

```env
# Info mailbox (App 1 - write)
INFO_Application_ID=...
INFO_Tenant_ID=...
INFO_Secret=...

# E.Melcer calendar (App 2 - read)
CAL_APPID=...
CAL_TENNANT_ID=...
CAL_SECRET=...

# URLs
TEAMS_WEBHOOK_URL=...
FEEDBACK_SERVER_BASE_URL=https://floke-ai.duckdns.org/feedback
```

---
*Documentation as of: 5 March 2026*
*Task: [31988f42]*
@@ -8,6 +8,10 @@ from threading import Thread, Lock
import uvicorn
from fastapi import FastAPI, Response, BackgroundTasks
import msal
from dotenv import load_dotenv

# Load environment variables from /app/.env
load_dotenv(dotenv_path="/app/.env", override=True)

# --- Timezone configuration ---
TZ_BERLIN = ZoneInfo("Europe/Berlin")
@@ -60,10 +64,15 @@ def check_calendar_availability():
        "availabilityViewInterval": 60  # Check availability in 1-hour blocks
    }

    url = f"{GRAPH_API_ENDPOINT}/users/{TARGET_EMAIL}/calendarView"
    params = {
        "startDateTime": start_time.isoformat(),
        "endDateTime": end_time.isoformat(),
        "$top": 5
    }

    try:
        response = requests.get(url, headers=headers, params=params)
        if response.status_code == 200:
            events = response.json().get("value", [])
            if not events:
@@ -75,6 +84,12 @@ def check_calendar_availability():
                subject = event.get('subject', 'No Subject')
                start = event.get('start', {}).get('dateTime')
                if start:
                    # Fix for 7-digit microseconds from Graph API (e.g. 2026-03-09T17:00:00.0000000)
                    if "." in start:
                        main_part, frac_part = start.split(".")
                        # Truncate the fraction to at most 6 digits
                        start = f"{main_part}.{frac_part[:6]}"

                    dt_obj = datetime.fromisoformat(start.replace('Z', '+00:00')).astimezone(TZ_BERLIN)
                    start_formatted = dt_obj.strftime('%A, %d.%m.%Y um %H:%M Uhr')
                else:
                    start_formatted = "N/A"
@@ -47,21 +47,66 @@ def get_access_token(client_id, client_secret, tenant_id):
    return result.get('access_token')

def get_availability(target_email, app_creds):
    print(f"DEBUG: Requesting availability for {target_email}")
    token = get_access_token(*app_creds)
    if not token:
        print("DEBUG: Failed to acquire access token.")
        return None

    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json", "Prefer": 'outlook.timezone="Europe/Berlin"'}
    start_time = datetime.now(TZ_BERLIN).replace(hour=0, minute=0, second=0, microsecond=0)
    end_time = start_time + timedelta(days=3)
    # Use 15-minute intervals for finer granularity
    payload = {"schedules": [target_email], "startTime": {"dateTime": start_time.isoformat()}, "endTime": {"dateTime": end_time.isoformat()}, "availabilityViewInterval": 15}

    try:
        url = f"{GRAPH_API_ENDPOINT}/users/{target_email}/calendar/getSchedule"
        r = requests.post(url, headers=headers, json=payload)
        print(f"DEBUG: API Status Code: {r.status_code}")

        if r.status_code == 200:
            view = r.json()['value'][0].get('availabilityView', '')
            print(f"DEBUG: Availability View received (Length: {len(view)})")
            return start_time, view, 15
        else:
            print(f"DEBUG: API Error Response: {r.text}")
    except Exception as e:
        print(f"DEBUG: Exception during API call: {e}")
    return None
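`getSchedule` returns `availabilityView` as one status digit per interval: with `availabilityViewInterval: 15`, character `i` describes the 15 minutes starting at `startTime + i * 15 min`. A small decoding sketch; the status map follows the Graph documentation, while the anchor date and the sample view string are made up:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TZ_BERLIN = ZoneInfo("Europe/Berlin")
# Digit meanings per the Graph getSchedule documentation
STATUS = {"0": "free", "1": "tentative", "2": "busy", "3": "out of office", "4": "working elsewhere"}

start = datetime(2030, 4, 1, 0, 0, tzinfo=TZ_BERLIN)  # fabricated anchor day
view = "2" * 36 + "0200"  # busy until 09:00, then a mixed hour
interval = 15

for i in range(36, 40):
    slot = start + timedelta(minutes=i * interval)
    print(f"{slot:%H:%M} -> {STATUS[view[i]]}")
```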
def find_slots(start, view, interval):
    """
    Parses the availability string: '0' = free, '2' = busy.
    Returns two free slots (start times) within business hours (09:00 - 16:30),
    excluding weekends (Sat/Sun), with approx. 3 hours distance between them.
    """
    slots = []
    first_slot = None

    # Iterate through the view string
    for i, status in enumerate(view):
        if status == '0':  # '0' means free
            slot_time = start + timedelta(minutes=i * interval)

            # Constraints:
            # 1. Mon-Fri only
            # 2. Business hours (09:00 - 16:30)
            # 3. Future only
            if slot_time.weekday() < 5 and (9 <= slot_time.hour < 17) and slot_time > datetime.now(TZ_BERLIN):
                # Max start time 16:30
                if slot_time.hour == 16 and slot_time.minute > 30:
                    continue

                if first_slot is None:
                    first_slot = slot_time
                    slots.append(first_slot)
                else:
                    # Second slot should be at least 3 hours after the first
                    if slot_time >= first_slot + timedelta(hours=3):
                        slots.append(slot_time)
                        break
    return slots

def create_calendar_invite(lead_email, company, start_time):
    catchall = os.getenv("EMAIL_CATCHALL")
    lead_email = catchall if catchall else lead_email
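The slot picker can be exercised offline with a synthetic view string. The sketch below inlines a condensed copy of the selection rules (free digits only, Mon-Fri, 09:00-16:30 start times, 3-hour gap) so it runs without the manager module; the fixed 2030 date is only there to keep the "future only" check deterministic:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TZ_BERLIN = ZoneInfo("Europe/Berlin")

def find_slots(start, view, interval):
    # Condensed copy of the committed logic: '0' = free, Mon-Fri,
    # 09:00-16:30 start times, second slot >= 3h after the first.
    slots, first_slot = [], None
    for i, status in enumerate(view):
        if status != '0':
            continue
        slot_time = start + timedelta(minutes=i * interval)
        if slot_time.weekday() >= 5 or not (9 <= slot_time.hour < 17):
            continue
        if slot_time <= datetime.now(TZ_BERLIN):
            continue
        if slot_time.hour == 16 and slot_time.minute > 30:
            continue
        if first_slot is None:
            first_slot = slot_time
            slots.append(first_slot)
        elif slot_time >= first_slot + timedelta(hours=3):
            slots.append(slot_time)
            break
    return slots

# Monday 2030-04-01, 15-minute intervals: busy until 09:00, free afterwards
start = datetime(2030, 4, 1, tzinfo=TZ_BERLIN)
view = "2" * 36 + "0" * 60
print([s.strftime("%a %H:%M") for s in find_slots(start, view, 15)])
# → ['Mon 09:00', 'Mon 12:00']
```

The 3-hour spacing rule shows up directly here: the first free quarter-hour at 09:00 is taken, then everything before 12:00 is skipped.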
88	lead-engine/trading_twins/test_calendar_logic.py	Normal file
@@ -0,0 +1,88 @@
# lead-engine/trading_twins/test_calendar_logic.py
import sys
import os
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
from dotenv import load_dotenv
import msal
import requests

# Load environment variables from the root .env
load_dotenv(dotenv_path="/app/.env", override=True)

# Adjust the path so that manager can be imported
sys.path.append('/app')

from trading_twins.manager import get_availability, find_slots

# Re-read the variables to verify what was actually loaded
CAL_APPID = os.getenv("CAL_APPID")
CAL_SECRET = os.getenv("CAL_SECRET")
CAL_TENNANT_ID = os.getenv("CAL_TENNANT_ID")

TZ_BERLIN = ZoneInfo("Europe/Berlin")
def test_internal():
    target = "e.melcer@robo-planet.de"
    print(f"🔍 Testing calendar logic for {target}...")

    # Debug token acquisition
    print("🔑 Authenticating with MS Graph...")
    authority = f"https://login.microsoftonline.com/{CAL_TENNANT_ID}"
    app_msal = msal.ConfidentialClientApplication(client_id=CAL_APPID, authority=authority, client_credential=CAL_SECRET)
    result = app_msal.acquire_token_silent([".default"], account=None)
    if not result:
        print("   ... fetching a new token ...")
        result = app_msal.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

    if "access_token" in result:
        print("✅ Token received.")
        token = result['access_token']
    else:
        print(f"❌ Token error: {result.get('error')}")
        print(f"❌ Description: {result.get('error_description')}")
        return
    # Debug API call
    print("📡 Querying the calendar...")
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json", "Prefer": 'outlook.timezone="Europe/Berlin"'}
    start_time = datetime.now(TZ_BERLIN).replace(hour=0, minute=0, second=0, microsecond=0)
    end_time = start_time + timedelta(days=3)

    payload = {
        "schedules": [target],
        "startTime": {"dateTime": start_time.isoformat(), "timeZone": "Europe/Berlin"},
        "endTime": {"dateTime": end_time.isoformat(), "timeZone": "Europe/Berlin"},
        "availabilityViewInterval": 15
    }

    try:
        url = f"https://graph.microsoft.com/v1.0/users/{target}/calendar/getSchedule"
        r = requests.post(url, headers=headers, json=payload)

        print(f"📡 API status: {r.status_code}")
        if r.status_code == 200:
            data = r.json()
            # print(f"DEBUG RAW: {data}")
            schedule = data['value'][0]
            view = schedule.get('availabilityView', '')
            print(f"✅ Availability (view length: {len(view)})")

            # Test slot finding
            slots = find_slots(start_time, view, 15)
            if slots:
                print(f"✅ {len(slots)} slots found:")
                for s in slots:
                    print(f"   📅 {s.strftime('%A, %d.%m.%Y um %H:%M')}")
            else:
                print("⚠️ No slots found (logic correct, but calendar full?)")
        else:
            print(f"❌ API error: {r.text}")

    except Exception as e:
        print(f"❌ Exception during API call: {e}")


if __name__ == "__main__":
    test_internal()
@@ -50,6 +50,37 @@ http {
        proxy_read_timeout 86400;
    }

    location /gtm/ {
        auth_basic "Restricted Access - Local AI Suite";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://gtm-architect:3005/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    location /b2b/ {
        auth_basic "Restricted Access - Local AI Suite";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://b2b-marketing-assistant:3002/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    location /tr/ {
        auth_basic "Restricted Access - Local AI Suite";
        auth_basic_user_file /etc/nginx/.htpasswd;
        rewrite ^/tr/(.*) /$1 break;
        proxy_pass http://transcription-tool:8001;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    # Feedback API (public)
    location /feedback/ {
        auth_basic off;
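The three new locations share the same `auth_basic` realm and `.htpasswd` file. A quick way to sanity-check credentials against the gateway is to build the `Authorization` header by hand; `admin:secret` and the gateway host below are placeholders, not the real deployment values:

```shell
# Compute the Basic auth header value nginx will check against .htpasswd
printf 'Authorization: Basic %s\n' "$(printf 'admin:secret' | base64)"
# Then probe a route, e.g.:
#   curl -u admin:secret -I http://<gateway-host>/gtm/
```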