diff --git a/.github/instructions/broken-access-control-prevention.instructions.md b/.github/instructions/broken-access-control-prevention.instructions.md
new file mode 100644
index 000000000..394e6cd3e
--- /dev/null
+++ b/.github/instructions/broken-access-control-prevention.instructions.md
@@ -0,0 +1,107 @@
+---
+applyTo: '**/*.py'
+---
+
+# Security: Broken Access Control Prevention
+
+## Critical Requirement
+
+**NEVER treat caller-supplied ids or stored active-scope settings as proof of authorization after login.**
+
+Treat all of the following as untrusted authorization inputs unless the code proves otherwise:
+
+- `conversation_id`, `message_id`, `document_id`, `file_id`, `approval_id`, `group_id`, and `public_workspace_id`
+- `activeGroupOid` and `activePublicWorkspaceOid` values loaded from user settings
+- Plugin or tool-call arguments such as `user_id`, `conversation_id`, `group_id`, `public_workspace_id`, `scope_id`, and `scope_type`
+
+## Preferred Safe Patterns
+
+Use these patterns by default:
+
+- Revalidate personal conversation ownership with `_authorize_personal_conversation_read(...)`, `_authorize_personal_conversation_access(...)`, or an explicit owner check before reading dependent data.
+- Route `activeGroupOid` writes through `update_active_group_for_user(...)`.
+- Route `activePublicWorkspaceOid` writes through `update_active_public_workspace_for_user(...)`.
+- Resolve active group scope through `require_active_group(...)` instead of raw settings reads in backend and plugin code.
+- Resolve active public workspace scope through `require_active_public_workspace(...)` instead of raw settings reads in backend and plugin code.
+- In Semantic Kernel plugins, normalize tool-call scope ids through `_resolve_authorized_scope_arguments(...)`, `_resolve_blob_location_with_fallback(...)`, or `_resolve_authorized_fact_memory_call(...)` before storage, blob, or Cosmos access.
+- Prefer request-scoped authorization context such as `g.authorized_chat_context` over raw tool arguments.
+
+## Disallowed Patterns For New Code
+
+Do not add new code that does any of the following without a reviewed exception:
+
+- Call `update_user_settings(...)` with a literal `{"activeGroupOid": ...}` payload outside `update_active_group_for_user(...)`
+- Call `update_user_settings(...)` with a literal `{"activePublicWorkspaceOid": ...}` payload outside `update_active_public_workspace_for_user(...)`
+- Read `activeGroupOid` or `activePublicWorkspaceOid` directly from raw settings in backend routes or plugins when a shared validator exists
+- Expose `user_id`, `conversation_id`, `group_id`, `public_workspace_id`, `scope_id`, or `scope_type` in a `@kernel_function` surface without immediately rebinding those values to the authorized request context
+- Read a personal conversation by request-derived `conversation_id` and continue to message, blob, or feedback work without an explicit ownership boundary
+
+## Safe Examples
+
+```python
+conversation_item = _authorize_personal_conversation_read(user_id, conversation_id)
+messages = list(
+ cosmos_messages_container.query_items(
+ query=query,
+ partition_key=conversation_item['id'],
+ )
+)
+```
+
+```python
+update_active_group_for_user(requested_active_group, user_id=user_id)
+active_group_id = require_active_group(user_id)
+```
+
+```python
+authorized_scope = self._resolve_authorized_fact_memory_call(
+ scope_type=scope_type,
+ scope_id=scope_id,
+ conversation_id=conversation_id,
+)
+```
+
+## Unsafe Examples
+
+```python
+update_user_settings(user_id, {'activeGroupOid': group_id})
+```
+
+```python
+active_group_id = settings.get('settings', {}).get('activeGroupOid')
+```
+
+```python
+@kernel_function(name='unsafe_tool')
+def unsafe_tool(self, user_id: str, conversation_id: str, group_id: str = ''):
+ return self.store.lookup(user_id=user_id, conversation_id=conversation_id, group_id=group_id)
+```
+
+```python
+conversation_item = cosmos_conversations_container.read_item(
+ item=conversation_id,
+ partition_key=conversation_id,
+)
+```
+
+## PR Review Checklist
+
+For any Python change that reads or mutates user, group, workspace, conversation, or plugin-scoped data:
+
+1. Identify every caller-controlled id that crosses into a data read or mutation.
+2. Revalidate ownership or membership at the sensitive operation boundary, not just at route entry.
+3. Use the dedicated active-scope validators instead of raw settings reads and writes.
+4. Rebind plugin scope parameters to the authorized request context before storage, blob, or Cosmos access.
+5. Add or update a regression test when the change touches an authorization boundary.
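
A minimal self-contained sketch of steps 1 and 2 (the store shape and helper name here are illustrative assumptions, not this repository's actual API): ownership is revalidated at the operation boundary, so a caller-supplied id alone never selects data.

```python
class NotAuthorizedError(Exception):
    """Raised when a caller-supplied id does not belong to the caller."""


def authorize_owned_conversation(store, user_id, conversation_id):
    # Revalidate ownership at the sensitive operation boundary, even if the
    # route already authenticated the user earlier in the request.
    item = store.get(conversation_id)
    if item is None or item.get('owner_id') != user_id:
        raise NotAuthorizedError(f'conversation {conversation_id!r} is not accessible')
    return item
```

Any read or mutation keyed by `conversation_id` would then use the returned item, never the raw id.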
+
+## Workflow Guardrail
+
+This repository includes a Development PR check in `.github/workflows/broken-access-control-check.yml` backed by `scripts/check_broken_access_control.py`.
+
+If a reviewed exception is unavoidable, add the suppression token below near the specific line and include a justification comment:
+
+```text
+bac-check: ignore
+```
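
As a hypothetical illustration of where the token sits (the helper below is a stub, and the exact placement the checker accepts is defined by `scripts/check_broken_access_control.py`):

```python
def update_user_settings(user_id, payload):
    # Stub standing in for the real settings helper; illustration only.
    return {'user_id': user_id, **payload}


# Reviewed legacy exception: one-time migration must write the raw payload.
# bac-check: ignore
result = update_user_settings('user-1', {'activeGroupOid': 'group-1'})
```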
+
+Use that escape hatch rarely. It is for reviewed legacy exceptions, not normal route or plugin code.
\ No newline at end of file
diff --git a/.github/instructions/xss-prevention.instructions.md b/.github/instructions/xss-prevention.instructions.md
new file mode 100644
index 000000000..aa9d156d0
--- /dev/null
+++ b/.github/instructions/xss-prevention.instructions.md
@@ -0,0 +1,133 @@
+---
+applyTo: '**/*.js, **/*.html, **/*.py'
+---
+
+# Security: XSS Prevention and Browser Rendering
+
+## Critical Requirement
+
+**NEVER pass untrusted data into browser HTML or JavaScript execution sinks without an explicit safe boundary.**
+
+Treat all of the following as untrusted unless the code proves otherwise:
+
+- User profile fields, workspace names, group names, agent names, document titles, filenames, tags, descriptions, emails, and ids
+- API response values returned from storage, Microsoft Graph, Cosmos DB, Azure AI Search, or any plugin/tool response
+- Markdown, rich text, uploaded text files, generated summaries, model output, and any server-returned error string
+
+## Preferred Safe Patterns
+
+Use these patterns by default:
+
+- Create DOM nodes with `document.createElement(...)`
+- Set untrusted text with `textContent`
+- Set trusted static classes with `className`
+- Use `setAttribute(...)` or `dataset` for inert data only when DOM node creation is not practical
+- Attach behavior with `addEventListener(...)`
+- Normalize dynamic HTTP links with a helper such as `sanitizeHttpUrl(...)` before assigning `href` or `src`
+- Sanitize rendered markdown with `DOMPurify.sanitize(marked.parse(...))` before inserting HTML
+- Keep static modal or card shells fully static, then populate untrusted fields with DOM APIs after creation
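
As a sketch of what a `sanitizeHttpUrl(...)`-style helper enforces (the repository's actual helper may differ in name and behavior): only absolute http(s) URLs pass through, and anything else collapses to an inert value.

```javascript
function sanitizeHttpUrl(rawUrl) {
  // Accept only absolute http(s) URLs; javascript:, data:, and malformed
  // or relative input all fall through to the inert default.
  try {
    const parsed = new URL(String(rawUrl));
    if (parsed.protocol === 'http:' || parsed.protocol === 'https:') {
      return parsed.href;
    }
  } catch (err) {
    // Unparseable input is treated the same as a disallowed scheme.
  }
  return '#';
}
```

Assigning `link.href = sanitizeHttpUrl(item.url)` then keeps `javascript:` payloads out of clickable attributes.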
+
+## Disallowed Patterns For New Code
+
+Do not add new code that does any of the following with untrusted values:
+
+- `innerHTML`, `outerHTML`, `insertAdjacentHTML`, or jQuery `.html(...)`
+- Inline event handlers such as `onclick=`, `onerror=`, `onload=`, or `setAttribute('onclick', ...)`
+- Dynamic interpolation into HTML attributes such as `href`, `src`, `title`, `style`, or `data-*`
+- `javascript:` URLs
+- `Markup(...)` in Python on untrusted content
+- Jinja `|safe` on untrusted content
+- `marked.parse(...)` output rendered without `DOMPurify.sanitize(...)`
+
+## Safe Examples
+
+### JavaScript
+
+```javascript
+const row = document.createElement('tr');
+const nameCell = document.createElement('td');
+nameCell.textContent = user.displayName || 'Unknown User';
+
+const actionButton = document.createElement('button');
+actionButton.type = 'button';
+actionButton.dataset.userId = user.id || '';
+actionButton.addEventListener('click', handleUserClick);
+
+row.appendChild(nameCell);
+row.appendChild(actionButton);
+```
+
+```javascript
+const renderedHtml = DOMPurify.sanitize(marked.parse(markdownText || ''));
+markdownContainer.innerHTML = renderedHtml;
+```
+
+### HTML / Jinja
+
+```html
+<!-- Safe: Jinja autoescapes expression output in HTML templates by default -->
+<td>{{ user.display_name }}</td>
+```
+
+### Python
+
+```python
+return render_template(
+ 'page.html',
+ title=page_title,
+ items=items,
+)
+```
+
+## Unsafe Examples
+
+```javascript
+row.innerHTML = `<td>${user.displayName}</td>`;
+```
+
+```javascript
+button.setAttribute('onclick', `selectUser('${user.id}', '${user.displayName}')`);
+```
+
+```html
+<a href="javascript:run('{{ user_input }}')">Run</a>
+```
+
+```python
+return Markup(user_supplied_html)
+```
+
+```html
+{{ user_supplied_html|safe }}
+```
+
+## Static HTML Shell Exception
+
+When a static HTML shell is genuinely simpler, it is acceptable only if:
+
+- The HTML string is fully static
+- It contains no `${...}` interpolation or dynamic concatenation
+- Untrusted values are populated afterward with `textContent`, `setAttribute(...)`, or `dataset`
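
A browser-oriented sketch of that exception (the function and field names are illustrative): the shell string is fully static, and untrusted values arrive afterward through inert DOM APIs.

```javascript
function renderUserCard(doc, user) {
  // The HTML string below is fully static: no ${...} interpolation and
  // no concatenation of caller-supplied values.
  const card = doc.createElement('div');
  card.innerHTML = '<span class="user-name"></span><button type="button">Open</button>';
  // Untrusted values are populated afterward with textContent and dataset.
  card.querySelector('.user-name').textContent = user.displayName || '';
  card.querySelector('button').dataset.userId = user.id || '';
  return card;
}
```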
+
+## PR Review Checklist
+
+For any JavaScript, HTML, or Python change that affects browser rendering:
+
+1. Identify the trust boundary for every value that reaches the browser.
+2. Prefer DOM node creation and `textContent` for untrusted text.
+3. Normalize dynamic URLs before assigning them to clickable or loadable attributes.
+4. If HTML rendering is required, document the sanitizer boundary explicitly.
+5. Add or update a regression test when untrusted data reaches a browser-rendering path.
+
+## Workflow Guardrail
+
+This repository includes a Development PR check in `.github/workflows/xss-sink-check.yml` backed by `scripts/check_xss_sinks.py`.
+
+If a reviewed exception is unavoidable, add the suppression token below near the specific line and include a justification comment:
+
+```text
+xss-check: ignore
+```
+
+Use that escape hatch rarely. It is for reviewed legacy exceptions, not for normal rendering code.
\ No newline at end of file
diff --git a/.github/workflows/broken-access-control-check.yml b/.github/workflows/broken-access-control-check.yml
new file mode 100644
index 000000000..33869b28f
--- /dev/null
+++ b/.github/workflows/broken-access-control-check.yml
@@ -0,0 +1,66 @@
+name: Broken Access Control Check
+
+on:
+ pull_request:
+ branches:
+ - Development
+ paths:
+ - 'application/**/*.py'
+ - 'scripts/check_broken_access_control.py'
+ - 'functional_tests/test_broken_access_control_guardrails_checker.py'
+ - '.github/workflows/broken-access-control-check.yml'
+ - '.github/instructions/broken-access-control-prevention.instructions.md'
+
+jobs:
+ broken-access-control-check:
+ runs-on: ubuntu-latest
+
+ steps:
+ - name: Checkout code
+ uses: actions/checkout@v4
+ with:
+ fetch-depth: 0
+
+ - name: Set up Python
+ uses: actions/setup-python@v5
+ with:
+ python-version: '3.11'
+
+ - name: Get changed Python files
+ id: changed-files
+ uses: tj-actions/changed-files@v46.0.1
+ with:
+ files_yaml: |
+ bac_surface:
+ - 'application/**/*.py'
+ bac_guardrails:
+ - 'scripts/check_broken_access_control.py'
+ - 'functional_tests/test_broken_access_control_guardrails_checker.py'
+ - '.github/workflows/broken-access-control-check.yml'
+ - '.github/instructions/broken-access-control-prevention.instructions.md'
+ - 'docs/explanation/features/v0.241.022/BROKEN_ACCESS_CONTROL_PR_GUARDRAILS.md'
+
+ - name: Run Broken Access Control validation
+ env:
+ CHANGED_BAC_FILES: ${{ steps.changed-files.outputs.bac_surface_all_changed_files }}
+ GITHUB_BASE_SHA: ${{ github.event.pull_request.base.sha }}
+ GITHUB_HEAD_SHA: ${{ github.sha }}
+ run: |
+ if [[ -z "$CHANGED_BAC_FILES" ]]; then
+ echo "No changed application files detected for Broken Access Control validation."
+ exit 0
+ fi
+
+ echo "Changed application files:"
+ printf '%s\n' "$CHANGED_BAC_FILES" | tr ' ' '\n'
+
+ python scripts/check_broken_access_control.py \
+ --base-sha "$GITHUB_BASE_SHA" \
+ --head-sha "$GITHUB_HEAD_SHA" \
+ $CHANGED_BAC_FILES
+
+ - name: Run Broken Access Control guardrail self-test (advisory)
+ if: steps.changed-files.outputs.bac_guardrails_any_changed == 'true'
+ continue-on-error: true
+ run: |
+ python functional_tests/test_broken_access_control_guardrails_checker.py
\ No newline at end of file
diff --git a/.github/workflows/xss-sink-check.yml b/.github/workflows/xss-sink-check.yml
new file mode 100644
index 000000000..938df6061
--- /dev/null
+++ b/.github/workflows/xss-sink-check.yml
@@ -0,0 +1,70 @@
+name: XSS Sink Check
+
+on:
+ pull_request:
+ branches:
+ - Development
+ paths:
+ - 'application/**/*.js'
+ - 'application/**/*.html'
+ - 'application/**/*.py'
+ - 'scripts/check_xss_sinks.py'
+ - 'functional_tests/test_xss_guardrails_checker.py'
+ - '.github/workflows/xss-sink-check.yml'
+ - '.github/instructions/xss-prevention.instructions.md'
+
+jobs:
+ xss-sink-check:
+ runs-on: ubuntu-latest
+
+ steps:
+ - name: Checkout code
+ uses: actions/checkout@v4
+ with:
+ fetch-depth: 0
+
+ - name: Set up Python
+ uses: actions/setup-python@v5
+ with:
+ python-version: '3.11'
+
+ - name: Get changed XSS-related files
+ id: changed-files
+ uses: tj-actions/changed-files@v46.0.1
+ with:
+ files_yaml: |
+ xss_surface:
+ - 'application/**/*.js'
+ - 'application/**/*.html'
+ - 'application/**/*.py'
+ xss_guardrails:
+ - 'scripts/check_xss_sinks.py'
+ - 'functional_tests/test_xss_guardrails_checker.py'
+ - '.github/workflows/xss-sink-check.yml'
+ - '.github/instructions/xss-prevention.instructions.md'
+ - 'docs/explanation/features/v0.241.021/XSS_PR_GUARDRAILS.md'
+
+ - name: Run XSS sink validation
+ env:
+ CHANGED_XSS_FILES: ${{ steps.changed-files.outputs.xss_surface_all_changed_files }}
+ GITHUB_BASE_SHA: ${{ github.event.pull_request.base.sha }}
+ GITHUB_HEAD_SHA: ${{ github.sha }}
+ run: |
+ if [[ -z "$CHANGED_XSS_FILES" ]]; then
+ echo "No changed application files detected for XSS sink validation."
+ exit 0
+ fi
+
+ echo "Changed application files:"
+ printf '%s\n' "$CHANGED_XSS_FILES" | tr ' ' '\n'
+
+ python scripts/check_xss_sinks.py \
+ --base-sha "$GITHUB_BASE_SHA" \
+ --head-sha "$GITHUB_HEAD_SHA" \
+ $CHANGED_XSS_FILES
+
+ - name: Run XSS guardrail self-test (advisory)
+ if: steps.changed-files.outputs.xss_guardrails_any_changed == 'true'
+ continue-on-error: true
+ run: |
+ python functional_tests/test_xss_guardrails_checker.py
\ No newline at end of file
diff --git a/.gitignore b/.gitignore
index 6fad4cde2..05e2a5ac0 100644
--- a/.gitignore
+++ b/.gitignore
@@ -52,3 +52,4 @@ nul
/artifacts/tmp
scripts/agent.json
scripts/me.json
+.github/instructions/python-venv-path.instructions.md
\ No newline at end of file
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
new file mode 100644
index 000000000..23af17953
--- /dev/null
+++ b/CONTRIBUTING.md
@@ -0,0 +1,103 @@
+# Contributing to SimpleChat
+
+This repository uses the standard `CONTRIBUTING.md` filename so GitHub can surface the guide automatically.
+The documentation-site copy lives at `docs/contributing.md`, and both files should stay aligned.
+
+## Contribution Flow
+
+SimpleChat contributions should be made through a fork-based workflow.
+
+1. Fork the repository.
+2. Clone your fork locally.
+3. Add the main SimpleChat repository as `upstream`.
+4. Create a new branch from the upstream `Development` branch.
+5. Make your changes in that new branch.
+6. Push the branch to your fork.
+7. Open a pull request from your fork branch back to the main SimpleChat repository's `Development` branch.
+
+Do not open contributor pull requests directly to `Staging` or `main`. The repository uses a staged promotion flow: `Development` -> `Staging` -> `main`.
+Use the branch names exactly as written here. In this repository, `Development` and `Staging` are capitalized.
+After a contribution is merged into `Development`, the SimpleChat team handles promotion forward.
+
+## Suggested Git Commands
+
+Use whatever Git workflow you prefer, but this is the expected starting point:
+
+```bash
+git clone <your-fork-url>
+cd simplechat
+git remote add upstream <upstream-repo-url>
+git fetch upstream
+git switch -c feature/my-change upstream/Development
+```
+
+When you are ready to publish your work:
+
+```bash
+git push -u origin feature/my-change
+```
+
+If your branch falls behind, sync it from `upstream/Development` before opening or updating the pull request.
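
If you use a rebase for that sync, the sequence typically looks like this (`feature/my-change` matches the example branch above; a merge-based update also works):

```bash
git fetch upstream
git rebase upstream/Development
git push --force-with-lease origin feature/my-change
```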
+
+## Local Development
+
+Before contributing, make sure you can run SimpleChat locally.
+
+Recommended local setup in VS Code uses a repo-local `.venv` with Python 3.12.
+
+From the repo root in PowerShell:
+
+```powershell
+py -3.12 -m venv .venv
+.\.venv\Scripts\Activate.ps1
+pip install --upgrade pip
+pip install -r application/single_app/requirements.txt
+Set-Location application/single_app
+$env:FLASK_DEBUG = "1"
+python app.py
+```
+
+For the full local workflow, environment guidance, and notes about Docker, WSL2, and Gunicorn validation, see:
+
+- [README.md](./README.md)
+- [docs/setup_instructions_manual.md](./docs/setup_instructions_manual.md)
+- [docs/explanation/running_simplechat_locally.md](./docs/explanation/running_simplechat_locally.md)
+
+## Pull Request Expectations
+
+Keep pull requests focused and easy to review.
+
+- Base your work on `Development`, not `main`.
+- Keep unrelated refactors out of the same pull request.
+- Explain what changed, why it changed, and how you tested it.
+- Include screenshots or short recordings for UI changes when helpful.
+- Call out any configuration, schema, security, or deployment impact.
+- Update documentation when user-facing behavior or setup steps change.
+
+## Tests and Validation
+
+Before opening a pull request, run the tests that match your change.
+
+- Add or update functional tests for bug fixes and new features when appropriate.
+- Run relevant tests from `functional_tests/` and `ui_tests/` when your change affects those areas.
+- If you change Flask routes, keep the existing Swagger route decorator pattern intact.
+
+Pull requests are reviewed by the SimpleChat team and go through repository validation. Depending on the files changed, that can include Python syntax checks, release-note validation, Swagger route validation, and additional maintainer review. Maintainers may also run additional security or AI-assisted review before merge.
+
+## Security and Repo Conventions
+
+- Never commit secrets, keys, or environment-specific credentials.
+- Review [SECURITY.md](./SECURITY.md) before submitting security-sensitive changes.
+- Follow the repository's existing structure and conventions instead of introducing broad cleanup changes.
+- If you use AI-assisted tooling while contributing, also review [CLAUDE.md](./CLAUDE.md) and [.github/copilot-instructions.md](./.github/copilot-instructions.md) for repo-specific guidance.
+
+## Need Help?
+
+If you are unsure about the right target branch or how to structure a change, open a draft pull request against `Development` and explain the question in the description. That gives the maintainers a concrete starting point for feedback.
diff --git a/README.md b/README.md
index 8e8f4599b..37aa452ae 100644
--- a/README.md
+++ b/README.md
@@ -12,6 +12,10 @@ The application utilizes **Azure Cosmos DB** for storing conversations, metadata
[Simple Chat Documentation | Simple Chat Documentation](https://microsoft.github.io/simplechat/)
+## Contributing
+
+See [CONTRIBUTING.md](./CONTRIBUTING.md) for the fork-based workflow, target branch guidance, and local development references for SimpleChat contributors.
+
## Quick Deploy
[Detailed deployment Guide](./deployers/bicep/README.md)
diff --git a/application/external_apps/bulkloader/requirements.txt b/application/external_apps/bulkloader/requirements.txt
index 2cee8a3b3..f8a4a0b0f 100644
--- a/application/external_apps/bulkloader/requirements.txt
+++ b/application/external_apps/bulkloader/requirements.txt
@@ -1,3 +1,3 @@
-requests==2.32.4
+requests==2.33.0
msal==1.31.0
python-dotenv==0.21.0
\ No newline at end of file
diff --git a/application/external_apps/databaseseeder/requirements.txt b/application/external_apps/databaseseeder/requirements.txt
index dfdf02ede..b0d23371b 100644
--- a/application/external_apps/databaseseeder/requirements.txt
+++ b/application/external_apps/databaseseeder/requirements.txt
@@ -1,3 +1,3 @@
-requests==2.32.4
+requests==2.33.0
msal==1.31.0
python-dotenv==0.21.0
diff --git a/application/single_app/app.py b/application/single_app/app.py
index fdbaee55d..422133651 100644
--- a/application/single_app/app.py
+++ b/application/single_app/app.py
@@ -603,6 +603,23 @@ def _is_idle_timeout_exempt(path):
return True
return any(path.startswith(prefix) for prefix in IDLE_TIMEOUT_EXEMPT_PREFIXES)
+
+def maybe_log_authenticated_browser_request():
+ """Record throttled login activity for authenticated browser page requests."""
+ if request.method != 'GET' or request.path.startswith('/api/'):
+ return
+
+ user_id = session.get('user', {}).get('oid') or session.get('user', {}).get('sub')
+ if not user_id:
+ return
+
+ maybe_log_authenticated_request_login(
+ user_id=user_id,
+ session_state=session,
+ request_path=request.path,
+ request_method=request.method
+ )
+
@app.before_request
def enforce_idle_session_timeout():
"""
@@ -646,6 +663,7 @@ def enforce_idle_session_timeout():
if should_refresh_last_activity:
session['last_activity_epoch'] = now_epoch
session.modified = True
+ maybe_log_authenticated_browser_request()
return None
idle_timeout_minutes, _ = get_idle_timeout_settings(request_settings)
@@ -698,6 +716,7 @@ def enforce_idle_session_timeout():
session['last_activity_epoch'] = now_epoch
session.modified = True
+ maybe_log_authenticated_browser_request()
return None
@app.after_request
@@ -933,8 +952,9 @@ def list_semantic_kernel_plugins():
# ------------------- API Thoughts Routes ----------------
register_route_backend_thoughts(app)
-# ------------------- Extenral Health Routes ----------
+# ------------------- External Health Routes ----------
register_route_external_health(app)
+register_no_auth_health(app)
if __name__ == '__main__':
debug_mode = os.environ.get("FLASK_DEBUG", "0") == "1"
diff --git a/application/single_app/config.py b/application/single_app/config.py
index 7196cfe80..30cb1b95f 100644
--- a/application/single_app/config.py
+++ b/application/single_app/config.py
@@ -94,7 +94,7 @@
EXECUTOR_TYPE = 'thread'
EXECUTOR_MAX_WORKERS = 30
SESSION_TYPE = 'filesystem'
-VERSION = "0.241.006"
+VERSION = "0.241.007"
SECRET_KEY = os.getenv('SECRET_KEY', 'dev-secret-key-change-in-production')
@@ -106,15 +106,16 @@
'Referrer-Policy': 'strict-origin-when-cross-origin',
'Content-Security-Policy': (
"default-src 'self'; "
- "script-src 'self' 'unsafe-inline' 'unsafe-eval'; "
+ "script-src 'self' 'unsafe-inline' 'unsafe-eval' https://cdn.jsdelivr.net; "
#"script-src 'self' 'unsafe-inline' 'unsafe-eval' https://cdn.jsdelivr.net https://code.jquery.com https://stackpath.bootstrapcdn.com; "
- "style-src 'self' 'unsafe-inline'; "
+ "style-src 'self' 'unsafe-inline' https://cdn.jsdelivr.net; "
#"style-src 'self' 'unsafe-inline' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; "
"img-src 'self' data: https: blob:; "
"font-src 'self'; "
#"font-src 'self' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; "
"connect-src 'self' https: wss: ws:; "
"media-src 'self' blob:; "
+ "frame-src 'self' blob:; "
"object-src 'none'; "
"frame-ancestors 'self'; "
"base-uri 'self';"
@@ -150,6 +151,7 @@ def get_allowed_extensions(enable_video=False, enable_audio=False):
Args:
enable_video: Whether video file support is enabled
+ enable_audio: Whether audio file support is enabled
Returns:
set: Allowed file extensions
"""
@@ -308,6 +310,18 @@ def get_redis_cache_infrastructure_endpoint(redis_hostname: str) -> str:
partition_key=PartitionKey(path="/conversation_id")
)
+cosmos_personal_workflows_container_name = "personal_workflows"
+cosmos_personal_workflows_container = cosmos_database.create_container_if_not_exists(
+ id=cosmos_personal_workflows_container_name,
+ partition_key=PartitionKey(path="/user_id")
+)
+
+cosmos_personal_workflow_runs_container_name = "personal_workflow_runs"
+cosmos_personal_workflow_runs_container = cosmos_database.create_container_if_not_exists(
+ id=cosmos_personal_workflow_runs_container_name,
+ partition_key=PartitionKey(path="/user_id")
+)
+
cosmos_group_conversations_container_name = "group_conversations"
cosmos_group_conversations_container = cosmos_database.create_container_if_not_exists(
id=cosmos_group_conversations_container_name,
@@ -320,6 +334,24 @@ def get_redis_cache_infrastructure_endpoint(redis_hostname: str) -> str:
partition_key=PartitionKey(path="/conversation_id")
)
+cosmos_collaboration_conversations_container_name = "collaboration_conversations"
+cosmos_collaboration_conversations_container = cosmos_database.create_container_if_not_exists(
+ id=cosmos_collaboration_conversations_container_name,
+ partition_key=PartitionKey(path="/id")
+)
+
+cosmos_collaboration_messages_container_name = "collaboration_messages"
+cosmos_collaboration_messages_container = cosmos_database.create_container_if_not_exists(
+ id=cosmos_collaboration_messages_container_name,
+ partition_key=PartitionKey(path="/conversation_id")
+)
+
+cosmos_collaboration_user_state_container_name = "collaboration_user_state"
+cosmos_collaboration_user_state_container = cosmos_database.create_container_if_not_exists(
+ id=cosmos_collaboration_user_state_container_name,
+ partition_key=PartitionKey(path="/user_id")
+)
+
cosmos_settings_container_name = "settings"
cosmos_settings_container = cosmos_database.create_container_if_not_exists(
id=cosmos_settings_container_name,
diff --git a/application/single_app/functions_activity_logging.py b/application/single_app/functions_activity_logging.py
index 987a3dec9..92c78dfb8 100644
--- a/application/single_app/functions_activity_logging.py
+++ b/application/single_app/functions_activity_logging.py
@@ -5,6 +5,7 @@
"""
import logging
+import time
import uuid
from datetime import datetime
from typing import Optional, Dict, Any
@@ -13,6 +14,101 @@
from config import cosmos_activity_logs_container
+def coerce_activity_log_user_id(user_id: Any) -> str:
+ """Extract a stable string user id from a scalar or session-style identity payload."""
+ if user_id is None:
+ return ''
+
+ if isinstance(user_id, str):
+ return user_id.strip()
+
+ if isinstance(user_id, dict):
+ for key in ('oid', 'sub', 'id', 'user_id'):
+ candidate = user_id.get(key)
+ if isinstance(candidate, str) and candidate.strip():
+ return candidate.strip()
+ return ''
+
+ return str(user_id).strip()
+
+
+USER_LOGIN_ACTIVITY_SESSION_KEY = 'last_user_login_activity_epoch'
+USER_LOGIN_ACTIVITY_MIN_INTERVAL_SECONDS = 15 * 60
+
+
+def _parse_session_epoch(session_state: Optional[dict], session_key: str) -> Optional[int]:
+ """Safely parse an epoch value stored in session state."""
+ if session_state is None:
+ return None
+
+ raw_epoch = session_state.get(session_key)
+ if raw_epoch is None:
+ return None
+
+ try:
+ return int(float(raw_epoch))
+ except (TypeError, ValueError):
+ return None
+
+
+def record_user_login_session_activity(
+ session_state: Optional[dict],
+ now_epoch: Optional[int] = None
+) -> Optional[int]:
+ """Persist the last time login activity was recorded for the current session."""
+ if session_state is None:
+ return None
+
+ resolved_epoch = int(now_epoch if now_epoch is not None else time.time())
+ session_state[USER_LOGIN_ACTIVITY_SESSION_KEY] = resolved_epoch
+
+ if hasattr(session_state, 'modified'):
+ session_state.modified = True
+
+ return resolved_epoch
+
+
+def maybe_log_authenticated_request_login(
+ user_id: str,
+ session_state: Optional[dict],
+ request_path: str,
+ request_method: str = 'GET',
+ now_epoch: Optional[int] = None,
+ login_method: str = 'authenticated_request',
+ min_interval_seconds: int = USER_LOGIN_ACTIVITY_MIN_INTERVAL_SECONDS
+) -> bool:
+ """
+ Log a throttled login-style activity for authenticated browser requests.
+
+ This captures passive SSO/session-based access that never re-enters the
+ explicit OAuth callback, while preventing per-request log spam.
+ """
+ if not user_id or session_state is None:
+ return False
+
+ normalized_method = (request_method or '').upper()
+ if normalized_method != 'GET':
+ return False
+
+ resolved_epoch = int(now_epoch if now_epoch is not None else time.time())
+ last_logged_epoch = _parse_session_epoch(session_state, USER_LOGIN_ACTIVITY_SESSION_KEY)
+ if last_logged_epoch is not None and (resolved_epoch - last_logged_epoch) < min_interval_seconds:
+ return False
+
+ log_user_login(
+ user_id,
+ login_method,
+ activity_details={
+ 'auth_signal': 'authenticated_request',
+ 'request_path': request_path,
+ 'request_method': normalized_method,
+ 'is_interactive_login': False
+ }
+ )
+ record_user_login_session_activity(session_state, resolved_epoch)
+ return True
+
+
def _get_email_domain(email: str) -> str:
"""Return only the email domain for low-sensitivity audit metadata."""
normalized_email = (email or '').strip()
@@ -1113,7 +1209,8 @@ def log_conversation_archival(
def log_user_login(
user_id: str,
- login_method: str = 'azure_ad'
+ login_method: str = 'azure_ad',
+ activity_details: Optional[Dict[str, Any]] = None
) -> None:
"""
Log user login activity to the activity_logs container.
@@ -1125,7 +1222,16 @@ def log_user_login(
try:
# Create login activity record
- import uuid
+ login_details = {
+ 'login_method': login_method,
+ 'success': True
+ }
+ if activity_details:
+ login_details.update({
+ key: value for key, value in activity_details.items()
+ if value is not None
+ })
+
login_activity = {
'id': str(uuid.uuid4()),
'user_id': user_id,
@@ -1133,11 +1239,12 @@ def log_user_login(
'login_method': login_method,
'timestamp': datetime.utcnow().isoformat(),
'created_at': datetime.utcnow().isoformat(),
- 'details': {
- 'login_method': login_method,
- 'success': True
- }
+ 'details': login_details
}
+
+ for key, value in login_details.items():
+ if key not in {'login_method', 'success'}:
+ login_activity[key] = value
# Save to activity_logs container
cosmos_activity_logs_container.create_item(body=login_activity)
@@ -1148,7 +1255,7 @@ def log_user_login(
extra=login_activity,
level=logging.INFO
)
- debug_print(f"✅ User login activity logged for user {user_id}")
+ debug_print(f"✅ User login activity logged for user {user_id} via {login_method}")
except Exception as e:
# Log error but don't break the login flow
@@ -1636,14 +1743,15 @@ def log_general_admin_action(
"""
try:
+ normalized_admin_user_id = coerce_activity_log_user_id(admin_user_id)
activity_record = {
'id': str(uuid.uuid4()),
- 'user_id': admin_user_id,
+ 'user_id': normalized_admin_user_id,
'activity_type': 'admin_action',
'timestamp': datetime.utcnow().isoformat(),
'created_at': datetime.utcnow().isoformat(),
'admin': {
- 'user_id': admin_user_id,
+ 'user_id': normalized_admin_user_id,
'email': admin_email
},
'action': action,
@@ -1670,7 +1778,7 @@ def log_general_admin_action(
log_event(
message=f"Error logging admin action: {str(e)}",
extra={
- 'admin_user_id': admin_user_id,
+ 'admin_user_id': normalized_admin_user_id,
'admin_email': admin_email,
'action': action,
'error': str(e)
diff --git a/application/single_app/functions_agent_scope.py b/application/single_app/functions_agent_scope.py
index 660647b9a..526c7d587 100644
--- a/application/single_app/functions_agent_scope.py
+++ b/application/single_app/functions_agent_scope.py
@@ -30,4 +30,18 @@ def scope_matches(candidate):
if selected_agent_name:
return next((agent for agent in agents_cfg if agent.get("name") == selected_agent_name and scope_matches(agent)), None)
- return None
\ No newline at end of file
+ return None
+
+
+def is_selected_agent_scope_enabled(settings, selected_agent_data):
+ """Return whether app settings allow the selected agent's scope."""
+ if not isinstance(selected_agent_data, dict):
+ return True
+
+ if selected_agent_data.get("is_group", False):
+ return bool((settings or {}).get("allow_group_agents", False))
+
+ if selected_agent_data.get("is_global", False):
+ return True
+
+ return bool((settings or {}).get("allow_user_agents", False))
\ No newline at end of file
diff --git a/application/single_app/functions_agent_templates.py b/application/single_app/functions_agent_templates.py
index f5cda8a3f..af566ef88 100644
--- a/application/single_app/functions_agent_templates.py
+++ b/application/single_app/functions_agent_templates.py
@@ -102,10 +102,32 @@ def _serialize_additional_settings(raw: Any) -> str:
return json.dumps(parsed, indent=2, sort_keys=True)
+def _normalize_actions_to_load(actions: Any, strict: bool = False) -> List[str]:
+ if actions in (None, ""):
+ return []
+ if not isinstance(actions, list):
+ if strict:
+ raise ValueError("actions_to_load must be an array of strings")
+ return []
+
+ cleaned: List[str] = []
+ for action in actions:
+ if isinstance(action, str):
+ trimmed = action.strip()
+ elif strict:
+ raise ValueError("actions_to_load entries must be strings")
+ else:
+ trimmed = str(action).strip()
+
+ if trimmed:
+ cleaned.append(trimmed)
+
+ return cleaned
+
+
def _sanitize_template(doc: Dict[str, Any], include_internal: bool = False) -> Dict[str, Any]:
cleaned = _strip_metadata(doc)
- cleaned.setdefault('actions_to_load', [])
- cleaned['actions_to_load'] = [a for a in cleaned['actions_to_load'] if a]
+ cleaned['actions_to_load'] = _normalize_actions_to_load(cleaned.get('actions_to_load'))
cleaned.setdefault('tags', [])
cleaned['tags'] = [str(tag)[:64] for tag in cleaned['tags']]
cleaned['helper_text'] = _normalize_helper_text(
@@ -287,7 +309,7 @@ def _base_template_from_payload(payload: Dict[str, Any], user_info: Optional[Dic
tags = payload.get('tags') or []
tags = [str(tag)[:64] for tag in tags]
- actions = [str(action) for action in (payload.get('actions_to_load') or []) if action]
+ actions = _normalize_actions_to_load(payload.get('actions_to_load'), strict=True)
template = {
'id': payload.get('id') or str(uuid.uuid4()),
@@ -366,6 +388,11 @@ def update_agent_template(template_id: str, updates: Dict[str, Any]) -> Optional
else:
payload['additional_settings'] = _parse_additional_settings(doc.get('additional_settings'))
+ if 'actions_to_load' in payload:
+ payload['actions_to_load'] = _normalize_actions_to_load(payload['actions_to_load'], strict=True)
+ else:
+ payload['actions_to_load'] = _normalize_actions_to_load(doc.get('actions_to_load'))
+
if 'tags' in payload:
payload['tags'] = [str(tag)[:64] for tag in payload['tags']]
diff --git a/application/single_app/functions_appinsights.py b/application/single_app/functions_appinsights.py
index c81d17f94..14c013cef 100644
--- a/application/single_app/functions_appinsights.py
+++ b/application/single_app/functions_appinsights.py
@@ -75,18 +75,60 @@ def is_debug_enabled() -> bool:
return bool(settings.get('enable_debug_logging', False))
+def _get_appinsights_debug_logger() -> Optional[logging.Logger]:
+ """Return a logger that can emit DEBUG traces without widening parent logger levels."""
+ base_logger = get_appinsights_logger()
+ if not base_logger:
+ return None
+
+ base_name = base_logger.name or 'root'
+ debug_logger_name = 'appinsights.debug' if base_name == 'root' else f"{base_name}.debug"
+ debug_logger = logging.getLogger(debug_logger_name)
+ debug_logger.setLevel(logging.DEBUG)
+ return debug_logger
+
+
+def _emit_appinsights_debug_trace(
+ message: str,
+ category: str,
+ details: Optional[Dict[str, Any]] = None,
+) -> None:
+ """Send a tagged debug trace to App Insights when Azure Monitor logging is configured."""
+ if not _azure_monitor_configured:
+ return
+
+ debug_logger = _get_appinsights_debug_logger()
+ if not debug_logger:
+ return
+
+ trace_properties = dict(details or {})
+ trace_properties.setdefault('debug_tag', '[debug]')
+ trace_properties.setdefault('debug_category', category)
+ trace_message = f"[debug] [{category}] {message}"
+
+ try:
+ # Use a child logger so DEBUG traces can flow to App Insights even when the
+ # parent logger stays at INFO to avoid broad third-party debug noise.
+ if trace_properties:
+ debug_logger.debug(trace_message, extra=trace_properties, stacklevel=3)
+ else:
+ debug_logger.debug(trace_message, stacklevel=3)
+ except Exception:
+ pass
+
+
def debug_print(message: Any, *args: Any, category: str = "INFO", **kwargs: Any) -> None:
- """Emit a debug-only console message using the unified logging implementation."""
+ """Emit debug-only console output and forward a tagged App Insights trace when available."""
flush = kwargs.pop('flush', False)
details = kwargs or None
- log_event(
- message,
- extra=details,
- debug_only=True,
- category=category,
- flush=flush,
- message_args=args,
- )
+ formatted_message = _format_message(message, args)
+ settings = _load_logging_settings()
+
+ _emit_debug_message(settings, formatted_message, category, flush, details)
+ if not settings.get('enable_debug_logging', False):
+ return
+
+ _emit_appinsights_debug_trace(formatted_message, category, details)
def get_appinsights_logger():
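
Review note: the child-logger approach in `_get_appinsights_debug_logger` relies on a specific `logging` behavior: a propagated record is filtered by the *child's* effective level and the handlers' levels, not by the parent logger's level. A minimal sketch (the demo logger names are made up):

```python
import io
import logging

# Parent stays at INFO, so its own DEBUG calls (and noisy third-party
# DEBUG that shares this level) are dropped.
parent = logging.getLogger("appinsights_demo")
parent.setLevel(logging.INFO)
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setLevel(logging.DEBUG)  # the handler must still accept DEBUG records
parent.addHandler(handler)

# A dedicated child opened at DEBUG; its records propagate to the
# parent's handler regardless of the parent logger's level.
debug_child = logging.getLogger("appinsights_demo.debug")
debug_child.setLevel(logging.DEBUG)

parent.debug("dropped: parent is INFO")
debug_child.debug("[debug] [INFO] kept: child is DEBUG")

print(stream.getvalue().strip())  # [debug] [INFO] kept: child is DEBUG
```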
diff --git a/application/single_app/functions_approvals.py b/application/single_app/functions_approvals.py
index a6c733467..b755d5c2c 100644
--- a/application/single_app/functions_approvals.py
+++ b/application/single_app/functions_approvals.py
@@ -277,7 +277,8 @@ def approve_request(
approver_id: str,
approver_email: str,
approver_name: str,
- comment: Optional[str] = None
+ comment: Optional[str] = None,
+ approval: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
"""
Approve an approval request.
@@ -295,10 +296,11 @@ def approve_request(
"""
try:
# Get the approval request
- approval = cosmos_approvals_container.read_item(
- item=approval_id,
- partition_key=group_id
- )
+ if approval is None:
+ approval = cosmos_approvals_container.read_item(
+ item=approval_id,
+ partition_key=group_id
+ )
# Validate status
if approval['status'] != STATUS_PENDING:
@@ -368,7 +370,8 @@ def deny_request(
denier_email: str,
denier_name: str,
comment: str,
- auto_denied: bool = False
+ auto_denied: bool = False,
+ approval: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
"""
Deny an approval request.
@@ -387,10 +390,11 @@ def deny_request(
"""
try:
# Get the approval request
- approval = cosmos_approvals_container.read_item(
- item=approval_id,
- partition_key=group_id
- )
+ if approval is None:
+ approval = cosmos_approvals_container.read_item(
+ item=approval_id,
+ partition_key=group_id
+ )
# Validate status (allow denying pending requests)
if approval['status'] not in [STATUS_PENDING]:
@@ -543,6 +547,29 @@ def get_approval_by_id(approval_id: str, group_id: str) -> Optional[Dict[str, An
return None
+def get_authorized_approval(
+ approval_id: str,
+ group_id: str,
+ user_id: str,
+ user_roles: List[str],
+ require_approval_rights: bool = False,
+) -> Dict[str, Any]:
+ """Return an approval only if the current user is allowed to view or approve it."""
+ approval = get_approval_by_id(approval_id, group_id)
+ if not approval:
+ raise LookupError("Approval not found")
+
+ is_authorized = (
+ _can_user_approve(approval, user_id, user_roles)
+ if require_approval_rights
+ else _can_user_view(approval, user_id, user_roles)
+ )
+ if not is_authorized:
+ raise PermissionError("You are not authorized to access this approval")
+
+ return approval
+
+
def auto_deny_expired_approvals() -> int:
"""
Auto-deny approval requests that have expired (older than 3 days).
diff --git a/application/single_app/functions_documents.py b/application/single_app/functions_documents.py
index 7c6e4a272..362729ffd 100644
--- a/application/single_app/functions_documents.py
+++ b/application/single_app/functions_documents.py
@@ -1,5 +1,6 @@
# functions_documents.py that has some changes I need to merge into Development
+import re
import traceback
from config import *
from functions_content import *
@@ -20,6 +21,7 @@ def allowed_file(filename, allowed_extensions=None):
ARCHIVED_SCOPE_PREFIX = "__archived__::"
CURRENT_ALIAS_BLOB_PATH_MODE = "current_alias"
ARCHIVED_REVISION_BLOB_PATH_MODE = "archived_revision"
+TAG_COLOR_PATTERN = re.compile(r'^#?(?:[0-9a-fA-F]{3}|[0-9a-fA-F]{6})$')
def _get_blob_container_name(group_id=None, public_workspace_id=None):
@@ -7566,6 +7568,58 @@ def sanitize_tags_for_filter(raw_tags):
return valid_tags
+def normalize_tag_color(color):
+ """
+ Normalize a tag color to a canonical 6-digit lowercase hex code.
+ Returns None for invalid values.
+ """
+ if not isinstance(color, str):
+ return None
+
+ normalized_color = color.strip()
+ if not normalized_color:
+ return None
+
+ if not TAG_COLOR_PATTERN.fullmatch(normalized_color):
+ return None
+
+ if not normalized_color.startswith('#'):
+ normalized_color = f'#{normalized_color}'
+
+ if len(normalized_color) == 4:
+ normalized_color = '#' + ''.join(component * 2 for component in normalized_color[1:])
+
+ return normalized_color.lower()
+
+
+def get_safe_tag_color(color, tag_name):
+ """
+ Return a normalized tag color or the deterministic default for the tag.
+ """
+ normalized_color = normalize_tag_color(color)
+ if normalized_color:
+ return normalized_color
+
+ safe_tag_name = normalize_tag(tag_name) or str(tag_name or '')
+ return get_default_tag_color(safe_tag_name)
+
+
+def validate_tag_color(color, tag_name):
+ """
+ Validate a requested tag color.
+ Returns (is_valid, error_message, normalized_color).
+ Missing colors resolve to the deterministic default for the tag.
+ """
+ if color is None:
+ return True, None, get_safe_tag_color(None, tag_name)
+
+ normalized_color = normalize_tag_color(color)
+ if not normalized_color:
+ return False, 'Tag color must be a valid 3- or 6-digit hex color', None
+
+ return True, None, normalized_color
+
+
def get_workspace_tags(user_id, group_id=None, public_workspace_id=None):
"""
Get all unique tags used in a workspace with document counts.
@@ -7662,7 +7716,7 @@ def get_workspace_tags(user_id, group_id=None, public_workspace_id=None):
results.append({
'name': tag_name,
'count': count,
- 'color': tag_def.get('color', get_default_tag_color(tag_name))
+ 'color': get_safe_tag_color(tag_def.get('color'), tag_name)
})
# Add defined tags that haven't been used yet (count = 0)
@@ -7671,7 +7725,7 @@ def get_workspace_tags(user_id, group_id=None, public_workspace_id=None):
results.append({
'name': tag_name,
'count': 0,
- 'color': tag_def.get('color', get_default_tag_color(tag_name))
+ 'color': get_safe_tag_color(tag_def.get('color'), tag_name)
})
# Sort by count descending, then name ascending
@@ -7728,34 +7782,40 @@ def get_or_create_tag_definition(user_id, tag_name, workspace_type='personal', c
"""
from datetime import datetime, timezone
+ safe_color = get_safe_tag_color(color, tag_name)
+
if workspace_type == 'group' and group_id:
from functions_group import find_group_by_id
group_doc = find_group_by_id(group_id)
if not group_doc:
- return {'color': color or get_default_tag_color(tag_name)}
+ return {'color': safe_color}
tag_defs = group_doc.get('tag_definitions', {})
if tag_name not in tag_defs:
tag_defs[tag_name] = {
- 'color': color if color else get_default_tag_color(tag_name),
+ 'color': safe_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
group_doc['tag_definitions'] = tag_defs
cosmos_groups_container.upsert_item(group_doc)
- return tag_defs[tag_name]
+ stored_tag_def = dict(tag_defs[tag_name])
+ stored_tag_def['color'] = get_safe_tag_color(stored_tag_def.get('color'), tag_name)
+ return stored_tag_def
elif workspace_type == 'public' and public_workspace_id:
from functions_public_workspaces import find_public_workspace_by_id
ws_doc = find_public_workspace_by_id(public_workspace_id)
if not ws_doc:
- return {'color': color or get_default_tag_color(tag_name)}
+ return {'color': safe_color}
tag_defs = ws_doc.get('tag_definitions', {})
if tag_name not in tag_defs:
tag_defs[tag_name] = {
- 'color': color if color else get_default_tag_color(tag_name),
+ 'color': safe_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
ws_doc['tag_definitions'] = tag_defs
cosmos_public_workspaces_container.upsert_item(ws_doc)
- return tag_defs[tag_name]
+ stored_tag_def = dict(tag_defs[tag_name])
+ stored_tag_def['color'] = get_safe_tag_color(stored_tag_def.get('color'), tag_name)
+ return stored_tag_def
else:
# Personal: store in user settings
from functions_settings import get_user_settings, update_user_settings
@@ -7771,12 +7831,14 @@ def get_or_create_tag_definition(user_id, tag_name, workspace_type='personal', c
if tag_name not in workspace_tags:
workspace_tags[tag_name] = {
- 'color': color if color else get_default_tag_color(tag_name),
+ 'color': safe_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
update_user_settings(user_id, {'tag_definitions': tag_definitions})
- return workspace_tags[tag_name]
+ stored_tag_def = dict(workspace_tags[tag_name])
+ stored_tag_def['color'] = get_safe_tag_color(stored_tag_def.get('color'), tag_name)
+ return stored_tag_def
def propagate_tags_to_blob_metadata(document_id, tags, user_id, group_id=None, public_workspace_id=None):
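
Review note: the color-normalization contract in the hunks above, restated standalone: the canonical output is a 6-digit lowercase `#rrggbb`, and anything else comes back as `None` so callers fall back to the deterministic default color.

```python
import re

TAG_COLOR_PATTERN = re.compile(r'^#?(?:[0-9a-fA-F]{3}|[0-9a-fA-F]{6})$')


def normalize_tag_color(color):
    # Mirrors the helper added in functions_documents.py.
    if not isinstance(color, str):
        return None

    normalized = color.strip()
    if not normalized or not TAG_COLOR_PATTERN.fullmatch(normalized):
        return None

    if not normalized.startswith('#'):
        normalized = f'#{normalized}'

    if len(normalized) == 4:  # expand "#abc" to "#aabbcc"
        normalized = '#' + ''.join(component * 2 for component in normalized[1:])

    return normalized.lower()


print(normalize_tag_color('ABC'))      # '#aabbcc'
print(normalize_tag_color('#00FF00'))  # '#00ff00'
print(normalize_tag_color('red'))      # None
```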
diff --git a/application/single_app/functions_group.py b/application/single_app/functions_group.py
index e50b09f60..2c60fab58 100644
--- a/application/single_app/functions_group.py
+++ b/application/single_app/functions_group.py
@@ -1,6 +1,8 @@
# functions_group.py
from config import *
+import functions_authentication
+import functions_settings
from functions_authentication import *
from functions_settings import *
from typing import Iterable
@@ -103,12 +105,20 @@ def find_group_by_id(group_id):
except exceptions.CosmosResourceNotFoundError:
return None
-def update_active_group_for_user(group_id):
- user_id = get_current_user_id()
+def update_active_group_for_user(group_id, user_id=None):
+ if not user_id:
+ user_id = functions_authentication.get_current_user_id()
+
+ assert_group_role(
+ user_id,
+ group_id,
+ allowed_roles=("Owner", "Admin", "DocumentManager", "User"),
+ )
+
new_settings = {
"activeGroupOid": group_id
}
- update_user_settings(user_id, new_settings)
+ functions_settings.update_user_settings(user_id, new_settings)
def get_user_role_in_group(group_doc, user_id):
"""Determine the user's role in the given group doc."""
@@ -129,12 +139,17 @@ def get_user_role_in_group(group_doc, user_id):
return None
-def require_active_group(user_id: str) -> str:
- """Return the active group id for a user or raise ValueError if missing."""
- settings = get_user_settings(user_id)
+def require_active_group(
+ user_id: str,
+ allowed_roles: Iterable[str] = ("Owner", "Admin", "DocumentManager", "User"),
+) -> str:
+ """Return the active group id for a user after validating current membership."""
+ settings = functions_settings.get_user_settings(user_id)
active_group_id = settings.get("settings", {}).get("activeGroupOid")
if not active_group_id:
raise ValueError("No active group selected")
+
+ assert_group_role(user_id, active_group_id, allowed_roles=allowed_roles)
return active_group_id
diff --git a/application/single_app/functions_keyvault.py b/application/single_app/functions_keyvault.py
index a523eeaa9..c40977507 100644
--- a/application/single_app/functions_keyvault.py
+++ b/application/single_app/functions_keyvault.py
@@ -60,6 +60,109 @@ class SecretReturnType(Enum):
NAME = "name"
+def _normalize_allowed_sources(allowed_sources):
+ """Normalize one or many allowed sources into a comparable set."""
+ if allowed_sources is None:
+ return None
+ if isinstance(allowed_sources, str):
+ return {allowed_sources}
+ return {
+ str(source).strip()
+ for source in allowed_sources
+ if str(source).strip()
+ }
+
+
+def parse_secret_name_dynamic(secret_name):
+ """Return parsed Key Vault secret reference parts when the name is valid."""
+ scopes_pattern = '|'.join(re.escape(scope) for scope in supported_scopes)
+ sources_pattern = '|'.join(re.escape(source) for source in supported_sources)
+ pattern = (
+ rf"^(?P.+?)--(?P{sources_pattern})--"
+ rf"(?P{scopes_pattern})--(?P.+)$"
+ )
+ match = re.match(pattern, secret_name)
+ if not match:
+ return None
+ if len(secret_name) > 127:
+ return None
+ return match.groupdict()
+
+
+def secret_reference_matches_context(secret_name, scope_value=None, scope=None, allowed_sources=None):
+ """Return True when a secret reference belongs to the expected scope and source."""
+ parsed = parse_secret_name_dynamic(secret_name)
+ if not parsed:
+ return False
+
+ normalized_sources = _normalize_allowed_sources(allowed_sources)
+ expected_scope_value = None
+ if scope_value is not None:
+ expected_scope_value = clean_name_for_keyvault(str(scope_value))
+
+ if expected_scope_value is not None and parsed["scope_value"] != expected_scope_value:
+ return False
+ if scope is not None and parsed["scope"] != scope:
+ return False
+ if normalized_sources is not None and parsed["source"] not in normalized_sources:
+ return False
+ return True
+
+
+def _log_secret_reference_context_mismatch(secret_name, context_label, scope_value=None, scope=None, allowed_sources=None):
+ """Emit a warning when a stored secret reference does not match its expected context."""
+ parsed = parse_secret_name_dynamic(secret_name) or {}
+ expected_scope_value = None
+ if scope_value is not None:
+ expected_scope_value = clean_name_for_keyvault(str(scope_value))
+
+ log_event(
+ f"[KeyVault] Rejected mismatched secret reference for {context_label}.",
+ extra={
+ "context_label": context_label,
+ "expected_scope_value": expected_scope_value,
+ "expected_scope": scope,
+ "allowed_sources": sorted(_normalize_allowed_sources(allowed_sources) or []),
+ "provided_scope_value": parsed.get("scope_value"),
+ "provided_scope": parsed.get("scope"),
+ "provided_source": parsed.get("source"),
+ },
+ level=logging.WARNING,
+ )
+
+
+def resolve_secret_reference_for_context(
+ secret_name,
+ scope_value=None,
+ scope=None,
+ allowed_sources=None,
+ context_label="secret reference",
+):
+ """Resolve a Key Vault reference only when it matches the expected context."""
+ if not validate_secret_name_dynamic(secret_name):
+ return secret_name
+
+ if not secret_reference_matches_context(
+ secret_name,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources=allowed_sources,
+ ):
+ _log_secret_reference_context_mismatch(
+ secret_name,
+ context_label,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources=allowed_sources,
+ )
+ raise ValueError(f"Stored Key Vault reference for {context_label} does not match the expected scope.")
+
+ resolved_value = retrieve_secret_from_key_vault_by_full_name(secret_name)
+ if validate_secret_name_dynamic(resolved_value):
+ raise ValueError(f"Unable to resolve stored Key Vault secret for {context_label}.")
+ return resolved_value
+
+
def _get_nested_dict_value(data, path):
"""Return a nested dictionary value, or None when the path is missing."""
current = data
@@ -119,10 +222,28 @@ def _store_plugin_secret_reference(updated_plugin, existing_plugin, path, secret
if not value:
return
+ path_label = ".".join(path)
+
existing_reference = _get_existing_secret_reference(existing_plugin, path)
if value == ui_trigger_word:
if existing_reference:
+ if not secret_reference_matches_context(
+ existing_reference,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={source},
+ ):
+ _log_secret_reference_context_mismatch(
+ existing_reference,
+ f"plugin field '{path_label}' existing reference",
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={source},
+ )
+ raise ValueError(
+ f"Stored Key Vault reference for '{path_label}' no longer matches the expected scope. Re-enter the secret value."
+ )
_set_nested_dict_value(updated_plugin, path, existing_reference)
return
_set_nested_dict_value(
@@ -133,6 +254,22 @@ def _store_plugin_secret_reference(updated_plugin, existing_plugin, path, secret
return
if validate_secret_name_dynamic(value):
+ if not secret_reference_matches_context(
+ value,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={source},
+ ):
+ _log_secret_reference_context_mismatch(
+ value,
+ f"plugin field '{path_label}'",
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={source},
+ )
+ raise ValueError(
+ f"Stored Key Vault reference for '{path_label}' does not match the expected scope."
+ )
_set_nested_dict_value(updated_plugin, path, value)
return
@@ -377,18 +514,7 @@ def validate_secret_name_dynamic(secret_name):
Returns:
bool: True if valid, False otherwise.
"""
- # Build regex pattern dynamically
- scopes_pattern = '|'.join(re.escape(scope) for scope in supported_scopes)
- sources_pattern = '|'.join(re.escape(source) for source in supported_sources)
- # Wildcards for secret_name and scope_value
- pattern = rf"^(.+)--({sources_pattern})--({scopes_pattern})--(.+)$"
- match = re.match(pattern, secret_name)
- if not match:
- return False
- # Optionally, check length
- if len(secret_name) > 127:
- return False
- return True
+ return parse_secret_name_dynamic(secret_name) is not None
def keyvault_agent_save_helper(agent_dict, scope_value, scope="global"):
"""
@@ -616,8 +742,14 @@ def keyvault_plugin_get_helper(plugin_dict, scope_value, scope="global", return_
         value = auth.get(auth_field)
         if value and validate_secret_name_dynamic(value):
             try:
if return_type == SecretReturnType.VALUE:
- new_auth[auth_field] = retrieve_secret_from_key_vault_by_full_name(value)
+ new_auth[auth_field] = resolve_secret_reference_for_context(
+ value,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action"},
+ context_label=f"action auth field '{auth_field}'",
+ )
elif return_type == SecretReturnType.NAME:
new_auth[auth_field] = value
else:
@@ -635,8 +773,14 @@ def keyvault_plugin_get_helper(plugin_dict, scope_value, scope="global", return_
         for k, v in additional_fields.items():
             if (k.endswith('__Secret') or _is_sql_sensitive_additional_field(updated, k)) and v and validate_secret_name_dynamic(v):
                 try:
if return_type == SecretReturnType.VALUE:
- new_additional_fields[k] = retrieve_secret_from_key_vault_by_full_name(v)
+ new_additional_fields[k] = resolve_secret_reference_for_context(
+ v,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action-addset"},
+ context_label=f"action additional field '{k}'",
+ )
elif return_type == SecretReturnType.NAME:
new_additional_fields[k] = v
else:
@@ -834,6 +984,20 @@ def keyvault_plugin_delete_helper(plugin_dict, scope_value, scope="global"):
for auth_field in ('key', *SQL_PLUGIN_SENSITIVE_AUTH_FIELDS):
secret_name = auth.get(auth_field)
if secret_name and validate_secret_name_dynamic(secret_name):
+ if not secret_reference_matches_context(
+ secret_name,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action"},
+ ):
+ _log_secret_reference_context_mismatch(
+ secret_name,
+ f"action auth field '{auth_field}' deletion",
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action"},
+ )
+ continue
try:
key_vault_url = f"https://{key_vault_name}{KEY_VAULT_DOMAIN}"
log_event(f"Deleting action auth secret '{auth_field}' for action '{plugin_name}' for '{scope}' '{scope_value}'", level=logging.INFO)
@@ -847,6 +1011,20 @@ def keyvault_plugin_delete_helper(plugin_dict, scope_value, scope="global"):
if isinstance(additional_fields, dict):
for k, v in additional_fields.items():
if (k.endswith('__Secret') or _is_sql_sensitive_additional_field(plugin_dict, k)) and v and validate_secret_name_dynamic(v):
+ if not secret_reference_matches_context(
+ v,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action-addset"},
+ ):
+ _log_secret_reference_context_mismatch(
+ v,
+ f"action additional field '{k}' deletion",
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action-addset"},
+ )
+ continue
try:
key_vault_url = f"https://{key_vault_name}{KEY_VAULT_DOMAIN}"
log_event(f"Deleting action additionalField secret '{k}' for action '{plugin_name}' for '{scope}' '{scope_value}'", level=logging.INFO)
diff --git a/application/single_app/functions_public_workspaces.py b/application/single_app/functions_public_workspaces.py
index 45e5f80e6..8039845f4 100644
--- a/application/single_app/functions_public_workspaces.py
+++ b/application/single_app/functions_public_workspaces.py
@@ -1,8 +1,10 @@
# functions_public_workspaces.py
from config import *
+import functions_settings
from functions_authentication import *
from functions_group import *
+from typing import Iterable
def create_public_workspace(name: str, description: str) -> dict:
"""
@@ -114,13 +116,57 @@ def get_user_role_in_public_workspace(ws_doc: dict, user_id: str) -> str | None:
return None
if ws_doc.get("owner", {}).get("userId") == user_id:
return "Owner"
- if user_id in ws_doc.get("admins", []):
- return "Admin"
+ for admin in ws_doc.get("admins", []):
+ if isinstance(admin, str) and admin == user_id:
+ return "Admin"
+ if isinstance(admin, dict) and admin.get("userId") == user_id:
+ return "Admin"
if any(dm["userId"] == user_id for dm in ws_doc.get("documentManagers", [])):
return "DocumentManager"
return None
+def build_public_workspace_public_summary(ws_doc: dict) -> dict:
+ """Return the non-sensitive workspace fields safe for any authenticated caller."""
+ owner = ws_doc.get("owner", {}) or {}
+ return {
+ "id": ws_doc.get("id", ""),
+ "name": ws_doc.get("name", ""),
+ "description": ws_doc.get("description", ""),
+ "owner": {
+ "displayName": owner.get("displayName", ""),
+ },
+ "status": ws_doc.get("status", "active"),
+ "heroColor": ws_doc.get("heroColor", "#0078d4"),
+ "userRole": None,
+ "isMember": False,
+ }
+
+
+def build_public_workspace_member_payload(ws_doc: dict, user_id: str) -> dict:
+ """Return the workspace fields required by member-facing workspace pages."""
+ role = get_user_role_in_public_workspace(ws_doc, user_id)
+ owner = ws_doc.get("owner", {}) or {}
+ payload = {
+ "id": ws_doc.get("id", ""),
+ "name": ws_doc.get("name", ""),
+ "description": ws_doc.get("description", ""),
+ "owner": {
+ "displayName": owner.get("displayName", ""),
+ "email": owner.get("email", ""),
+ },
+ "status": ws_doc.get("status", "active"),
+ "heroColor": ws_doc.get("heroColor", "#0078d4"),
+ "userRole": role,
+ "isMember": bool(role),
+ }
+
+ if role in ("Owner", "Admin") and "retention_policy" in ws_doc:
+ payload["retention_policy"] = ws_doc.get("retention_policy")
+
+ return payload
+
+
def is_user_in_public_workspace(ws_doc: dict, user_id: str) -> bool:
"""
Check if a user has any role in the workspace.
@@ -224,9 +270,46 @@ def count_public_workspace_documents(ws_id: str) -> int:
def update_active_public_workspace_for_user(user_id: str, ws_id: str) -> None:
"""
- Persist the user's activePublicWorkspaceOid in their settings.
+ Persist the user's activePublicWorkspaceOid after validating the workspace.
"""
- update_user_settings(user_id, {"activePublicWorkspaceOid": ws_id})
+ normalized_workspace_id = str(ws_id or "").strip()
+ if not normalized_workspace_id:
+ functions_settings.update_user_settings(user_id, {"activePublicWorkspaceOid": ""})
+ return
+
+ workspace_doc = find_public_workspace_by_id(normalized_workspace_id)
+ if not workspace_doc:
+ raise LookupError("Workspace not found")
+
+ functions_settings.update_user_settings(
+ user_id,
+ {"activePublicWorkspaceOid": normalized_workspace_id},
+ )
+
+
+def require_active_public_workspace(
+ user_id: str,
+ allowed_roles: Iterable[str] = ("Owner", "Admin", "DocumentManager"),
+) -> tuple[str, dict, str]:
+ """Return the active public workspace after validating it still exists and the user can access it."""
+ settings = functions_settings.get_user_settings(user_id)
+ active_workspace_id = str(settings.get("settings", {}).get("activePublicWorkspaceOid") or "").strip()
+ if not active_workspace_id:
+ raise ValueError("No active public workspace selected")
+
+ workspace_doc = find_public_workspace_by_id(active_workspace_id)
+ if not workspace_doc:
+ raise LookupError("Active public workspace not found")
+
+ role = get_user_role_in_public_workspace(workspace_doc, user_id)
+ if not role:
+ raise PermissionError("Access denied")
+
+ allowed = {allowed_role.lower() for allowed_role in allowed_roles}
+ if role.lower() not in allowed:
+ raise PermissionError("Access denied")
+
+ return active_workspace_id, workspace_doc, role
def get_user_visible_public_workspaces(user_id: str) -> list:
diff --git a/application/single_app/functions_search.py b/application/single_app/functions_search.py
index 6851778f0..0859cf064 100644
--- a/application/single_app/functions_search.py
+++ b/application/single_app/functions_search.py
@@ -94,6 +94,22 @@ def build_tags_filter(tags_filter):
tag_conditions = [f"document_tags/any(t: t eq '{tag}')" for tag in safe_tags]
return " and ".join(tag_conditions)
+
+def _escape_odata_literal(value: Any) -> str:
+ """Escape a value for safe inclusion inside an OData single-quoted literal."""
+ return str(value or "").replace("'", "''")
+
+
+def _build_odata_eq(field_name: str, value: Any) -> str:
+ """Build a simple equality clause with an escaped OData literal."""
+ return f"{field_name} eq '{_escape_odata_literal(value)}'"
+
+
+def _build_odata_any_eq(collection_field: str, iterator_name: str, value: Any) -> str:
+ """Build an OData any(...) equality clause with an escaped literal."""
+ escaped_value = _escape_odata_literal(value)
+ return f"{collection_field}/any({iterator_name}: {iterator_name} eq '{escaped_value}')"
+
def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12, doc_scope="all", active_group_id=None, active_group_ids=None, active_public_workspace_id=None, enable_file_sharing=True, tags_filter=None):
"""
Hybrid search that queries the user doc index, group doc index, or public doc index
@@ -155,9 +171,9 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
doc_id_filter = None
if document_ids and len(document_ids) > 0:
if len(document_ids) == 1:
- doc_id_filter = f"document_id eq '{document_ids[0]}'"
+ doc_id_filter = _build_odata_eq("document_id", document_ids[0])
else:
- conditions = " or ".join([f"document_id eq '{did}'" for did in document_ids])
+ conditions = " or ".join([_build_odata_eq("document_id", did) for did in document_ids])
doc_id_filter = f"({conditions})"
# Generate cache key including document set fingerprints and tags filter
@@ -237,9 +253,9 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Build user filter with optional tags
user_base_filter = (
(
- f"(user_id eq '{user_id}' or shared_user_ids/any(u: u eq '{user_id},approved')) "
+ f"({_build_odata_eq('user_id', user_id)} or {_build_odata_any_eq('shared_user_ids', 'u', f'{user_id},approved')}) "
if enable_file_sharing else
- f"user_id eq '{user_id}' "
+ f"{_build_odata_eq('user_id', user_id)} "
) +
f"and {doc_id_filter}"
)
@@ -258,8 +274,11 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Only search group index if active_group_ids is provided
if active_group_ids:
- group_conditions = " or ".join([f"group_id eq '{gid}'" for gid in active_group_ids])
- shared_conditions = " or ".join([f"shared_group_ids/any(g: g eq '{gid},approved')" for gid in active_group_ids])
+ group_conditions = " or ".join([_build_odata_eq("group_id", gid) for gid in active_group_ids])
+ shared_conditions = " or ".join([
+ _build_odata_any_eq("shared_group_ids", "g", f"{gid},approved")
+ for gid in active_group_ids
+ ])
group_base_filter = f"({group_conditions} or {shared_conditions}) and {doc_id_filter}"
group_filter = f"{group_base_filter} and {tags_filter_clause}" if tags_filter_clause else group_base_filter
@@ -282,11 +301,14 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Create filter for visible public workspaces
if visible_public_workspace_ids:
# Use 'or' conditions instead of 'in' operator for OData compatibility
- workspace_conditions = " or ".join([f"public_workspace_id eq '{id}'" for id in visible_public_workspace_ids])
+ workspace_conditions = " or ".join([
+ _build_odata_eq("public_workspace_id", workspace_id)
+ for workspace_id in visible_public_workspace_ids
+ ])
public_base_filter = f"({workspace_conditions}) and {doc_id_filter}"
else:
# Fallback to active_public_workspace_id if no visible workspaces
- public_base_filter = f"public_workspace_id eq '{active_public_workspace_id}' and {doc_id_filter}"
+ public_base_filter = f"{_build_odata_eq('public_workspace_id', active_public_workspace_id)} and {doc_id_filter}"
public_filter = f"{public_base_filter} and {tags_filter_clause}" if tags_filter_clause else public_base_filter
@@ -303,9 +325,9 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
else:
# Build user filter with optional tags
user_base_filter = (
- f"(user_id eq '{user_id}' or shared_user_ids/any(u: u eq '{user_id},approved')) "
+ f"({_build_odata_eq('user_id', user_id)} or {_build_odata_any_eq('shared_user_ids', 'u', f'{user_id},approved')}) "
if enable_file_sharing else
- f"user_id eq '{user_id}' "
+ f"{_build_odata_eq('user_id', user_id)} "
)
user_filter = f"{user_base_filter} and {tags_filter_clause}" if tags_filter_clause else user_base_filter.strip()
@@ -322,8 +344,11 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Only search group index if active_group_ids is provided
if active_group_ids:
- group_conditions = " or ".join([f"group_id eq '{gid}'" for gid in active_group_ids])
- shared_conditions = " or ".join([f"shared_group_ids/any(g: g eq '{gid},approved')" for gid in active_group_ids])
+ group_conditions = " or ".join([_build_odata_eq("group_id", gid) for gid in active_group_ids])
+ shared_conditions = " or ".join([
+ _build_odata_any_eq("shared_group_ids", "g", f"{gid},approved")
+ for gid in active_group_ids
+ ])
group_base_filter = f"({group_conditions} or {shared_conditions})"
group_filter = f"{group_base_filter} and {tags_filter_clause}" if tags_filter_clause else group_base_filter
@@ -346,11 +371,14 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Create filter for visible public workspaces
if visible_public_workspace_ids:
# Use 'or' conditions instead of 'in' operator for OData compatibility
- workspace_conditions = " or ".join([f"public_workspace_id eq '{id}'" for id in visible_public_workspace_ids])
+ workspace_conditions = " or ".join([
+ _build_odata_eq("public_workspace_id", workspace_id)
+ for workspace_id in visible_public_workspace_ids
+ ])
public_base_filter = f"({workspace_conditions})"
else:
# Fallback to active_public_workspace_id if no visible workspaces
- public_base_filter = f"public_workspace_id eq '{active_public_workspace_id}'"
+ public_base_filter = _build_odata_eq("public_workspace_id", active_public_workspace_id)
public_filter = f"{public_base_filter} and {tags_filter_clause}" if tags_filter_clause else public_base_filter
@@ -396,9 +424,9 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
if doc_id_filter:
user_base_filter = (
(
- f"(user_id eq '{user_id}' or shared_user_ids/any(u: u eq '{user_id},approved')) "
+ f"({_build_odata_eq('user_id', user_id)} or {_build_odata_any_eq('shared_user_ids', 'u', f'{user_id},approved')}) "
if enable_file_sharing else
- f"user_id eq '{user_id}' "
+ f"{_build_odata_eq('user_id', user_id)} "
) +
f"and {doc_id_filter}"
)
@@ -417,9 +445,9 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
results = extract_search_results(user_results, top_n)
else:
user_base_filter = (
- f"(user_id eq '{user_id}' or shared_user_ids/any(u: u eq '{user_id},approved')) "
+ f"({_build_odata_eq('user_id', user_id)} or {_build_odata_any_eq('shared_user_ids', 'u', f'{user_id},approved')}) "
if enable_file_sharing else
- f"user_id eq '{user_id}' "
+ f"{_build_odata_eq('user_id', user_id)} "
)
user_filter = f"{user_base_filter} and {tags_filter_clause}" if tags_filter_clause else user_base_filter.strip()
@@ -439,8 +467,11 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
if not active_group_ids:
results = []
elif doc_id_filter:
- group_conditions = " or ".join([f"group_id eq '{gid}'" for gid in active_group_ids])
- shared_conditions = " or ".join([f"shared_group_ids/any(g: g eq '{gid},approved')" for gid in active_group_ids])
+ group_conditions = " or ".join([_build_odata_eq("group_id", gid) for gid in active_group_ids])
+ shared_conditions = " or ".join([
+ _build_odata_any_eq("shared_group_ids", "g", f"{gid},approved")
+ for gid in active_group_ids
+ ])
group_base_filter = f"({group_conditions} or {shared_conditions}) and {doc_id_filter}"
group_filter = f"{group_base_filter} and {tags_filter_clause}" if tags_filter_clause else group_base_filter
@@ -456,8 +487,11 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
)
results = extract_search_results(group_results, top_n)
else:
- group_conditions = " or ".join([f"group_id eq '{gid}'" for gid in active_group_ids])
- shared_conditions = " or ".join([f"shared_group_ids/any(g: g eq '{gid},approved')" for gid in active_group_ids])
+ group_conditions = " or ".join([_build_odata_eq("group_id", gid) for gid in active_group_ids])
+ shared_conditions = " or ".join([
+ _build_odata_any_eq("shared_group_ids", "g", f"{gid},approved")
+ for gid in active_group_ids
+ ])
group_base_filter = f"({group_conditions} or {shared_conditions})"
group_filter = f"{group_base_filter} and {tags_filter_clause}" if tags_filter_clause else group_base_filter
@@ -481,11 +515,14 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Create filter for visible public workspaces
if visible_public_workspace_ids:
# Use 'or' conditions instead of 'in' operator for OData compatibility
- workspace_conditions = " or ".join([f"public_workspace_id eq '{id}'" for id in visible_public_workspace_ids])
+ workspace_conditions = " or ".join([
+ _build_odata_eq("public_workspace_id", workspace_id)
+ for workspace_id in visible_public_workspace_ids
+ ])
public_base_filter = f"({workspace_conditions}) and {doc_id_filter}"
else:
# Fallback to active_public_workspace_id if no visible workspaces
- public_base_filter = f"public_workspace_id eq '{active_public_workspace_id}' and {doc_id_filter}"
+ public_base_filter = f"{_build_odata_eq('public_workspace_id', active_public_workspace_id)} and {doc_id_filter}"
public_filter = f"{public_base_filter} and {tags_filter_clause}" if tags_filter_clause else public_base_filter
@@ -507,11 +544,14 @@ def hybrid_search(query, user_id, document_id=None, document_ids=None, top_n=12,
# Create filter for visible public workspaces
if visible_public_workspace_ids:
# Use 'or' conditions instead of 'in' operator for OData compatibility
- workspace_conditions = " or ".join([f"public_workspace_id eq '{id}'" for id in visible_public_workspace_ids])
+ workspace_conditions = " or ".join([
+ _build_odata_eq("public_workspace_id", workspace_id)
+ for workspace_id in visible_public_workspace_ids
+ ])
public_base_filter = f"({workspace_conditions})"
else:
# Fallback to active_public_workspace_id if no visible workspaces
- public_base_filter = f"public_workspace_id eq '{active_public_workspace_id}'"
+ public_base_filter = _build_odata_eq("public_workspace_id", active_public_workspace_id)
public_filter = f"{public_base_filter} and {tags_filter_clause}" if tags_filter_clause else public_base_filter
diff --git a/application/single_app/functions_settings.py b/application/single_app/functions_settings.py
index 8d09ee614..0091d0a38 100644
--- a/application/single_app/functions_settings.py
+++ b/application/single_app/functions_settings.py
@@ -1,5 +1,7 @@
# functions_settings.py
+from flask import has_request_context, session
+
from config import *
from functions_appinsights import log_event
import app_settings_cache
@@ -15,6 +17,43 @@
def is_tabular_processing_enabled(settings):
"""Tabular processing is available whenever enhanced citations is enabled."""
return bool((settings or {}).get('enable_enhanced_citations', False))
+
+
+def _authorize_user_settings_access(user_id, operation, allow_cross_user=False):
+ """Authorize user-settings access for the current request context."""
+ normalized_user_id = str(user_id or '').strip()
+ if allow_cross_user or not has_request_context():
+ return None
+
+ try:
+ # Import locally to avoid a circular dependency during app startup.
+ from functions_authentication import get_current_user_id
+ except ImportError:
+ from application.single_app.functions_authentication import get_current_user_id
+
+ actor_user_id = str(get_current_user_id() or '').strip()
+ if actor_user_id and normalized_user_id and actor_user_id != normalized_user_id:
+ log_event(
+ f"[UserSettings] Denied cross-user {operation}",
+ {
+ "actor_user_id": actor_user_id,
+ "target_user_id": normalized_user_id,
+ "operation": operation,
+ },
+ level=logging.WARNING,
+ )
+ raise PermissionError(f"Cannot {operation} settings for another user.")
+
+ return actor_user_id or None
+
+
+def _should_sync_session_profile(target_user_id, actor_user_id, allow_cross_user=False):
+ """Return True when session-derived profile data should update the target settings doc."""
+ if allow_cross_user or not has_request_context():
+ return False
+ normalized_target_user_id = str(target_user_id or '').strip()
+ normalized_actor_user_id = str(actor_user_id or '').strip()
+ return bool(normalized_target_user_id and normalized_actor_user_id and normalized_target_user_id == normalized_actor_user_id)
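Stripped of the Flask and logging plumbing, the gate added above reduces to a same-user check. A standalone sketch of that core rule (the function name and shape here are illustrative, not part of the patch):

```python
def authorize_same_user(actor_user_id, target_user_id, allow_cross_user=False):
    # Mirrors _authorize_user_settings_access: deny acting on another user's
    # settings unless the caller explicitly opts into cross-user mode.
    if allow_cross_user:
        return None
    actor = str(actor_user_id or '').strip()
    target = str(target_user_id or '').strip()
    if actor and target and actor != target:
        raise PermissionError("Cannot access settings for another user.")
    return actor or None
```

As in the patch, the check only fires when both ids are present and differ; cross-user mode (and, in the real helper, running outside a request context) bypasses it.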
import copy
from support_menu_config import (
get_default_support_latest_features_visibility,
@@ -31,7 +70,8 @@ def get_settings(use_cosmos=False, include_source=False):
import secrets
default_settings = {
# External health check
- 'enable_external_healthcheck': True,
+ 'enable_external_healthcheck': False,
+ 'enable_no_auth_external_healthcheck': False,
# Security settings
'enable_appinsights_global_logging': False,
'enable_debug_logging': False,
@@ -306,7 +346,7 @@ def get_settings(use_cosmos=False, include_source=False):
'enable_web_search': False,
'web_search_consent_accepted': False,
'enable_web_search_user_notice': False, # Show popup to users explaining their message will be sent to Bing
- 'web_search_user_notice_text': 'Your message will be sent to Microsoft Bing for web search. Only your current message is sent, not your conversation history.',
+ 'web_search_user_notice_text': 'Your current message will be sent to Microsoft Bing for web search. Conversation history is not sent for web search, but any sensitive content you paste into this message may be sent.',
'web_search_agent': {
'agent_type': 'aifoundry',
'azure_openai_gpt_endpoint': '',
@@ -351,8 +391,6 @@ def get_settings(use_cosmos=False, include_source=False):
'file_timer_value': 1,
'file_timer_unit': 'hours',
'file_processing_logs_turnoff_time': None,
- 'enable_external_healthcheck': False,
-
# Streaming settings
'streamingEnabled': True,
@@ -1035,9 +1073,14 @@ def decrypt_key(encrypted_key):
)
return None
-def get_user_settings(user_id):
+def get_user_settings(user_id, allow_cross_user=False):
"""Fetches the user settings document from Cosmos DB, ensuring email and display_name are present if possible."""
- from flask import session
+ actor_user_id = _authorize_user_settings_access(user_id, "read", allow_cross_user=allow_cross_user)
+ should_sync_session_profile = _should_sync_session_profile(
+ user_id,
+ actor_user_id,
+ allow_cross_user=allow_cross_user,
+ )
try:
doc = cosmos_user_settings_container.read_item(item=user_id, partition_key=user_id)
updated = False
@@ -1058,27 +1101,62 @@ def get_user_settings(user_id):
doc['settings']['showTutorialButtons'] = True
updated = True
- # Try to update email/display_name if missing and available in session
- user = session.get("user", {})
- email = user.get("preferred_username") or user.get("email")
- display_name = user.get("name")
- if email and doc.get("email") != email:
- doc["email"] = email
- updated = True
- if display_name and doc.get("display_name") != display_name:
- doc["display_name"] = display_name
- updated = True
-
- # Check if profile image needs to be fetched
- if 'profileImage' not in doc['settings']:
+ if should_sync_session_profile:
+ # Try to update email/display_name if missing and available in session
+ user = session.get("user", {})
+ email = user.get("preferred_username") or user.get("email")
+ display_name = user.get("name")
+ if email and doc.get("email") != email:
+ doc["email"] = email
+ updated = True
+ if display_name and doc.get("display_name") != display_name:
+ doc["display_name"] = display_name
+ updated = True
+
+ # Check if profile image needs to be fetched
+ if 'profileImage' not in doc['settings']:
+ from functions_authentication import get_user_profile_image
+ try:
+ profile_image = get_user_profile_image()
+ doc['settings']['profileImage'] = profile_image
+ updated = True
+ except Exception as e:
+ log_event(
+ "Could not fetch profile image for existing user.",
+ extra={
+ "user_id": user_id,
+ "error": str(e)
+ },
+ level=logging.WARNING
+ )
+ doc['settings']['profileImage'] = None
+ updated = True
+
+ if updated:
+ cosmos_user_settings_container.upsert_item(body=doc)
+ return doc
+ except exceptions.CosmosResourceNotFoundError:
+ # Return a default structure if the user has no settings saved yet
+ doc = {"id": user_id, "settings": {}}
+ doc["settings"]["personal_model_endpoints"] = []
+ doc["settings"]["showTutorialButtons"] = True
+ if should_sync_session_profile:
+ user = session.get("user", {})
+ email = user.get("preferred_username") or user.get("email")
+ display_name = user.get("name")
+ if email:
+ doc["email"] = email
+ if display_name:
+ doc["display_name"] = display_name
+
+ # Try to fetch profile image for new user
from functions_authentication import get_user_profile_image
try:
profile_image = get_user_profile_image()
doc['settings']['profileImage'] = profile_image
- updated = True
except Exception as e:
log_event(
- "Could not fetch profile image for existing user.",
+ "Could not fetch profile image for new user.",
extra={
"user_id": user_id,
"error": str(e)
@@ -1086,39 +1164,6 @@ def get_user_settings(user_id):
level=logging.WARNING
)
doc['settings']['profileImage'] = None
- updated = True
-
- if updated:
- cosmos_user_settings_container.upsert_item(body=doc)
- return doc
- except exceptions.CosmosResourceNotFoundError:
- # Return a default structure if the user has no settings saved yet
- user = session.get("user", {})
- email = user.get("preferred_username") or user.get("email")
- display_name = user.get("name")
- doc = {"id": user_id, "settings": {}}
- doc["settings"]["personal_model_endpoints"] = []
- doc["settings"]["showTutorialButtons"] = True
- if email:
- doc["email"] = email
- if display_name:
- doc["display_name"] = display_name
-
- # Try to fetch profile image for new user
- from functions_authentication import get_user_profile_image
- try:
- profile_image = get_user_profile_image()
- doc['settings']['profileImage'] = profile_image
- except Exception as e:
- log_event(
- "Could not fetch profile image for new user.",
- extra={
- "user_id": user_id,
- "error": str(e)
- },
- level=logging.WARNING
- )
- doc['settings']['profileImage'] = None
cosmos_user_settings_container.upsert_item(body=doc)
return doc
@@ -1134,7 +1179,7 @@ def get_user_settings(user_id):
)
raise # Re-raise the exception to be handled by the route
-def update_user_settings(user_id, settings_to_update):
+def update_user_settings(user_id, settings_to_update, allow_cross_user=False):
"""
Updates or creates user settings in Cosmos DB, merging new settings
into the existing 'settings' sub-dictionary and updating 'lastUpdated'.
@@ -1147,8 +1192,21 @@ def update_user_settings(user_id, settings_to_update):
Returns:
bool: True if the update was successful, False otherwise.
"""
+ actor_user_id = _authorize_user_settings_access(
+ user_id,
+ "update",
+ allow_cross_user=allow_cross_user,
+ )
sanitized_settings_to_update = sanitize_settings_for_logging(settings_to_update)
- log_event("[UserSettings] Update Attempt", {"user_id": user_id, "settings_to_update": sanitized_settings_to_update})
+ log_event(
+ "[UserSettings] Update Attempt",
+ {
+ "user_id": user_id,
+ "actor_user_id": actor_user_id,
+ "allow_cross_user": allow_cross_user,
+ "settings_to_update": sanitized_settings_to_update,
+ },
+ )
try:
diff --git a/application/single_app/requirements.txt b/application/single_app/requirements.txt
index 48aa08772..ec6817567 100644
--- a/application/single_app/requirements.txt
+++ b/application/single_app/requirements.txt
@@ -56,4 +56,4 @@ pyyaml==6.0.2
aiohttp==3.13.4
html2text==2025.4.15
matplotlib==3.10.7
-azure-cognitiveservices-speech==1.47.0
\ No newline at end of file
+azure-cognitiveservices-speech==1.48.2
\ No newline at end of file
diff --git a/application/single_app/route_backend_chats.py b/application/single_app/route_backend_chats.py
index e16d72423..4d31db45f 100644
--- a/application/single_app/route_backend_chats.py
+++ b/application/single_app/route_backend_chats.py
@@ -218,6 +218,235 @@ def build_fact_memory_citation(query_text, matched_facts, search_mode):
}
+def _normalize_requested_scope_ids(*scope_values):
+ """Normalize single-value and list-based scope ids into a de-duplicated list."""
+ normalized_values = []
+ for scope_value in scope_values:
+ if scope_value is None:
+ continue
+
+ if isinstance(scope_value, (list, tuple, set)):
+ candidates = list(scope_value)
+ else:
+ candidates = [scope_value]
+
+ for candidate in candidates:
+ normalized_candidate = str(candidate or '').strip()
+ if not normalized_candidate or normalized_candidate in normalized_values:
+ continue
+ normalized_values.append(normalized_candidate)
+
+ return normalized_values
+
+
+def _get_authorized_chat_scope_context(
+ user_id,
+ active_group_id=None,
+ active_group_ids=None,
+ active_public_workspace_id=None,
+ active_public_workspace_ids=None,
+):
+ """Filter request-provided chat scopes down to the caller's current access."""
+ requested_group_ids = _normalize_requested_scope_ids(active_group_ids, active_group_id)
+ allowed_group_ids = []
+ for group_id in requested_group_ids:
+ group_doc = find_group_by_id(group_id)
+ if group_doc and get_user_role_in_group(group_doc, user_id):
+ allowed_group_ids.append(group_id)
+
+ requested_public_workspace_ids = _normalize_requested_scope_ids(
+ active_public_workspace_ids,
+ active_public_workspace_id,
+ )
+ visible_public_workspace_ids = set(
+ _normalize_requested_scope_ids(get_user_visible_public_workspace_ids_from_settings(user_id) or [])
+ )
+ allowed_public_workspace_ids = [
+ workspace_id
+ for workspace_id in requested_public_workspace_ids
+ if workspace_id in visible_public_workspace_ids
+ ]
+
+ return {
+ 'active_group_ids': allowed_group_ids,
+ 'active_group_id': allowed_group_ids[0] if allowed_group_ids else None,
+ 'active_public_workspace_ids': allowed_public_workspace_ids,
+ 'active_public_workspace_id': (
+ allowed_public_workspace_ids[0] if allowed_public_workspace_ids else None
+ ),
+ }
+
+
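The scope context above boils down to an order-preserving intersection of what the request asked for with what the server says the caller may use. A minimal sketch of that core step (helper name is illustrative):

```python
def filter_to_authorized(requested_ids, allowed_ids):
    # Keep only requested scope ids the caller can actually access,
    # preserving request order; never trust the request list as-is.
    allowed = set(allowed_ids)
    return [scope_id for scope_id in requested_ids if scope_id in allowed]


# The first surviving id doubles as the backwards-compatible single value:
group_ids = filter_to_authorized(['g9', 'g2', 'g1'], {'g1', 'g2'})
active_group_id = group_ids[0] if group_ids else None
```

In the real helper the allow list comes from membership lookups (`get_user_role_in_group`) and visible-workspace settings rather than a precomputed set.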
+def _set_authorized_chat_request_context(user_id, conversation_id, scope_context):
+ """Persist the canonical request authorization context for downstream plugin checks."""
+ authorized_context = {
+ 'user_id': user_id,
+ 'conversation_id': conversation_id,
+ 'active_group_ids': list(scope_context.get('active_group_ids') or []),
+ 'active_group_id': scope_context.get('active_group_id'),
+ 'active_public_workspace_ids': list(scope_context.get('active_public_workspace_ids') or []),
+ 'active_public_workspace_id': scope_context.get('active_public_workspace_id'),
+ }
+ authorized_context['fact_memory_scope_id'] = authorized_context['active_group_id'] or user_id
+ authorized_context['fact_memory_scope_type'] = (
+ 'group' if authorized_context['active_group_id'] else 'user'
+ )
+
+ g.conversation_id = conversation_id
+ g.authorized_chat_context = authorized_context
+ return authorized_context
+
+
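Downstream plugin code is expected to read this request-scoped context rather than trusting raw tool-call arguments. A hedged sketch of that consumption pattern (the helper and argument names are assumptions; `active_group_ids` / `active_group_id` match the context dict built above):

```python
def resolve_group_scope_for_plugin(authorized_context, tool_args):
    # Accept a tool-supplied group_id only when it is already in the
    # server-derived allow list; otherwise fall back to the canonical scope.
    requested = str(tool_args.get('group_id') or '').strip()
    allowed = authorized_context.get('active_group_ids') or []
    if requested and requested in allowed:
        return requested
    return authorized_context.get('active_group_id')
```

A model-invented or attacker-echoed `group_id` therefore degrades to the already-authorized active group instead of widening access.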
+def _resolve_chat_selected_document_metadata(document_id, user_id=None, document_scope='personal',
+ active_group_id=None, active_group_ids=None,
+ active_public_workspace_id=None,
+ active_public_workspace_ids=None):
+ """Resolve selected-document metadata using the authorized chat scope model."""
+ normalized_document_id = str(document_id or '').strip()
+ if not normalized_document_id or normalized_document_id == 'all':
+ return None
+
+ normalized_scope = str(document_scope or 'personal').strip().lower()
+ authorized_group_ids = _normalize_requested_scope_ids(active_group_ids, active_group_id)
+ authorized_public_workspace_ids = _normalize_requested_scope_ids(
+ active_public_workspace_ids,
+ active_public_workspace_id,
+ )
+
+ resolution_queries = []
+
+ if normalized_scope in {'personal', 'workspace', 'all'} and user_id:
+ resolution_queries.append({
+ 'source_hint': 'workspace',
+ 'cosmos_container': cosmos_user_documents_container,
+ 'query': """
+ SELECT TOP 1 c.id, c.file_name, c.title, c.group_id, c.public_workspace_id
+ FROM c
+ WHERE c.id = @doc_id
+ AND (
+ c.user_id = @user_id
+ OR ARRAY_CONTAINS(c.shared_user_ids, @user_id)
+ OR EXISTS(SELECT VALUE s FROM s IN c.shared_user_ids WHERE STARTSWITH(s, @user_id_prefix))
+ )
+ ORDER BY c.version DESC
+ """,
+ 'parameters': [
+ {'name': '@doc_id', 'value': normalized_document_id},
+ {'name': '@user_id', 'value': user_id},
+ {'name': '@user_id_prefix', 'value': f"{user_id},"},
+ ],
+ })
+
+ if normalized_scope in {'group', 'all'}:
+ for group_id in authorized_group_ids:
+ resolution_queries.append({
+ 'source_hint': 'group',
+ 'cosmos_container': cosmos_group_documents_container,
+ 'query': """
+ SELECT TOP 1 c.id, c.file_name, c.title, c.group_id, c.public_workspace_id
+ FROM c
+ WHERE c.id = @doc_id
+ AND (
+ c.group_id = @group_id
+ OR ARRAY_CONTAINS(c.shared_group_ids, @group_id)
+ OR ARRAY_CONTAINS(c.shared_group_ids, @group_id_approved)
+ )
+ ORDER BY c.version DESC
+ """,
+ 'parameters': [
+ {'name': '@doc_id', 'value': normalized_document_id},
+ {'name': '@group_id', 'value': group_id},
+ {'name': '@group_id_approved', 'value': f"{group_id},approved"},
+ ],
+ })
+
+ if normalized_scope in {'public', 'all'}:
+ for public_workspace_id in authorized_public_workspace_ids:
+ resolution_queries.append({
+ 'source_hint': 'public',
+ 'cosmos_container': cosmos_public_documents_container,
+ 'query': """
+ SELECT TOP 1 c.id, c.file_name, c.title, c.group_id, c.public_workspace_id
+ FROM c
+ WHERE c.id = @doc_id
+ AND c.public_workspace_id = @public_workspace_id
+ ORDER BY c.version DESC
+ """,
+ 'parameters': [
+ {'name': '@doc_id', 'value': normalized_document_id},
+ {'name': '@public_workspace_id', 'value': public_workspace_id},
+ ],
+ })
+
+ for resolution_query in resolution_queries:
+ doc_results = list(resolution_query['cosmos_container'].query_items(
+ query=resolution_query['query'],
+ parameters=resolution_query['parameters'],
+ enable_cross_partition_query=True,
+ ))
+ if not doc_results:
+ continue
+
+ doc_info = dict(doc_results[0])
+ doc_info['source_hint'] = resolution_query['source_hint']
+ return doc_info
+
+ return None
+
+
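The Cosmos queries above encode shares as either a bare principal id or an `'<id>,<status>'` string: `@group_id_approved` is built as `f"{group_id},approved"`, and the personal query's STARTSWITH uses the `f"{user_id},"` prefix so `u1` cannot match an entry for `u10`. A small matcher showing that encoding (names illustrative; any status vocabulary beyond `approved` is an assumption):

```python
def share_entry_grants(entry, principal_id, require_approved=False):
    # Split '<id>' or '<id>,<status>' on the first comma; the comma
    # delimiter is what stops 'u1' from matching an entry for 'u10'.
    base, _, status = entry.partition(',')
    if base != principal_id:
        return False
    return (status == 'approved') if require_approved else True
```

The search filters in the earlier hunks require the `,approved` suffix, while the Cosmos resolution query accepts any status via the prefix match.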
+def _create_personal_conversation(user_id, conversation_id=None):
+ """Create and persist a new personal conversation owned by the current user."""
+ resolved_conversation_id = str(conversation_id or uuid.uuid4())
+ conversation_item = {
+ 'id': resolved_conversation_id,
+ 'user_id': user_id,
+ 'last_updated': datetime.utcnow().isoformat(),
+ 'title': 'New Conversation',
+ 'context': [],
+ 'tags': [],
+ 'strict': False,
+ 'chat_type': 'new'
+ }
+ cosmos_conversations_container.upsert_item(conversation_item)
+
+ log_conversation_creation(
+ user_id=user_id,
+ conversation_id=resolved_conversation_id,
+ title='New Conversation',
+ workspace_type='personal'
+ )
+
+ conversation_item['added_to_activity_log'] = True
+ cosmos_conversations_container.upsert_item(conversation_item)
+ return conversation_item
+
+
+def _authorize_personal_conversation_access(user_id, conversation_id):
+ """Load a personal conversation and ensure the caller owns it."""
+ try:
+ conversation_item = cosmos_conversations_container.read_item(
+ item=conversation_id,
+ partition_key=conversation_id,
+ )
+ except CosmosResourceNotFoundError as exc:
+ raise LookupError(f"Conversation {conversation_id} not found") from exc
+
+ if conversation_item.get('user_id') != user_id:
+ raise PermissionError('You can only access your own conversations')
+
+ return conversation_item
+
+
+def _resolve_or_create_authorized_personal_conversation(user_id, conversation_id):
+ """Create new personal conversations server-side or load an authorized existing one."""
+ if not conversation_id:
+ conversation_item = _create_personal_conversation(user_id)
+ return conversation_item, conversation_item['id']
+
+ conversation_item = _authorize_personal_conversation_access(user_id, conversation_id)
+ return conversation_item, conversation_id
+
+
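The ownership check in `_authorize_personal_conversation_access` is the load-bearing piece: a missing document and a document owned by someone else must fail differently so routes can map them to 404 versus 403. The same flow against an in-memory store (store shape is illustrative):

```python
def authorize_personal_conversation(conversation_store, user_id, conversation_id):
    # Missing -> LookupError (404), wrong owner -> PermissionError (403),
    # otherwise return the conversation document unchanged.
    item = conversation_store.get(conversation_id)
    if item is None:
        raise LookupError(f"Conversation {conversation_id} not found")
    if item.get('user_id') != user_id:
        raise PermissionError('You can only access your own conversations')
    return item
```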
def build_instruction_memory_payload(
scope_id,
scope_type,
@@ -5127,8 +5356,10 @@ def infer_tabular_source_context_from_document(source_doc, document_scope='perso
def get_selected_workspace_tabular_file_contexts(selected_document_ids=None, selected_document_id=None,
- document_scope='personal', active_group_id=None,
- active_public_workspace_id=None):
+ document_scope='personal', user_id=None,
+ active_group_id=None, active_group_ids=None,
+ active_public_workspace_id=None,
+ active_public_workspace_ids=None):
"""Resolve explicitly selected workspace documents and return tabular source contexts."""
selected_ids = list(selected_document_ids or [])
if not selected_ids and selected_document_id and selected_document_id != 'all':
@@ -5144,33 +5375,26 @@ def get_selected_workspace_tabular_file_contexts(selected_document_ids=None, sel
continue
try:
- doc_query = (
- "SELECT TOP 1 c.file_name, c.title, c.group_id, c.public_workspace_id "
- "FROM c WHERE c.id = @doc_id "
- "ORDER BY c.version DESC"
+ doc_info = _resolve_chat_selected_document_metadata(
+ doc_id,
+ user_id=user_id,
+ document_scope=document_scope,
+ active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
+ active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
)
- doc_params = [{"name": "@doc_id", "value": doc_id}]
-
- for source_hint, cosmos_container in get_document_containers_for_scope(document_scope):
- doc_results = list(cosmos_container.query_items(
- query=doc_query,
- parameters=doc_params,
- enable_cross_partition_query=True
- ))
-
- if not doc_results:
- continue
+ if not doc_info:
+ continue
- doc_info = doc_results[0]
- file_context = build_tabular_file_context(
- doc_info.get('file_name') or doc_info.get('title'),
- source_hint=source_hint,
- group_id=doc_info.get('group_id') or active_group_id,
- public_workspace_id=doc_info.get('public_workspace_id') or active_public_workspace_id,
- )
- if file_context:
- tabular_file_contexts.append(file_context)
- break
+ file_context = build_tabular_file_context(
+ doc_info.get('file_name') or doc_info.get('title'),
+ source_hint=doc_info.get('source_hint', 'workspace'),
+ group_id=doc_info.get('group_id') or active_group_id,
+ public_workspace_id=doc_info.get('public_workspace_id') or active_public_workspace_id,
+ )
+ if file_context:
+ tabular_file_contexts.append(file_context)
except Exception as e:
log_event(
f"[Tabular SK Analysis] Failed to resolve selected document '{doc_id}': {e}",
@@ -5182,7 +5406,10 @@ def get_selected_workspace_tabular_file_contexts(selected_document_ids=None, sel
def collect_workspace_tabular_file_contexts(combined_documents=None, selected_document_ids=None,
selected_document_id=None, document_scope='personal',
- active_group_id=None, active_public_workspace_id=None):
+ user_id=None, active_group_id=None,
+ active_group_ids=None,
+ active_public_workspace_id=None,
+ active_public_workspace_ids=None):
"""Collect tabular source contexts from search results and explicit workspace selection."""
tabular_file_contexts = []
@@ -5200,8 +5427,11 @@ def collect_workspace_tabular_file_contexts(combined_documents=None, selected_do
selected_document_ids=selected_document_ids,
selected_document_id=selected_document_id,
document_scope=document_scope,
+ user_id=user_id,
active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
))
return dedupe_tabular_file_contexts(tabular_file_contexts)
@@ -5209,15 +5439,21 @@ def collect_workspace_tabular_file_contexts(combined_documents=None, selected_do
def collect_workspace_tabular_filenames(combined_documents=None, selected_document_ids=None,
selected_document_id=None, document_scope='personal',
- active_group_id=None, active_public_workspace_id=None):
+ user_id=None, active_group_id=None,
+ active_group_ids=None,
+ active_public_workspace_id=None,
+ active_public_workspace_ids=None):
"""Collect unique tabular filenames from search results and explicit workspace selection."""
tabular_file_contexts = collect_workspace_tabular_file_contexts(
combined_documents=combined_documents,
selected_document_ids=selected_document_ids,
selected_document_id=selected_document_id,
document_scope=document_scope,
+ user_id=user_id,
active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
)
return {file_context['file_name'] for file_context in tabular_file_contexts}
@@ -6031,22 +6267,19 @@ def result_requires_message_reload(result: Any) -> bool:
active_group_id = data.get('active_group_id')
active_group_ids = data.get('active_group_ids', [])
- # Backwards compat: if new list not provided, wrap single ID
- if not active_group_ids and active_group_id:
- active_group_ids = [active_group_id]
- # Permission validation: only keep groups user is a member of
- validated_group_ids = []
- for gid in active_group_ids:
- g_doc = find_group_by_id(gid)
- if g_doc and get_user_role_in_group(g_doc, user_id):
- validated_group_ids.append(gid)
- active_group_ids = validated_group_ids
- # Keep single ID for backwards compat in metadata/context
- active_group_id = active_group_ids[0] if active_group_ids else data.get('active_group_id')
active_public_workspace_id = data.get('active_public_workspace_id') # Extract active public workspace ID
active_public_workspace_ids = data.get('active_public_workspace_ids', [])
- if not active_public_workspace_ids and active_public_workspace_id:
- active_public_workspace_ids = [active_public_workspace_id]
+ scope_context = _get_authorized_chat_scope_context(
+ user_id,
+ active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
+ active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
+ )
+ active_group_ids = scope_context['active_group_ids']
+ active_group_id = scope_context['active_group_id']
+ active_public_workspace_ids = scope_context['active_public_workspace_ids']
+ active_public_workspace_id = scope_context['active_public_workspace_id']
frontend_gpt_model = data.get('model_deployment')
top_n_results = data.get('top_n') # Extract top_n parameter from request
classifications_to_send = data.get('classifications') # Extract classifications parameter from request
@@ -6064,20 +6297,12 @@ def result_requires_message_reload(result: Any) -> bool:
operation_type = 'Edit' if is_edit else 'Retry'
debug_print(f"🔍 Chat API - {operation_type} detected! user_message_id={retry_user_message_id}, thread_id={retry_thread_id}, attempt={retry_thread_attempt}")
- # Store conversation_id in Flask context for plugin logger access
- g.conversation_id = conversation_id
-
- # Clear plugin invocations at start of message processing to ensure
- # each message only shows citations for tools executed during that specific interaction
- from semantic_kernel_plugins.plugin_invocation_logger import get_plugin_logger
- plugin_logger = get_plugin_logger()
- plugin_logger.clear_invocations_for_conversation(user_id, conversation_id)
-
# Validate chat_type
if chat_type not in ('user', 'group'):
chat_type = 'user'
search_query = user_message # <--- ADD THIS LINE (Initialize search_query)
+ web_search_query_text = build_web_search_query_text(user_message)
hybrid_citations_list = [] # <--- ADD THIS LINE (Initialize hybrid list)
agent_citations_list = [] # <--- ADD THIS LINE (Initialize agent citations list)
web_search_citations_list = []
@@ -6236,65 +6461,25 @@ def result_requires_message_reload(result: Any) -> bool:
# ---------------------------------------------------------------------
# 1) Load or create conversation
# ---------------------------------------------------------------------
- if not conversation_id:
- conversation_id = str(uuid.uuid4())
- conversation_item = {
- 'id': conversation_id,
- 'user_id': user_id,
- 'last_updated': datetime.utcnow().isoformat(),
- 'title': 'New Conversation',
- 'context': [],
- 'tags': [],
- 'strict': False,
- 'chat_type': 'new'
- }
- cosmos_conversations_container.upsert_item(conversation_item)
-
- # Log conversation creation
- log_conversation_creation(
- user_id=user_id,
- conversation_id=conversation_id,
- title='New Conversation',
- workspace_type='personal'
+ try:
+ conversation_item, conversation_id = _resolve_or_create_authorized_personal_conversation(
+ user_id,
+ conversation_id,
)
-
- # Mark as logged to activity logs to prevent duplicate migration
- conversation_item['added_to_activity_log'] = True
- cosmos_conversations_container.upsert_item(conversation_item)
- else:
- try:
- conversation_item = cosmos_conversations_container.read_item(item=conversation_id, partition_key=conversation_id)
- except CosmosResourceNotFoundError:
- # If conversation ID is provided but not found, create a new one with that ID
- # Or decide if you want to return an error instead
- conversation_item = {
- 'id': conversation_id, # Keep the provided ID if needed for linking
- 'user_id': user_id,
- 'last_updated': datetime.utcnow().isoformat(),
- 'title': 'New Conversation', # Or maybe fetch title differently?
- 'context': [],
- 'tags': [],
- 'strict': False,
- 'chat_type': 'new'
- }
- # Optionally log that a conversation was expected but not found
- debug_print(f"Warning: Conversation ID {conversation_id} not found, creating new.")
- cosmos_conversations_container.upsert_item(conversation_item)
-
- # Log conversation creation
- log_conversation_creation(
- user_id=user_id,
- conversation_id=conversation_id,
- title='New Conversation',
- workspace_type='personal'
- )
-
- # Mark as logged to activity logs to prevent duplicate migration
- conversation_item['added_to_activity_log'] = True
- cosmos_conversations_container.upsert_item(conversation_item)
- except Exception as e:
- debug_print(f"Error reading conversation {conversation_id}: {e}")
- return jsonify({'error': f'Error reading conversation: {str(e)}'}), 500
+ except LookupError:
+ return jsonify({'error': 'Conversation not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
+ except Exception as e:
+ debug_print(f"Error reading conversation {conversation_id}: {e}")
+ return jsonify({'error': f'Error reading conversation: {str(e)}'}), 500
+
+ _set_authorized_chat_request_context(user_id, conversation_id, scope_context)
+
+ # Clear plugin invocations at start of message processing to ensure
+ # each message only shows citations for tools executed during that specific interaction
+ plugin_logger = get_plugin_logger()
+ plugin_logger.clear_invocations_for_conversation(user_id, conversation_id)
# Determine the actual chat context based on existing conversation or document usage
# For existing conversations, use the chat_type from conversation metadata
@@ -6404,21 +6589,16 @@ def result_requires_message_reload(result: Any) -> bool:
# Get document details if specific document selected
if selected_document_id and selected_document_id != "all":
try:
- # Use the appropriate documents container based on scope
- if document_scope == 'group':
- cosmos_container = cosmos_group_documents_container
- elif document_scope == 'public':
- cosmos_container = cosmos_public_documents_container
- elif document_scope == 'personal':
- cosmos_container = cosmos_user_documents_container
-
- doc_query = "SELECT c.file_name, c.title, c.document_id, c.group_id FROM c WHERE c.id = @doc_id"
- doc_params = [{"name": "@doc_id", "value": selected_document_id}]
- doc_results = list(cosmos_container.query_items(
- query=doc_query, parameters=doc_params, enable_cross_partition_query=True
- ))
- if doc_results and 'workspace_search' in user_metadata:
- doc_info = doc_results[0]
+ doc_info = _resolve_chat_selected_document_metadata(
+ selected_document_id,
+ user_id=user_id,
+ document_scope=document_scope,
+ active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
+ active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
+ )
+ if doc_info and 'workspace_search' in user_metadata:
user_metadata['workspace_search']['document_name'] = doc_info.get('title') or doc_info.get('file_name')
user_metadata['workspace_search']['document_filename'] = doc_info.get('file_name')
except Exception as e:
@@ -6802,7 +6982,11 @@ def result_requires_message_reload(result: Any) -> bool:
fallback_search_parameters = build_prior_grounded_document_search_parameters(
prior_grounded_document_refs
)
- if fallback_search_parameters.get('document_ids'):
+ fallback_search_parameters = revalidate_prior_grounded_document_search_parameters(
+ user_id,
+ fallback_search_parameters,
+ )
+ if fallback_search_parameters.get('document_ids') and fallback_search_parameters.get('doc_scope'):
history_grounded_search_used = True
effective_document_scope = fallback_search_parameters.get('doc_scope') or 'all'
effective_selected_document_ids = list(
@@ -6889,6 +7073,7 @@ def result_requires_message_reload(result: Any) -> bool:
# Filter out inactive thread messages before summarizing
message_texts_search = []
for msg in last_messages_asc:
+ role = msg.get('role', 'user')
thread_info = msg.get('metadata', {}).get('thread_info', {})
active_thread = thread_info.get('active_thread')
@@ -6896,8 +7081,15 @@ def result_requires_message_reload(result: Any) -> bool:
if active_thread is False:
debug_print(f"[THREAD] Skipping inactive thread message {msg.get('id')} from search summary")
continue
-
- message_texts_search.append(f"{msg.get('role', 'user').upper()}: {msg.get('content', '')}")
+
+ if role not in ('user', 'assistant'):
+ continue
+
+ content = msg.get('content', '')
+ if role == 'assistant':
+ content = build_assistant_history_content_with_citations(msg, content)
+
+ message_texts_search.append(f"{role.upper()}: {content}")
if not message_texts_search:
# No active messages to summarize
@@ -7614,8 +7806,11 @@ def result_requires_message_reload(result: Any) -> bool:
selected_document_ids=effective_selected_document_ids,
selected_document_id=effective_selected_document_id,
document_scope=effective_document_scope,
+ user_id=user_id,
active_group_id=effective_active_group_id,
+ active_group_ids=effective_active_group_ids,
active_public_workspace_id=effective_active_public_workspace_id,
+ active_public_workspace_ids=effective_active_public_workspace_ids,
)
workspace_tabular_files = {
file_context['file_name'] for file_context in workspace_tabular_file_contexts
@@ -7689,7 +7884,7 @@ def result_requires_message_reload(result: Any) -> bool:
)
if web_search_enabled:
- thought_tracker.add_thought('web_search', f"Searching the web for '{(search_query or user_message)[:50]}'")
+ thought_tracker.add_thought('web_search', f"Searching the web for '{web_search_query_text[:50]}'")
perform_web_search(
settings=settings,
conversation_id=conversation_id,
@@ -7700,7 +7895,7 @@ def result_requires_message_reload(result: Any) -> bool:
document_scope=document_scope,
active_group_id=active_group_id,
active_public_workspace_id=active_public_workspace_id,
- search_query=search_query,
+ web_search_query_text=web_search_query_text,
system_messages_for_augmentation=system_messages_for_augmentation,
agent_citations_list=agent_citations_list,
web_search_citations_list=web_search_citations_list,
@@ -8906,9 +9101,32 @@ def chat_stream_api():
compatibility_mode = bool(data.get('image_generation')) or is_retry
requested_conversation_id = str(data.get('conversation_id') or '').strip() or None
+
+ if requested_conversation_id:
+ try:
+ _authorize_personal_conversation_access(user_id, requested_conversation_id)
+ except LookupError:
+ return jsonify({'error': 'Conversation not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
+ except Exception as exc:
+ debug_print(f"[Streaming] Error authorizing conversation {requested_conversation_id}: {exc}")
+ return jsonify({'error': 'Failed to authorize conversation'}), 500
+
+ initial_scope_context = _get_authorized_chat_scope_context(
+ user_id,
+ active_group_id=data.get('active_group_id'),
+ active_group_ids=data.get('active_group_ids', []),
+ active_public_workspace_id=data.get('active_public_workspace_id'),
+ active_public_workspace_ids=data.get('active_public_workspace_ids', []),
+ )
finalized_conversation_id = requested_conversation_id or str(uuid.uuid4())
is_new_stream_conversation = requested_conversation_id is None
data['conversation_id'] = finalized_conversation_id
+ data['active_group_ids'] = list(initial_scope_context['active_group_ids'])
+ data['active_group_id'] = initial_scope_context['active_group_id']
+ data['active_public_workspace_ids'] = list(initial_scope_context['active_public_workspace_ids'])
+ data['active_public_workspace_id'] = initial_scope_context['active_public_workspace_id']
stream_session = CHAT_STREAM_REGISTRY.start_session(user_id, finalized_conversation_id)
request_message = (data.get('message') or '').strip()
@@ -9047,22 +9265,19 @@ def generate(publish_background_event=None):
tags_filter = data.get('tags', []) # Extract tags filter
active_group_id = data.get('active_group_id')
active_group_ids = data.get('active_group_ids', [])
- # Backwards compat: if new list not provided, wrap single ID
- if not active_group_ids and active_group_id:
- active_group_ids = [active_group_id]
- # Permission validation: only keep groups user is a member of
- validated_group_ids = []
- for gid in active_group_ids:
- g_doc = find_group_by_id(gid)
- if g_doc and get_user_role_in_group(g_doc, user_id):
- validated_group_ids.append(gid)
- active_group_ids = validated_group_ids
- # Keep single ID for backwards compat in metadata/context
- active_group_id = active_group_ids[0] if active_group_ids else data.get('active_group_id')
active_public_workspace_id = data.get('active_public_workspace_id') # Extract active public workspace ID
active_public_workspace_ids = data.get('active_public_workspace_ids', [])
- if not active_public_workspace_ids and active_public_workspace_id:
- active_public_workspace_ids = [active_public_workspace_id]
+ scope_context = _get_authorized_chat_scope_context(
+ user_id,
+ active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
+ active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
+ )
+ active_group_ids = scope_context['active_group_ids']
+ active_group_id = scope_context['active_group_id']
+ active_public_workspace_ids = scope_context['active_public_workspace_ids']
+ active_public_workspace_id = scope_context['active_public_workspace_id']
frontend_gpt_model = data.get('model_deployment')
frontend_model_id = data.get('model_id')
frontend_model_endpoint_id = data.get('model_endpoint_id')
@@ -9156,11 +9371,9 @@ def generate(publish_background_event=None):
yield f"data: {json.dumps({'error': 'Image generation is not supported in streaming mode'})}\n\n"
return
- # Initialize Flask context
- g.conversation_id = conversation_id
+ _set_authorized_chat_request_context(user_id, conversation_id, scope_context)
# Clear plugin invocations
- from semantic_kernel_plugins.plugin_invocation_logger import get_plugin_logger
plugin_logger = get_plugin_logger()
plugin_logger.clear_invocations_for_conversation(user_id, conversation_id)
debug_print(
@@ -9175,6 +9388,7 @@ def generate(publish_background_event=None):
# Initialize variables
search_query = user_message
+ web_search_query_text = build_web_search_query_text(user_message)
hybrid_citations_list = []
agent_citations_list = []
web_search_citations_list = []
@@ -9323,37 +9537,18 @@ def generate(publish_background_event=None):
# Load or create conversation (simplified)
if is_new_stream_conversation:
- conversation_item = {
- 'id': conversation_id,
- 'user_id': user_id,
- 'last_updated': datetime.utcnow().isoformat(),
- 'title': 'New Conversation',
- 'context': [],
- 'tags': [],
- 'strict': False,
- 'chat_type': 'new'
- }
- cosmos_conversations_container.upsert_item(conversation_item)
+ conversation_item = _create_personal_conversation(user_id, conversation_id=conversation_id)
debug_print(f"[Streaming] Created new conversation {conversation_id}")
else:
try:
- conversation_item = cosmos_conversations_container.read_item(
- item=conversation_id, partition_key=conversation_id
- )
+ conversation_item = _authorize_personal_conversation_access(user_id, conversation_id)
debug_print(f"[Streaming] Loaded existing conversation {conversation_id}")
- except CosmosResourceNotFoundError:
- conversation_item = {
- 'id': conversation_id,
- 'user_id': user_id,
- 'last_updated': datetime.utcnow().isoformat(),
- 'title': 'New Conversation',
- 'context': [],
- 'tags': [],
- 'strict': False,
- 'chat_type': 'new'
- }
- cosmos_conversations_container.upsert_item(conversation_item)
- debug_print(f"[Streaming] Conversation {conversation_id} not found; created replacement")
+ except LookupError:
+ yield f"data: {json.dumps({'error': 'Conversation not found'})}\n\n"
+ return
+ except PermissionError:
+ yield f"data: {json.dumps({'error': 'Forbidden'})}\n\n"
+ return
# Determine chat type
actual_chat_type = 'personal_single_user'
@@ -9402,21 +9597,16 @@ def generate(publish_background_event=None):
# Get document details if specific document selected
if selected_document_id and selected_document_id != "all":
try:
- # Use the appropriate documents container based on scope
- if document_scope == 'group':
- cosmos_container = cosmos_group_documents_container
- elif document_scope == 'public':
- cosmos_container = cosmos_public_documents_container
- elif document_scope == 'personal':
- cosmos_container = cosmos_user_documents_container
-
- doc_query = "SELECT c.file_name, c.title, c.document_id, c.group_id FROM c WHERE c.id = @doc_id"
- doc_params = [{"name": "@doc_id", "value": selected_document_id}]
- doc_results = list(cosmos_container.query_items(
- query=doc_query, parameters=doc_params, enable_cross_partition_query=True
- ))
- if doc_results:
- doc_info = doc_results[0]
+ doc_info = _resolve_chat_selected_document_metadata(
+ selected_document_id,
+ user_id=user_id,
+ document_scope=document_scope,
+ active_group_id=active_group_id,
+ active_group_ids=active_group_ids,
+ active_public_workspace_id=active_public_workspace_id,
+ active_public_workspace_ids=active_public_workspace_ids,
+ )
+ if doc_info:
user_metadata['workspace_search']['document_name'] = doc_info.get('title') or doc_info.get('file_name')
user_metadata['workspace_search']['document_filename'] = doc_info.get('file_name')
except Exception as e:
@@ -9744,7 +9934,11 @@ def publish_live_plugin_thought(thought_payload):
fallback_search_parameters = build_prior_grounded_document_search_parameters(
prior_grounded_document_refs
)
- if fallback_search_parameters.get('document_ids'):
+ fallback_search_parameters = revalidate_prior_grounded_document_search_parameters(
+ user_id,
+ fallback_search_parameters,
+ )
+ if fallback_search_parameters.get('document_ids') and fallback_search_parameters.get('doc_scope'):
history_grounded_search_used = True
effective_document_scope = fallback_search_parameters.get('doc_scope') or 'all'
effective_selected_document_ids = list(
@@ -10072,8 +10266,11 @@ def publish_live_plugin_thought(thought_payload):
selected_document_ids=effective_selected_document_ids,
selected_document_id=effective_selected_document_id,
document_scope=effective_document_scope,
+ user_id=user_id,
active_group_id=effective_active_group_id,
+ active_group_ids=effective_active_group_ids,
active_public_workspace_id=effective_active_public_workspace_id,
+ active_public_workspace_ids=effective_active_public_workspace_ids,
)
workspace_tabular_files = {
file_context['file_name'] for file_context in workspace_tabular_file_contexts
@@ -10160,7 +10357,7 @@ def publish_live_plugin_thought(thought_payload):
debug_print(
f"[Streaming] Starting web search augmentation for conversation_id={conversation_id}"
)
- yield emit_thought('web_search', f"Searching the web for '{(search_query or user_message)[:50]}'")
+ yield emit_thought('web_search', f"Searching the web for '{web_search_query_text[:50]}'")
perform_web_search(
settings=settings,
conversation_id=conversation_id,
@@ -10171,7 +10368,7 @@ def publish_live_plugin_thought(thought_payload):
document_scope=document_scope,
active_group_id=active_group_id,
active_public_workspace_id=active_public_workspace_id,
- search_query=search_query,
+ web_search_query_text=web_search_query_text,
system_messages_for_augmentation=system_messages_for_augmentation,
agent_citations_list=agent_citations_list,
web_search_citations_list=web_search_citations_list,
@@ -11102,7 +11299,13 @@ def mask_message_api(message_id):
# Get action: "mask_all", "mask_selection", or "unmask_all"
action = data.get('action')
selection = data.get('selection', {})
- user_display_name = data.get('display_name', 'Unknown User')
+ current_user = get_current_user_info() or {}
+ user_display_name = (
+ current_user.get('displayName')
+ or current_user.get('email')
+ or current_user.get('userPrincipalName')
+ or 'Unknown User'
+ )
# Validate action
if action not in ['mask_all', 'mask_selection', 'unmask_all']:
@@ -11688,6 +11891,43 @@ def build_prior_grounded_document_search_parameters(grounded_refs):
}
+def revalidate_prior_grounded_document_search_parameters(user_id, search_parameters):
+ """Filter fallback search parameters to scopes the caller can still access."""
+ normalized_parameters = dict(search_parameters or {})
+ scope_types = set(normalized_parameters.get('scope_types') or [])
+ scope_context = _get_authorized_chat_scope_context(
+ user_id,
+ active_group_ids=normalized_parameters.get('active_group_ids') or [],
+ active_public_workspace_ids=normalized_parameters.get('active_public_workspace_ids') or [],
+ )
+ allowed_group_ids = scope_context['active_group_ids']
+ allowed_public_workspace_ids = scope_context['active_public_workspace_ids']
+
+ allowed_scope_types = []
+ if 'personal' in scope_types:
+ allowed_scope_types.append('personal')
+ if allowed_group_ids:
+ allowed_scope_types.append('group')
+ if allowed_public_workspace_ids:
+ allowed_scope_types.append('public')
+
+ normalized_parameters['active_group_ids'] = allowed_group_ids
+ normalized_parameters['active_group_id'] = scope_context['active_group_id']
+ normalized_parameters['active_public_workspace_ids'] = allowed_public_workspace_ids
+ normalized_parameters['active_public_workspace_id'] = scope_context['active_public_workspace_id']
+ normalized_parameters['scope_types'] = allowed_scope_types
+
+ if not allowed_scope_types:
+ normalized_parameters['document_ids'] = []
+ normalized_parameters['doc_scope'] = None
+ return normalized_parameters
+
+ normalized_parameters['doc_scope'] = (
+ allowed_scope_types[0] if len(allowed_scope_types) == 1 else 'all'
+ )
+ return normalized_parameters
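The revalidation flow above can be exercised in isolation. Below is a minimal sketch of the same narrowing logic, with a stubbed scope resolver (`stub_scope_context`, hypothetical) standing in for `_get_authorized_chat_scope_context`; the helper names here are condensed for illustration, not the real API:

```python
def stub_scope_context(user_id, active_group_ids, active_public_workspace_ids):
    # Pretend the caller is still a member of group "g1" only.
    allowed_groups = [g for g in active_group_ids if g == "g1"]
    return {
        "active_group_ids": allowed_groups,
        "active_group_id": allowed_groups[0] if allowed_groups else None,
        "active_public_workspace_ids": [],
        "active_public_workspace_id": None,
    }

def revalidate(params, user_id="u1"):
    # Same shape as revalidate_prior_grounded_document_search_parameters above.
    out = dict(params or {})
    scope_types = set(out.get("scope_types") or [])
    ctx = stub_scope_context(
        user_id,
        out.get("active_group_ids") or [],
        out.get("active_public_workspace_ids") or [],
    )
    allowed = []
    if "personal" in scope_types:
        allowed.append("personal")
    if ctx["active_group_ids"]:
        allowed.append("group")
    if ctx["active_public_workspace_ids"]:
        allowed.append("public")
    out["scope_types"] = allowed
    if not allowed:
        # No surviving scope: drop the stale document ids entirely.
        out["document_ids"] = []
        out["doc_scope"] = None
    else:
        out["doc_scope"] = allowed[0] if len(allowed) == 1 else "all"
    return out

# A group the caller has since lost ("g2") drops out of scope entirely,
# taking its grounded document ids with it.
stale = {"scope_types": ["group"], "active_group_ids": ["g2"], "document_ids": ["d1"]}
print(revalidate(stale)["doc_scope"])  # None
```

The key property is that history-derived document ids never survive a membership check they can no longer pass, which is why the callers above also gate on `doc_scope` being truthy.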
+
+
def build_history_only_assessment_messages(history_segments, default_system_prompt=''):
"""Construct the prompt context used to decide whether history alone is sufficient."""
assessment_messages = []
@@ -12294,6 +12534,11 @@ def to_int(value: Any) -> Optional[int]:
"completion_tokens": int(completion_tokens),
}
+
+def build_web_search_query_text(user_message):
+ """Return the stripped user message, the only chat content permitted to leave the app for external web search."""
+ return str(user_message or "").strip()
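The helper above is deliberately narrow: the stripped user message is the single piece of chat content the web search path may forward externally, replacing the ad hoc `search_query` fallbacks it supersedes. A quick sketch of the contract:

```python
def build_web_search_query_text(user_message):
    """Return the stripped user message; empty means "nothing to search"."""
    return str(user_message or "").strip()

# None, empty, and whitespace-only inputs all collapse to "",
# which the perform_web_search caller treats as no query.
assert build_web_search_query_text(None) == ""
assert build_web_search_query_text("   ") == ""
assert build_web_search_query_text("  hi  ") == "hi"
```

Centralizing this in one function makes the data-minimization boundary auditable: anything else (citations, system prompts, conversation ids) stays inside the app.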
+
def perform_web_search(
*,
settings,
@@ -12305,7 +12550,7 @@ def perform_web_search(
document_scope,
active_group_id,
active_public_workspace_id,
- search_query,
+ web_search_query_text,
system_messages_for_augmentation,
agent_citations_list,
web_search_citations_list,
@@ -12320,7 +12565,10 @@ def perform_web_search(
debug_print(f"[WebSearch] document_scope: {document_scope}")
debug_print(f"[WebSearch] active_group_id: {active_group_id}")
debug_print(f"[WebSearch] active_public_workspace_id: {active_public_workspace_id}")
- debug_print(f"[WebSearch] search_query: {search_query[:100] if search_query else None}...")
+ debug_print(
+ "[WebSearch] web_search_query_text: "
+ f"{web_search_query_text[:100] if web_search_query_text else None}..."
+ )
enable_web_search = settings.get("enable_web_search")
debug_print(f"[WebSearch] enable_web_search setting: {enable_web_search}")
@@ -12328,15 +12576,13 @@ def perform_web_search(
if not enable_web_search:
debug_print("[WebSearch] Web search is DISABLED in settings, returning early")
return True # Not an error, just disabled
-
- debug_print("[WebSearch] Web search is ENABLED, proceeding...")
web_search_agent = settings.get("web_search_agent") or {}
debug_print(f"[WebSearch] web_search_agent config present: {bool(web_search_agent)}")
if web_search_agent:
# Avoid logging sensitive data, just log structure
debug_print(f"[WebSearch] web_search_agent keys: {list(web_search_agent.keys())}")
-
+
other_settings = web_search_agent.get("other_settings") or {}
debug_print(f"[WebSearch] other_settings keys: {list(other_settings.keys()) if other_settings else ''}")
@@ -12369,16 +12615,8 @@ def perform_web_search(
return False # Configuration error
debug_print(f"[WebSearch] Agent ID is configured: {agent_id}")
-
- query_text = None
- try:
- query_text = search_query
- debug_print(f"[WebSearch] Using search_query as query_text: {query_text[:100] if query_text else None}...")
- except NameError:
- query_text = None
- debug_print("[WebSearch] search_query not defined, query_text is None")
- query_text = (query_text or user_message or "").strip()
+ query_text = (web_search_query_text or user_message or "").strip()
debug_print(f"[WebSearch] Final query_text after fallback: '{query_text[:100] if query_text else ''}'")
if not query_text:
@@ -12400,17 +12638,8 @@ def perform_web_search(
debug_print(f"[WebSearch] Message history created with {len(message_history)} message(s)")
try:
- foundry_metadata = {
- "conversation_id": conversation_id,
- "user_id": user_id,
- "message_id": user_message_id,
- "chat_type": chat_type,
- "document_scope": document_scope,
- "group_id": active_group_id if chat_type == "group" else None,
- "public_workspace_id": active_public_workspace_id,
- "search_query": query_text,
- }
- debug_print(f"[WebSearch] Foundry metadata prepared: {json.dumps(foundry_metadata, default=str)}")
+ foundry_metadata = {}
+ debug_print("[WebSearch] Foundry metadata prepared: {}")
debug_print("[WebSearch] Calling execute_foundry_agent...")
debug_print(f"[WebSearch] foundry_settings keys: {list(foundry_settings.keys())}")
diff --git a/application/single_app/route_backend_control_center.py b/application/single_app/route_backend_control_center.py
index a7e5e8a0d..a6df92897 100644
--- a/application/single_app/route_backend_control_center.py
+++ b/application/single_app/route_backend_control_center.py
@@ -1,5 +1,13 @@
# route_backend_control_center.py
+import csv
+import logging
+import math
+import time
+from io import StringIO
+
+from flask import make_response
+
from config import *
from functions_authentication import *
from functions_settings import *
@@ -15,6 +23,10 @@
from functions_debug import debug_print
+ACTIVITY_LOGS_DEFAULT_PER_PAGE = 50
+ACTIVITY_LOGS_MAX_PER_PAGE = 200
+
+
def normalize_token_filter_value(value):
"""Normalize optional token filter values from query params or request JSON."""
if value is None:
@@ -125,6 +137,284 @@ def get_distinct_token_filter_values(query):
debug_print(f"[Token Filters] Error loading distinct values: {ex}")
return []
+
+def validate_activity_logs_pagination(request_args):
+ """Validate pagination parameters for the interactive activity logs API."""
+ page_raw = request_args.get('page', 1)
+ per_page_raw = request_args.get('per_page', ACTIVITY_LOGS_DEFAULT_PER_PAGE)
+
+ try:
+ page = int(page_raw)
+ per_page = int(per_page_raw)
+ except (TypeError, ValueError) as ex:
+ raise ValueError('page and per_page must be integers.') from ex
+
+ if page < 1:
+ raise ValueError('page must be greater than or equal to 1.')
+
+ if per_page < 1:
+ raise ValueError('per_page must be greater than or equal to 1.')
+
+ if per_page > ACTIVITY_LOGS_MAX_PER_PAGE:
+ raise ValueError(
+ f'per_page must be less than or equal to {ACTIVITY_LOGS_MAX_PER_PAGE} for activity log browsing.'
+ )
+
+ return page, per_page
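The pagination validator above can be sketched standalone; this version takes a plain dict in place of Flask's `request.args` (otherwise the same logic):

```python
ACTIVITY_LOGS_DEFAULT_PER_PAGE = 50
ACTIVITY_LOGS_MAX_PER_PAGE = 200

def validate_pagination(args):
    # Mirrors validate_activity_logs_pagination above, minus the Flask plumbing.
    try:
        page = int(args.get("page", 1))
        per_page = int(args.get("per_page", ACTIVITY_LOGS_DEFAULT_PER_PAGE))
    except (TypeError, ValueError) as ex:
        raise ValueError("page and per_page must be integers.") from ex
    if page < 1 or per_page < 1:
        raise ValueError("page and per_page must be >= 1.")
    if per_page > ACTIVITY_LOGS_MAX_PER_PAGE:
        raise ValueError(f"per_page must be <= {ACTIVITY_LOGS_MAX_PER_PAGE}.")
    return page, per_page

print(validate_pagination({}))                               # (1, 50)
print(validate_pagination({"page": "3", "per_page": "25"}))  # (3, 25)
```

Raising `ValueError` rather than silently clamping lets the route return an explicit 400 and caps the per-request Cosmos fan-out at a known ceiling.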
+
+
+def build_activity_logs_query_context(activity_type_filter='all', search_term=''):
+ """Build the shared Cosmos WHERE clause and parameters for activity log queries."""
+ query_conditions = []
+ parameters = []
+
+ if activity_type_filter and activity_type_filter != 'all':
+ query_conditions.append("c.activity_type = @activity_type")
+ parameters.append({"name": "@activity_type", "value": activity_type_filter})
+
+ normalized_search_term = (search_term or '').strip().lower()
+ if normalized_search_term:
+ query_conditions.append(
+ "(" + " OR ".join([
+ "(IS_DEFINED(c.activity_type) AND CONTAINS(LOWER(c.activity_type), @activity_search_term))",
+ "(IS_DEFINED(c.user_id) AND CONTAINS(LOWER(c.user_id), @activity_search_term))",
+ "(IS_DEFINED(c.admin_email) AND CONTAINS(LOWER(c.admin_email), @activity_search_term))",
+ "(IS_DEFINED(c.requester_email) AND CONTAINS(LOWER(c.requester_email), @activity_search_term))",
+ "(IS_DEFINED(c.added_by_email) AND CONTAINS(LOWER(c.added_by_email), @activity_search_term))",
+ "(IS_DEFINED(c.approver_email) AND CONTAINS(LOWER(c.approver_email), @activity_search_term))",
+ "(IS_DEFINED(c.member_email) AND CONTAINS(LOWER(c.member_email), @activity_search_term))",
+ "(IS_DEFINED(c.member_name) AND CONTAINS(LOWER(c.member_name), @activity_search_term))",
+ "(IS_DEFINED(c.group_name) AND CONTAINS(LOWER(c.group_name), @activity_search_term))",
+ "(IS_DEFINED(c.workspace_name) AND CONTAINS(LOWER(c.workspace_name), @activity_search_term))",
+ "(IS_DEFINED(c.public_workspace_name) AND CONTAINS(LOWER(c.public_workspace_name), @activity_search_term))",
+ "(IS_DEFINED(c.login_method) AND CONTAINS(LOWER(c.login_method), @activity_search_term))",
+ "(IS_DEFINED(c.token_type) AND CONTAINS(LOWER(c.token_type), @activity_search_term))",
+ "(IS_DEFINED(c.workspace_type) AND CONTAINS(LOWER(c.workspace_type), @activity_search_term))",
+ "(IS_DEFINED(c.description) AND CONTAINS(LOWER(c.description), @activity_search_term))",
+ "(IS_DEFINED(c.conversation.title) AND CONTAINS(LOWER(c.conversation.title), @activity_search_term))",
+ "(IS_DEFINED(c.document.file_name) AND CONTAINS(LOWER(c.document.file_name), @activity_search_term))",
+ "(IS_DEFINED(c.usage.model) AND CONTAINS(LOWER(c.usage.model), @activity_search_term))",
+ "(IS_DEFINED(c.workspace_context.group_id) AND CONTAINS(LOWER(c.workspace_context.group_id), @activity_search_term))",
+ "(IS_DEFINED(c.workspace_context.public_workspace_id) AND CONTAINS(LOWER(c.workspace_context.public_workspace_id), @activity_search_term))"
+ ]) + ")"
+ )
+ parameters.append({"name": "@activity_search_term", "value": normalized_search_term})
+
+ where_clause = " WHERE " + " AND ".join(query_conditions) if query_conditions else ""
+ return where_clause, parameters
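Note the injection-safety pattern in the builder above: the WHERE clause is assembled only from fixed SQL fragments, and caller-supplied text travels exclusively in the parameter list. A reduced sketch with a single searchable field:

```python
def build_query_context(activity_type_filter="all", search_term=""):
    # Reduced version of build_activity_logs_query_context above: conditions
    # are constant strings; user input only ever lands in `parameters`.
    conditions, parameters = [], []
    if activity_type_filter and activity_type_filter != "all":
        conditions.append("c.activity_type = @activity_type")
        parameters.append({"name": "@activity_type", "value": activity_type_filter})
    term = (search_term or "").strip().lower()
    if term:
        conditions.append("CONTAINS(LOWER(c.description), @activity_search_term)")
        parameters.append({"name": "@activity_search_term", "value": term})
    where_clause = " WHERE " + " AND ".join(conditions) if conditions else ""
    return where_clause, parameters

where, params = build_query_context("token_usage", "GPT-4")
print(where)
print(params)  # search term is lowercased and passed as a bound parameter
```

Even a hostile `search_term` such as `"' OR 1=1"` ends up as an inert parameter value, never as query text.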
+
+
+def normalize_activity_log_value(value):
+ """Recursively coerce Cosmos activity log values into JSON-safe data."""
+ if value is None or isinstance(value, (str, bool, int)):
+ return value
+
+ if isinstance(value, float):
+ if math.isnan(value) or math.isinf(value):
+ return None
+ return value
+
+ if isinstance(value, datetime):
+ return value.isoformat()
+
+ if isinstance(value, bytes):
+ return value.decode('utf-8', errors='replace')
+
+ if isinstance(value, dict):
+ return {
+ str(key): normalize_activity_log_value(nested_value)
+ for key, nested_value in value.items()
+ }
+
+ if isinstance(value, (list, tuple, set)):
+ return [normalize_activity_log_value(item) for item in value]
+
+ return str(value)
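The recursive normalizer above exists so that anything read back from Cosmos serializes cleanly as JSON for the browser. A self-contained sketch of the same shape, showing the edge cases it absorbs:

```python
import json
import math
from datetime import datetime

def normalize(value):
    # Same structure as normalize_activity_log_value above.
    if value is None or isinstance(value, (str, bool, int)):
        return value
    if isinstance(value, float):
        # NaN/Inf are not valid JSON, so they become None.
        return None if math.isnan(value) or math.isinf(value) else value
    if isinstance(value, datetime):
        return value.isoformat()
    if isinstance(value, bytes):
        return value.decode("utf-8", errors="replace")
    if isinstance(value, dict):
        return {str(k): normalize(v) for k, v in value.items()}
    if isinstance(value, (list, tuple, set)):
        return [normalize(item) for item in value]
    return str(value)

record = {
    "ts": datetime(2024, 1, 2, 3, 4, 5),  # -> ISO 8601 string
    "ratio": float("nan"),                # -> None
    "tags": {"a"},                        # sets -> lists
}
print(json.dumps(normalize(record), sort_keys=True))
```

After normalization, `json.dumps` can never raise on an activity log document, so a single malformed record cannot break a whole page of results.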
+
+
+def normalize_activity_log_record(log_record):
+ """Return a browser-safe activity log document with stable core fields."""
+ normalized_record = normalize_activity_log_value(log_record)
+ if not isinstance(normalized_record, dict):
+ normalized_record = {'raw_value': normalized_record}
+
+ for field_name in ('user_id', 'admin_user_id', 'added_by_user_id'):
+ if field_name in normalized_record:
+ normalized_record[field_name] = coerce_activity_log_user_id(normalized_record.get(field_name))
+
+ admin_payload = normalized_record.get('admin')
+ if isinstance(admin_payload, dict):
+ admin_payload['user_id'] = coerce_activity_log_user_id(admin_payload.get('user_id'))
+ if not normalized_record.get('user_id') and admin_payload.get('user_id'):
+ normalized_record['user_id'] = admin_payload.get('user_id')
+
+ normalized_record.setdefault('id', '')
+ normalized_record.setdefault('activity_type', 'unknown')
+ normalized_record.setdefault('timestamp', '')
+ normalized_record.setdefault('workspace_type', '')
+ return normalized_record
+
+
+def get_activity_log_user_details(user_id, user_cache):
+ """Resolve a user display payload once and reuse it across pagination or export."""
+ user_id = coerce_activity_log_user_id(user_id)
+ if not user_id:
+ return {'email': '', 'display_name': ''}
+
+ if user_id in user_cache:
+ return user_cache[user_id]
+
+ try:
+ user_doc = cosmos_user_settings_container.read_item(item=user_id, partition_key=user_id)
+ user_cache[user_id] = {
+ 'email': user_doc.get('email', ''),
+ 'display_name': user_doc.get('display_name', '')
+ }
+ except Exception as ex:
+ user_cache[user_id] = {'email': '', 'display_name': ''}
+ log_event(
+ '[ControlCenter][ActivityLogs] Failed to resolve activity log user details.',
+ extra={
+ 'activity_log_user_id': user_id,
+ 'error_type': type(ex).__name__
+ },
+ debug_only=True,
+ category='CONTROL_CENTER'
+ )
+
+ return user_cache[user_id]
+
+
+def build_activity_log_user_map(logs):
+ """Build a user map keyed by user_id for the current activity log payload."""
+ user_cache = {}
+ for log_record in logs:
+ user_id = (
+ log_record.get('user_id')
+ or (log_record.get('admin') or {}).get('user_id')
+ or log_record.get('admin_user_id')
+ or log_record.get('added_by_user_id')
+ )
+ user_id = coerce_activity_log_user_id(user_id)
+ if user_id:
+ get_activity_log_user_details(user_id, user_cache)
+ return user_cache
+
+
+def format_activity_log_details_for_csv(log_record):
+ """Format activity log details as a plain string suitable for CSV export."""
+ activity_type = log_record.get('activity_type', '')
+
+ if activity_type == 'user_login':
+ return f"Login method: {log_record.get('login_method') or log_record.get('details', {}).get('login_method', 'N/A')}"
+
+ if activity_type == 'conversation_creation':
+ conversation = log_record.get('conversation', {})
+ return f"Title: {conversation.get('title', 'Untitled')}, ID: {conversation.get('conversation_id', 'N/A')}"
+
+ if activity_type == 'conversation_deletion':
+ conversation = log_record.get('conversation', {})
+ return f"Deleted: {conversation.get('title', 'Untitled')}, ID: {conversation.get('conversation_id', 'N/A')}"
+
+ if activity_type == 'conversation_archival':
+ conversation = log_record.get('conversation', {})
+ return f"Archived: {conversation.get('title', 'Untitled')}, ID: {conversation.get('conversation_id', 'N/A')}"
+
+ if activity_type == 'document_creation':
+ document = log_record.get('document', {})
+ return f"File: {document.get('file_name', 'Unknown')}, Type: {document.get('file_type', '')}"
+
+ if activity_type == 'document_deletion':
+ document = log_record.get('document', {})
+ return f"Deleted: {document.get('file_name', 'Unknown')}, Type: {document.get('file_type', '')}"
+
+ if activity_type == 'document_metadata_update':
+ updated_fields = ', '.join((log_record.get('updated_fields') or {}).keys()) or 'N/A'
+ document = log_record.get('document', {})
+ return f"File: {document.get('file_name', 'Unknown')}, Updated: {updated_fields}"
+
+ if activity_type == 'token_usage':
+ usage = log_record.get('usage', {})
+ scope_details = []
+ workspace_type = log_record.get('workspace_type')
+ if workspace_type:
+ scope_details.append(f"Workspace: {workspace_type}")
+ workspace_context = log_record.get('workspace_context', {})
+ if workspace_context.get('group_id'):
+ scope_details.append(f"Group: {workspace_context.get('group_id')}")
+ if workspace_context.get('public_workspace_id'):
+ scope_details.append(f"Public Workspace: {workspace_context.get('public_workspace_id')}")
+ scope_suffix = f"; {' | '.join(scope_details)}" if scope_details else ''
+ return (
+ f"Type: {log_record.get('token_type', 'unknown')}, "
+ f"Tokens: {usage.get('total_tokens', 0)}, "
+ f"Model: {usage.get('model', 'N/A')}{scope_suffix}"
+ )
+
+ if activity_type in {'group_status_change', 'public_workspace_status_change'}:
+ status_change = log_record.get('status_change', {})
+ entity_name = (
+ log_record.get('group', {}).get('group_name')
+ or log_record.get('public_workspace', {}).get('workspace_name')
+ or log_record.get('workspace_context', {}).get('public_workspace_name')
+ or log_record.get('group_name')
+ or log_record.get('workspace_name')
+ or log_record.get('public_workspace_name')
+ or 'Unknown'
+ )
+ return (
+ f"Name: {entity_name}, "
+ f"Status: {status_change.get('old_status', 'N/A')} -> {status_change.get('new_status', 'N/A')}"
+ )
+
+ if activity_type in {
+ 'group_member_deleted',
+ 'add_member_directly',
+ 'admin_add_member_csv',
+ 'add_workspace_member_directly',
+ 'admin_add_workspace_member_csv'
+ }:
+ member_name = (
+ log_record.get('member_name')
+ or log_record.get('removed_member', {}).get('name')
+ or log_record.get('removed_member', {}).get('email')
+ or log_record.get('member_email')
+ or 'Unknown'
+ )
+ target_name = (
+ log_record.get('group_name')
+ or log_record.get('group', {}).get('group_name')
+ or log_record.get('workspace_name')
+ or log_record.get('public_workspace_name')
+ or 'Unknown'
+ )
+ role = log_record.get('member_role', '')
+ role_suffix = f" ({role})" if role else ''
+ return f"Member: {member_name}, Target: {target_name}{role_suffix}"
+
+ if activity_type in {
+ 'admin_take_ownership_approved',
+ 'transfer_ownership_approved',
+ 'delete_group_approved',
+ 'delete_all_documents_approved',
+ 'admin_take_workspace_ownership_approved',
+ 'transfer_workspace_ownership_approved',
+ 'delete_workspace_documents_approved',
+ 'delete_workspace_approved'
+ }:
+ return log_record.get('description') or 'Administrative approval activity'
+
+ return log_record.get('description') or 'N/A'
+
+
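The `token_usage` branch above flattens a usage record into a single CSV cell. A minimal standalone sketch of just that branch, with the record shape assumed from the code above:

```python
# Sketch of the token_usage formatting branch (record shape assumed).
def format_token_usage(log_record):
    usage = log_record.get('usage', {})
    scope_details = []
    if log_record.get('workspace_type'):
        scope_details.append(f"Workspace: {log_record['workspace_type']}")
    ctx = log_record.get('workspace_context', {})
    if ctx.get('group_id'):
        scope_details.append(f"Group: {ctx['group_id']}")
    if ctx.get('public_workspace_id'):
        scope_details.append(f"Public Workspace: {ctx['public_workspace_id']}")
    scope_suffix = f"; {' | '.join(scope_details)}" if scope_details else ''
    return (
        f"Type: {log_record.get('token_type', 'unknown')}, "
        f"Tokens: {usage.get('total_tokens', 0)}, "
        f"Model: {usage.get('model', 'N/A')}{scope_suffix}"
    )

record = {
    'token_type': 'chat',
    'usage': {'total_tokens': 128, 'model': 'gpt-4o'},
    'workspace_type': 'group',
    'workspace_context': {'group_id': 'g-123'},
}
print(format_token_usage(record))
```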
+def create_activity_log_csv_response(csv_content):
+ """Create a CSV download response for activity log exports."""
+ timestamp = datetime.utcnow().strftime('%Y%m%d_%H%M%S')
+ response = make_response(csv_content)
+ response.headers['Content-Type'] = 'text/csv; charset=utf-8'
+ response.headers['Content-Disposition'] = f'attachment; filename="activity_logs_{timestamp}.csv"'
+ return response
+
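The response helper above stamps the filename with `datetime.utcnow()`, which is deprecated since Python 3.12. An equivalent timezone-aware sketch of the same `Content-Disposition` header:

```python
from datetime import datetime, timezone

# Timestamped attachment filename, mirroring create_activity_log_csv_response
# but using the timezone-aware replacement for the deprecated datetime.utcnow().
timestamp = datetime.now(timezone.utc).strftime('%Y%m%d_%H%M%S')
filename = f'activity_logs_{timestamp}.csv'
header = f'attachment; filename="{filename}"'
print(header)
```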
def enhance_user_with_activity(user, force_refresh=False):
"""
Enhance user data with activity information and computed fields.
@@ -539,7 +829,7 @@ def enhance_user_with_activity(user, force_refresh=False):
# Update user settings with cached metrics
settings_update = {'metrics': metrics_cache}
- update_success = update_user_settings(user.get('id'), settings_update)
+ update_success = update_user_settings(user.get('id'), settings_update, allow_cross_user=True)
if update_success:
debug_print(f"Successfully cached metrics for user {user.get('id')}")
@@ -2315,7 +2605,7 @@ def api_update_user_access(user_id):
}
}
- success = update_user_settings(user_id, access_settings)
+ success = update_user_settings(user_id, access_settings, allow_cross_user=True)
if success:
# Log admin action
@@ -2371,7 +2661,7 @@ def api_update_user_file_uploads(user_id):
}
}
- success = update_user_settings(user_id, file_upload_settings)
+ success = update_user_settings(user_id, file_upload_settings, allow_cross_user=True)
if success:
# Log admin action
@@ -2515,7 +2805,7 @@ def api_bulk_user_action():
for user_id in user_ids:
try:
- success = update_user_settings(user_id, update_settings)
+ success = update_user_settings(user_id, update_settings, allow_cross_user=True)
if success:
success_count += 1
else:
@@ -5751,103 +6041,80 @@ def api_get_activity_logs():
Supports search and filtering by activity type.
"""
try:
- # Get query parameters
- page = int(request.args.get('page', 1))
- per_page = int(request.args.get('per_page', 50))
+ page, per_page = validate_activity_logs_pagination(request.args)
search_term = request.args.get('search', '').strip().lower()
activity_type_filter = request.args.get('activity_type_filter', 'all').strip()
-
- # Build query conditions
- query_conditions = []
- parameters = []
-
- # Filter by activity type if not 'all'
- if activity_type_filter and activity_type_filter != 'all':
- query_conditions.append("c.activity_type = @activity_type")
- parameters.append({"name": "@activity_type", "value": activity_type_filter})
-
- # Build WHERE clause (empty if no conditions)
- where_clause = " WHERE " + " AND ".join(query_conditions) if query_conditions else ""
-
- # Get total count for pagination
+
+ request_started = time.perf_counter()
+ where_clause, parameters = build_activity_logs_query_context(activity_type_filter, search_term)
+
+ log_event(
+ '[ControlCenter][ActivityLogs] Loading activity logs page.',
+ extra={
+ 'page': page,
+ 'per_page': per_page,
+ 'has_search': bool(search_term),
+ 'search_length': len(search_term),
+ 'activity_type_filter': activity_type_filter or 'all'
+ },
+ debug_only=True,
+ category='CONTROL_CENTER'
+ )
+
+ count_started = time.perf_counter()
count_query = f"SELECT VALUE COUNT(1) FROM c{where_clause}"
total_items_result = list(cosmos_activity_logs_container.query_items(
query=count_query,
parameters=parameters,
enable_cross_partition_query=True
))
+ count_duration_ms = int((time.perf_counter() - count_started) * 1000)
total_items = total_items_result[0] if total_items_result and isinstance(total_items_result[0], int) else 0
-
- # Calculate pagination
- offset = (page - 1) * per_page
+
total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1
-
- # Get paginated results
+ offset = (page - 1) * per_page
logs_query = f"""
SELECT * FROM c{where_clause}
ORDER BY c.timestamp DESC
OFFSET {offset} LIMIT {per_page}
"""
-
- debug_print(f"Activity logs query: {logs_query}")
- debug_print(f"Query parameters: {parameters}")
-
- logs = list(cosmos_activity_logs_container.query_items(
- query=logs_query,
- parameters=parameters,
- enable_cross_partition_query=True
- ))
-
- # Apply search filter in Python (after fetching from Cosmos)
- if search_term:
- filtered_logs = []
- for log in logs:
- # Search in various fields
- usage = log.get('usage', {})
- workspace_context = log.get('workspace_context', {})
- searchable_text = ' '.join([
- str(log.get('activity_type', '')),
- str(log.get('user_id', '')),
- str(log.get('login_method', '')),
- str(log.get('conversation', {}).get('title', '')),
- str(log.get('document', {}).get('file_name', '')),
- str(log.get('token_type', '')),
- str(log.get('workspace_type', '')),
- str(usage.get('model', '')),
- str(workspace_context.get('group_id', '')),
- str(workspace_context.get('public_workspace_id', ''))
- ]).lower()
-
- if search_term in searchable_text:
- filtered_logs.append(log)
-
- logs = filtered_logs
- # Recalculate total_items for filtered results
- total_items = len(logs)
- total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1
-
- # Get unique user IDs from logs
- user_ids = set(log.get('user_id') for log in logs if log.get('user_id'))
-
- # Fetch user information for display names/emails
- user_map = {}
- if user_ids:
- for user_id in user_ids:
- try:
- user_doc = cosmos_user_settings_container.read_item(
- item=user_id,
- partition_key=user_id
- )
- user_map[user_id] = {
- 'email': user_doc.get('email', ''),
- 'display_name': user_doc.get('display_name', '')
- }
- except Exception as ex:
- user_map[user_id] = {
- 'email': '',
- 'display_name': ''
- }
-
+
+ query_started = time.perf_counter()
+ logs = [
+ normalize_activity_log_record(log_record)
+ for log_record in cosmos_activity_logs_container.query_items(
+ query=logs_query,
+ parameters=parameters,
+ enable_cross_partition_query=True
+ )
+ ]
+ query_duration_ms = int((time.perf_counter() - query_started) * 1000)
+
+ user_lookup_started = time.perf_counter()
+ user_map = build_activity_log_user_map(logs)
+ user_lookup_duration_ms = int((time.perf_counter() - user_lookup_started) * 1000)
+ total_duration_ms = int((time.perf_counter() - request_started) * 1000)
+
+ log_event(
+ '[ControlCenter][ActivityLogs] Activity logs page loaded.',
+ extra={
+ 'page': page,
+ 'per_page': per_page,
+ 'returned_items': len(logs),
+ 'total_items': total_items,
+ 'total_pages': total_pages,
+ 'unique_user_count': len(user_map),
+ 'count_duration_ms': count_duration_ms,
+ 'query_duration_ms': query_duration_ms,
+ 'user_lookup_duration_ms': user_lookup_duration_ms,
+ 'total_duration_ms': total_duration_ms,
+ 'has_search': bool(search_term),
+ 'activity_type_filter': activity_type_filter or 'all'
+ },
+ debug_only=True,
+ category='CONTROL_CENTER'
+ )
+
return jsonify({
'logs': logs,
'user_map': user_map,
@@ -5860,13 +6127,127 @@ def api_get_activity_logs():
'has_next': page < total_pages
}
}), 200
-
- except Exception as e:
- debug_print(f"Error getting activity logs: {e}")
- import traceback
- traceback.print_exc()
+
+ except ValueError as ex:
+ log_event(
+ '[ControlCenter][ActivityLogs] Invalid activity logs request.',
+ extra={
+ 'page': request.args.get('page'),
+ 'per_page': request.args.get('per_page'),
+ 'error_message': str(ex)
+ },
+ level=logging.WARNING
+ )
+ return jsonify({'error': str(ex)}), 400
+
+ except Exception as ex:
+ log_event(
+ '[ControlCenter][ActivityLogs] Failed to fetch activity logs.',
+ extra={
+ 'page': request.args.get('page'),
+ 'per_page': request.args.get('per_page'),
+ 'search': request.args.get('search', ''),
+ 'activity_type_filter': request.args.get('activity_type_filter', 'all'),
+ 'error_type': type(ex).__name__,
+ 'error_message': str(ex)
+ },
+ level=logging.ERROR,
+ exceptionTraceback=True
+ )
return jsonify({'error': 'Failed to fetch activity logs'}), 500
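The OFFSET/LIMIT paging arithmetic used above is easy to check in isolation:

```python
# Paging arithmetic from api_get_activity_logs: ceiling division for the
# page count, zero-based row offset for the Cosmos OFFSET clause.
def page_window(page, per_page, total_items):
    total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1
    offset = (page - 1) * per_page
    return offset, total_pages

assert page_window(1, 50, 0) == (0, 1)      # empty result set still has one page
assert page_window(3, 50, 120) == (100, 3)  # 120 rows at 50/page -> 3 pages
assert page_window(2, 50, 101) == (50, 3)   # partial last page rounds up
```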
+
+ @app.route('/api/admin/control-center/activity-logs/export', methods=['GET'])
+ @swagger_route(security=get_auth_security())
+ @login_required
+ @control_center_required('admin')
+ def api_export_activity_logs():
+ """Export all matching activity logs as CSV without relying on the paged JSON endpoint."""
+ try:
+ search_term = request.args.get('search', '').strip().lower()
+ activity_type_filter = request.args.get('activity_type_filter', 'all').strip()
+ export_started = time.perf_counter()
+ where_clause, parameters = build_activity_logs_query_context(activity_type_filter, search_term)
+
+ log_event(
+ '[ControlCenter][ActivityLogs] Starting activity log export.',
+ extra={
+ 'has_search': bool(search_term),
+ 'search_length': len(search_term),
+ 'activity_type_filter': activity_type_filter or 'all'
+ },
+ debug_only=True,
+ category='CONTROL_CENTER'
+ )
+
+ logs_query = f"""
+ SELECT * FROM c{where_clause}
+ ORDER BY c.timestamp DESC
+ """
+
+ output = StringIO()
+ writer = csv.writer(output)
+ writer.writerow(['Timestamp', 'Activity Type', 'User ID', 'User Email', 'User Name', 'Details', 'Workspace Type'])
+
+ exported_count = 0
+ user_cache = {}
+ for log_record in cosmos_activity_logs_container.query_items(
+ query=logs_query,
+ parameters=parameters,
+ enable_cross_partition_query=True
+ ):
+ normalized_log = normalize_activity_log_record(log_record)
+ resolved_user_id = (
+ normalized_log.get('user_id')
+ or normalized_log.get('admin_user_id')
+ or normalized_log.get('added_by_user_id')
+ or ''
+ )
+ user_details = get_activity_log_user_details(resolved_user_id, user_cache)
+ writer.writerow([
+ normalized_log.get('timestamp', ''),
+ normalized_log.get('activity_type', ''),
+ resolved_user_id,
+ user_details.get('email')
+ or normalized_log.get('admin_email')
+ or normalized_log.get('requester_email')
+ or normalized_log.get('added_by_email')
+ or normalized_log.get('member_email', ''),
+ user_details.get('display_name') or normalized_log.get('member_name', ''),
+ format_activity_log_details_for_csv(normalized_log),
+ normalized_log.get('workspace_type', '')
+ ])
+ exported_count += 1
+
+ export_duration_ms = int((time.perf_counter() - export_started) * 1000)
+ log_event(
+ '[ControlCenter][ActivityLogs] Activity log export completed.',
+ extra={
+ 'exported_count': exported_count,
+ 'unique_user_count': len(user_cache),
+ 'duration_ms': export_duration_ms,
+ 'has_search': bool(search_term),
+ 'activity_type_filter': activity_type_filter or 'all'
+ },
+ debug_only=True,
+ category='CONTROL_CENTER'
+ )
+
+ return create_activity_log_csv_response(output.getvalue())
+
+ except Exception as ex:
+ log_event(
+ '[ControlCenter][ActivityLogs] Failed to export activity logs.',
+ extra={
+ 'search': request.args.get('search', ''),
+ 'activity_type_filter': request.args.get('activity_type_filter', 'all'),
+ 'error_type': type(ex).__name__,
+ 'error_message': str(ex)
+ },
+ level=logging.ERROR,
+ exceptionTraceback=True
+ )
+ return jsonify({'error': 'Failed to export activity logs'}), 500
+
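A stripped-down sketch of the `StringIO` + `csv.writer` pattern the export route builds on, showing the row-at-a-time accumulation before the buffer is handed to the response helper:

```python
import csv
from io import StringIO

# In-memory CSV accumulation, as in api_export_activity_logs:
# write a header row, then one row per streamed log record.
output = StringIO()
writer = csv.writer(output)
writer.writerow(['Timestamp', 'Activity Type', 'User ID'])
writer.writerow(['2024-01-01T00:00:00Z', 'user_login', 'u-1'])
content = output.getvalue()
print(content)
```

Because the route iterates `query_items(...)` directly instead of materializing a list, only the CSV buffer itself grows with the result set.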
# ============================================================================
# APPROVAL WORKFLOW ENDPOINTS
# ============================================================================
@@ -5940,6 +6321,22 @@ def api_admin_get_approvals():
debug_print(traceback.format_exc())
return jsonify({'error': 'Failed to fetch approvals', 'details': str(e)}), 500
+
+ def _get_authorized_route_approval(approval_id, group_id, require_approval_rights=False):
+ """Resolve the current user and return an authorized approval plus user context."""
+ user = session.get('user', {})
+ user_id = user.get('oid') or user.get('sub')
+ user_roles = user.get('roles', [])
+ user_email = user.get('preferred_username', user.get('email', 'unknown'))
+ user_name = user.get('name', user_email)
+ approval = get_authorized_approval(
+ approval_id,
+ group_id,
+ user_id,
+ user_roles,
+ require_approval_rights=require_approval_rights,
+ )
+ return approval, user_id, user_roles, user_email, user_name
+
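The approval routes above let the helper raise and then translate `LookupError` into 404 and `PermissionError` into 403. The mapping can be sketched as follows (the function name is illustrative, not part of the app's API):

```python
# Exception-to-status translation used by the approval route handlers:
# LookupError -> 404 (not found), PermissionError -> 403 (forbidden),
# anything else falls through to the generic 500 path.
def status_for(exc):
    if isinstance(exc, LookupError):
        return 404
    if isinstance(exc, PermissionError):
        return 403
    return 500

assert status_for(LookupError('missing approval')) == 404
assert status_for(PermissionError('not eligible')) == 403
assert status_for(RuntimeError('boom')) == 500
```

Keeping the authorization decision inside `get_authorized_approval(...)` and mapping only at the route edge means every caller gets the same not-found/forbidden semantics.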
@app.route('/api/admin/control-center/approvals/<approval_id>', methods=['GET'])
@swagger_route(security=get_auth_security())
@login_required
@@ -5952,23 +6349,23 @@ def api_admin_get_approval_by_id(approval_id):
group_id (str): Group ID (partition key)
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
-
group_id = request.args.get('group_id')
if not group_id:
return jsonify({'error': 'group_id query parameter is required'}), 400
-
- # Get the approval
- approval = cosmos_approvals_container.read_item(
- item=approval_id,
- partition_key=group_id
+
+ approval, user_id, user_roles, _user_email, _user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
)
# Add can_approve field
- approval['can_approve'] = (approval.get('requester_id') != user_id)
+ approval['can_approve'] = _can_user_approve(approval, user_id, user_roles)
return jsonify(approval), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not authorized to view this approval'}), 403
except Exception as e:
debug_print(f"Error fetching approval {approval_id}: {e}")
@@ -5989,17 +6386,18 @@ def api_admin_approve_request(approval_id):
comment (str, optional): Approval comment
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
- user_email = user.get('preferred_username', user.get('email', 'unknown'))
- user_name = user.get('name', user_email)
-
data = request.get_json()
group_id = data.get('group_id')
comment = data.get('comment', '')
if not group_id:
return jsonify({'error': 'group_id is required'}), 400
+
+ approval, user_id, _user_roles, user_email, user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
+ require_approval_rights=True,
+ )
# Approve the request
approval = approve_request(
@@ -6008,7 +6406,8 @@ def api_admin_approve_request(approval_id):
approver_id=user_id,
approver_email=user_email,
approver_name=user_name,
- comment=comment
+ comment=comment,
+ approval=approval,
)
# Execute the approved action
@@ -6020,6 +6419,10 @@ def api_admin_approve_request(approval_id):
'approval': approval,
'execution_result': execution_result
}), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not eligible to approve this request'}), 403
except Exception as e:
debug_print(f"Error approving request: {e}")
@@ -6038,11 +6441,6 @@ def api_admin_deny_request(approval_id):
comment (str): Reason for denial (required)
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
- user_email = user.get('preferred_username', user.get('email', 'unknown'))
- user_name = user.get('name', user_email)
-
data = request.get_json()
group_id = data.get('group_id')
comment = data.get('comment', '')
@@ -6052,6 +6450,12 @@ def api_admin_deny_request(approval_id):
if not comment:
return jsonify({'error': 'comment is required for denial'}), 400
+
+ approval, user_id, _user_roles, user_email, user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
+ require_approval_rights=True,
+ )
# Deny the request
approval = deny_request(
@@ -6061,7 +6465,8 @@ def api_admin_deny_request(approval_id):
denier_email=user_email,
denier_name=user_name,
comment=comment,
- auto_denied=False
+ auto_denied=False,
+ approval=approval,
)
return jsonify({
@@ -6069,6 +6474,10 @@ def api_admin_deny_request(approval_id):
'message': 'Request denied',
'approval': approval
}), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not eligible to deny this request'}), 403
except Exception as e:
debug_print(f"Error denying request: {e}")
@@ -6127,8 +6536,7 @@ def api_get_approvals():
approvals_with_permission = []
for approval in result.get('approvals', []):
approval_copy = dict(approval)
- # User can approve if they didn't create the request
- approval_copy['can_approve'] = (approval.get('requester_id') != user_id)
+ approval_copy['can_approve'] = _can_user_approve(approval, user_id, user_roles)
approvals_with_permission.append(approval_copy)
return jsonify({
@@ -6157,23 +6565,23 @@ def api_get_approval_by_id(approval_id):
group_id (str): Group ID (partition key)
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
-
group_id = request.args.get('group_id')
if not group_id:
return jsonify({'error': 'group_id query parameter is required'}), 400
-
- # Get the approval
- approval = cosmos_approvals_container.read_item(
- item=approval_id,
- partition_key=group_id
+
+ approval, user_id, user_roles, _user_email, _user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
)
# Add can_approve field
- approval['can_approve'] = (approval.get('requester_id') != user_id)
+ approval['can_approve'] = _can_user_approve(approval, user_id, user_roles)
return jsonify(approval), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not authorized to view this approval'}), 403
except Exception as e:
debug_print(f"Error fetching approval {approval_id}: {e}")
@@ -6193,17 +6601,18 @@ def api_approve_request(approval_id):
comment (str, optional): Approval comment
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
- user_email = user.get('preferred_username', user.get('email', 'unknown'))
- user_name = user.get('name', user_email)
-
data = request.get_json()
group_id = data.get('group_id')
comment = data.get('comment', '')
if not group_id:
return jsonify({'error': 'group_id is required'}), 400
+
+ approval, user_id, _user_roles, user_email, user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
+ require_approval_rights=True,
+ )
# Approve the request
approval = approve_request(
@@ -6212,7 +6621,8 @@ def api_approve_request(approval_id):
approver_id=user_id,
approver_email=user_email,
approver_name=user_name,
- comment=comment
+ comment=comment,
+ approval=approval,
)
# Execute the approved action
@@ -6224,6 +6634,10 @@ def api_approve_request(approval_id):
'approval': approval,
'execution_result': execution_result
}), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not eligible to approve this request'}), 403
except Exception as e:
debug_print(f"Error approving request: {e}")
@@ -6241,11 +6655,6 @@ def api_deny_request(approval_id):
comment (str): Reason for denial (required)
"""
try:
- user = session.get('user', {})
- user_id = user.get('oid') or user.get('sub')
- user_email = user.get('preferred_username', user.get('email', 'unknown'))
- user_name = user.get('name', user_email)
-
data = request.get_json()
group_id = data.get('group_id')
comment = data.get('comment', '')
@@ -6255,6 +6664,12 @@ def api_deny_request(approval_id):
if not comment:
return jsonify({'error': 'comment is required for denial'}), 400
+
+ approval, user_id, _user_roles, user_email, user_name = _get_authorized_route_approval(
+ approval_id,
+ group_id,
+ require_approval_rights=True,
+ )
# Deny the request
approval = deny_request(
@@ -6264,7 +6679,8 @@ def api_deny_request(approval_id):
denier_email=user_email,
denier_name=user_name,
comment=comment,
- auto_denied=False
+ auto_denied=False,
+ approval=approval,
)
return jsonify({
@@ -6272,6 +6688,10 @@ def api_deny_request(approval_id):
'message': 'Request denied',
'approval': approval
}), 200
+ except LookupError:
+ return jsonify({'error': 'Approval not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'You are not eligible to deny this request'}), 403
except Exception as e:
debug_print(f"Error denying request: {e}")
diff --git a/application/single_app/route_backend_conversations.py b/application/single_app/route_backend_conversations.py
index 23f1a0714..a99e7c54b 100644
--- a/application/single_app/route_backend_conversations.py
+++ b/application/single_app/route_backend_conversations.py
@@ -72,6 +72,22 @@ def _collect_child_message_documents(conversation_id, root_message_ids):
return child_docs
+
+def _authorize_personal_conversation_read(user_id, conversation_id):
+ """Load a personal conversation and ensure the caller owns it."""
+ try:
+ conversation_item = cosmos_conversations_container.read_item(
+ item=conversation_id,
+ partition_key=conversation_id,
+ )
+ except CosmosResourceNotFoundError as exc:
+ raise LookupError(f"Conversation {conversation_id} not found") from exc
+
+ if conversation_item.get('user_id') != user_id:
+ raise PermissionError('Forbidden')
+
+ return conversation_item
+
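The ownership check above can be exercised against an in-memory stand-in for the Cosmos container (the fake store and names below are illustrative only; the real helper reads `cosmos_conversations_container`):

```python
# In-memory sketch of _authorize_personal_conversation_read's contract:
# missing conversation -> LookupError, wrong owner -> PermissionError.
FAKE_CONVERSATIONS = {
    'c1': {'id': 'c1', 'user_id': 'alice'},
}

def authorize_read(user_id, conversation_id):
    item = FAKE_CONVERSATIONS.get(conversation_id)
    if item is None:
        raise LookupError(f"Conversation {conversation_id} not found")
    if item.get('user_id') != user_id:
        raise PermissionError('Forbidden')
    return item

assert authorize_read('alice', 'c1')['id'] == 'c1'
try:
    authorize_read('bob', 'c1')  # caller is not the owner
    raise AssertionError('expected PermissionError')
except PermissionError:
    pass
```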
def register_route_backend_conversations(app):
@app.route('/api/get_messages', methods=['GET'])
@@ -86,10 +102,7 @@ def api_get_messages():
if not conversation_id:
return jsonify({'error': 'No conversation_id provided'}), 400
try:
- conversation_item = cosmos_conversations_container.read_item(
- item=conversation_id,
- partition_key=conversation_id
- )
+ _authorize_personal_conversation_read(user_id, conversation_id)
# Query all messages in cosmos_messages_container
# We'll filter for active_thread in Python since Cosmos DB boolean queries can be tricky
message_query = f"""
@@ -209,7 +222,9 @@ def api_get_messages():
message['vision_analysis'] = vision_analysis
return jsonify({'messages': messages})
- except CosmosResourceNotFoundError:
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
+ except LookupError:
return jsonify({'messages': []})
except Exception as e:
print(f"ERROR: Failed to get messages: {str(e)}")
@@ -240,6 +255,8 @@ def api_get_image(image_id):
conversation_id = '_'.join(parts[:-3])
debug_print(f"Serving image {image_id} from conversation {conversation_id}")
+
+ _authorize_personal_conversation_read(user_id, conversation_id)
# Query for the main image document and chunks
message_query = f"SELECT * FROM c WHERE c.conversation_id = '{conversation_id}'"
@@ -334,6 +351,11 @@ def api_get_image(image_id):
)
else:
return jsonify({'error': 'Invalid image format'}), 400
+
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
+ except LookupError:
+ return jsonify({'error': 'Image not found'}), 404
except Exception as e:
print(f"ERROR: Failed to serve image {image_id}: {str(e)}")
@@ -456,18 +478,21 @@ def delete_conversation(conversation_id):
"""
Delete a conversation. If archiving is enabled, copy it to archived_conversations first.
"""
+ user_id = get_current_user_id()
+ if not user_id:
+ return jsonify({'error': 'User not authenticated'}), 401
+
settings = get_settings()
archiving_enabled = settings.get('enable_conversation_archiving', False)
try:
- conversation_item = cosmos_conversations_container.read_item(
- item=conversation_id,
- partition_key=conversation_id
- )
- except CosmosResourceNotFoundError:
+ conversation_item = _authorize_personal_conversation_read(user_id, conversation_id)
+ except LookupError:
return jsonify({
"error": f"Conversation {conversation_id} not found."
}), 404
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
except Exception as e:
return jsonify({
"error": str(e)
diff --git a/application/single_app/route_backend_documents.py b/application/single_app/route_backend_documents.py
index b70d99806..c74375e0e 100644
--- a/application/single_app/route_backend_documents.py
+++ b/application/single_app/route_backend_documents.py
@@ -135,7 +135,7 @@ def get_file_content():
return jsonify({'error': 'Missing conversation_id or id'}), 400
try:
- _ = cosmos_conversations_container.read_item(
+ conversation_item = cosmos_conversations_container.read_item(
item=conversation_id,
partition_key=conversation_id
)
@@ -143,6 +143,9 @@ def get_file_content():
return jsonify({'error': 'Conversation not found'}), 404
except Exception as e:
return jsonify({'error': f'Error reading conversation: {str(e)}'}), 500
+
+ if conversation_item.get('user_id') != user_id:
+ return jsonify({'error': 'Forbidden'}), 403
add_file_task_to_file_processing_log(document_id=file_id, user_id=user_id, content="Conversation exists, retrieving file content")
try:
@@ -919,12 +922,12 @@ def api_create_tag():
data = request.get_json()
tag_name = data.get('tag_name')
- color = data.get('color', '#0d6efd') # Default blue color
+ color = data.get('color')
if not tag_name:
return jsonify({'error': 'tag_name is required'}), 400
- from functions_documents import normalize_tag, validate_tags
+ from functions_documents import normalize_tag, validate_tag_color, validate_tags
from functions_settings import get_user_settings, update_user_settings
from datetime import datetime, timezone
@@ -935,6 +938,9 @@ def api_create_tag():
return jsonify({'error': error_msg}), 400
normalized_tag = normalized_tags[0]
+ is_valid_color, color_error, normalized_color = validate_tag_color(color, normalized_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
# Get existing tag definitions from settings
user_settings = get_user_settings(user_id)
@@ -954,7 +960,7 @@ def api_create_tag():
# Add new tag to existing tags (don't replace)
personal_tags[normalized_tag] = {
- 'color': color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
@@ -973,7 +979,7 @@ def api_create_tag():
'message': f'Tag "{normalized_tag}" created successfully',
'tag': {
'name': normalized_tag,
- 'color': color
+ 'color': normalized_color
}
}), 201
@@ -1157,7 +1163,7 @@ def api_update_tag(tag_name):
debug_print(f"[UPDATE TAG] Request data - new_name: {new_name}, new_color: {new_color}")
from functions_documents import (
- normalize_tag, validate_tags, get_documents,
+ normalize_tag, validate_tag_color, validate_tags, get_documents,
update_document, propagate_tags_to_chunks
)
from functions_settings import get_user_settings, update_user_settings
@@ -1267,6 +1273,10 @@ def api_update_tag(tag_name):
# Handle color change only
if new_color:
debug_print(f"[UPDATE TAG] Handling color change operation...")
+ is_valid_color, color_error, normalized_color = validate_tag_color(new_color, normalized_old_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
+
user_settings = get_user_settings(user_id)
settings_dict = user_settings.get('settings', {})
tag_defs = settings_dict.get('tag_definitions', {})
@@ -1276,13 +1286,13 @@ def api_update_tag(tag_name):
debug_print(f"[UPDATE TAG] Looking for tag: {normalized_old_tag}")
if normalized_old_tag in personal_tags:
- debug_print(f"[UPDATE TAG] Found tag, updating color to: {new_color}")
- personal_tags[normalized_old_tag]['color'] = new_color
+ debug_print(f"[UPDATE TAG] Found tag, updating color to: {normalized_color}")
+ personal_tags[normalized_old_tag]['color'] = normalized_color
else:
- debug_print(f"[UPDATE TAG] Tag not found, creating new entry with color: {new_color}")
+ debug_print(f"[UPDATE TAG] Tag not found, creating new entry with color: {normalized_color}")
from datetime import datetime, timezone
personal_tags[normalized_old_tag] = {
- 'color': new_color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
@@ -1293,7 +1303,11 @@ def api_update_tag(tag_name):
debug_print(f"[UPDATE TAG] Color change completed successfully")
return jsonify({
- 'message': f'Tag color updated for "{normalized_old_tag}"'
+ 'message': f'Tag color updated for "{normalized_old_tag}"',
+ 'tag': {
+ 'name': normalized_old_tag,
+ 'color': normalized_color
+ }
}), 200
debug_print(f"[UPDATE TAG] No updates specified!")
diff --git a/application/single_app/route_backend_feedback.py b/application/single_app/route_backend_feedback.py
index 49167cc85..8eeb31694 100644
--- a/application/single_app/route_backend_feedback.py
+++ b/application/single_app/route_backend_feedback.py
@@ -5,6 +5,22 @@
from functions_settings import *
from swagger_wrapper import swagger_route, get_auth_security
+
+def _authorize_feedback_conversation(user_id, conversation_id):
+ """Load the target conversation and ensure the caller owns it."""
+ try:
+ conversation_item = cosmos_conversations_container.read_item(
+ item=conversation_id,
+ partition_key=conversation_id,
+ )
+ except CosmosResourceNotFoundError as exc:
+ raise LookupError(f"Conversation {conversation_id} not found") from exc
+
+ if conversation_item.get("user_id") != user_id:
+ raise PermissionError("Forbidden")
+
+ return conversation_item
+
def register_route_backend_feedback(app):
@app.route("/feedback/submit", methods=["POST"])
@@ -18,7 +34,7 @@ def feedback_submit():
POST /feedback/submit
JSON body: { messageId, conversationId, feedbackType, reason }
"""
- data = request.get_json()
+ data = request.get_json() or {}
messageId = data.get("messageId") # This is the ID of the specific AI message
conversationId = data.get("conversationId") # This is the ID of the conversation
feedbackType = data.get("feedbackType")
@@ -30,6 +46,16 @@ def feedback_submit():
if not messageId or not conversationId or not feedbackType:
return jsonify({"error": "Missing required fields"}), 400
+ if not user_id:
+ return jsonify({"error": "No user ID found in session"}), 403
+
+ try:
+ _authorize_feedback_conversation(user_id, conversationId)
+ except LookupError:
+ return jsonify({"error": "Conversation not found"}), 404
+ except PermissionError:
+ return jsonify({"error": "Forbidden", "message": "You do not have access to this conversation"}), 403
+
ai_message_text = None
user_prompt_text = None
all_messages = [] # Initialize an empty list for messages
@@ -51,10 +77,7 @@ def feedback_submit():
# --- END CORRECTED PART ---
if not message_items:
- # No messages found for this conversation ID, which is unexpected if feedback is given
- # You might want to log this or handle it differently
- print(f"Warning: No messages found for conversationId {conversationId} during feedback submission.")
- # Keep ai_message_text and user_prompt_text as None initially
+ return jsonify({"error": "Assistant message not found"}), 404
all_messages = message_items # Assign the query results to all_messages
@@ -70,6 +93,9 @@ def feedback_submit():
ai_msg_index = i
break
+ if ai_msg_index == -1:
+ return jsonify({"error": "Assistant message not found"}), 404
+
# Find the user message immediately preceding the AI message
if ai_msg_index > 0:
# Iterate backwards from the message before the AI's message
@@ -87,21 +113,12 @@ def feedback_submit():
if all_messages[i].get("role") == "user":
user_prompt_text = all_messages[i].get("content")
break
-
-
- except exceptions.CosmosResourceNotFoundError:
- # This specific exception might not be raised by query_items if the container exists but no items match.
- # A query returning empty is more likely. Handle general exceptions.
- print(f"Error querying messages for conversation {conversationId}: Resource not found (unexpected).")
- # Decide how to handle - maybe proceed with default text?
except Exception as e:
print(f"Error querying messages for conversation {conversationId}: {e}")
- # Log the error, maybe return a 500 or proceed with default text
- # For now, let the default text logic below handle it.
- pass # Allow execution to continue to the default text part
+ return jsonify({"error": "Failed to load feedback target"}), 500
# Set default text if messages weren't found
- if not ai_message_text:
+ if ai_message_text is None:
ai_message_text = "[AI response text not found in cosmos_messages_container]"
if not user_prompt_text:
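The exception contract behind `_authorize_feedback_conversation` (LookupError for a missing item, PermissionError for a foreign owner, so the route can map them to 404 and 403) can be sketched against a plain dict standing in for the Cosmos container:

```python
def authorize_owned_item(store, user_id, item_id):
    """Sketch of the ownership-check contract used above: the caller-supplied
    id is never trusted until the loaded record's user_id matches the
    session user. `store` is an in-memory stand-in for the container."""
    item = store.get(item_id)
    if item is None:
        raise LookupError(f"Conversation {item_id} not found")
    if item.get("user_id") != user_id:
        raise PermissionError("Forbidden")
    return item
```

The route body then stays a two-except guard, exactly as in the hunk above.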
diff --git a/application/single_app/route_backend_group_documents.py b/application/single_app/route_backend_group_documents.py
index d8f00a04c..e8a72cba7 100644
--- a/application/single_app/route_backend_group_documents.py
+++ b/application/single_app/route_backend_group_documents.py
@@ -1046,12 +1046,12 @@ def api_create_group_tag():
data = request.get_json()
tag_name = data.get('tag_name')
- color = data.get('color', '#0d6efd')
+ color = data.get('color')
if not tag_name:
return jsonify({'error': 'tag_name is required'}), 400
- from functions_documents import normalize_tag, validate_tags
+ from functions_documents import normalize_tag, validate_tag_color, validate_tags
from datetime import datetime, timezone
try:
@@ -1060,6 +1060,9 @@ def api_create_group_tag():
return jsonify({'error': error_msg}), 400
normalized_tag = normalized_tags[0]
+ is_valid_color, color_error, normalized_color = validate_tag_color(color, normalized_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
tag_defs = group_doc.get('tag_definitions', {})
@@ -1067,7 +1070,7 @@ def api_create_group_tag():
return jsonify({'error': 'Tag already exists'}), 409
tag_defs[normalized_tag] = {
- 'color': color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
group_doc['tag_definitions'] = tag_defs
@@ -1077,7 +1080,7 @@ def api_create_group_tag():
'message': f'Tag "{normalized_tag}" created successfully',
'tag': {
'name': normalized_tag,
- 'color': color
+ 'color': normalized_color
}
}), 201
@@ -1248,7 +1251,7 @@ def api_update_group_tag(tag_name):
new_name = data.get('new_name')
new_color = data.get('color')
- from functions_documents import normalize_tag, validate_tags, update_document, propagate_tags_to_chunks
+ from functions_documents import normalize_tag, validate_tag_color, validate_tags, update_document, propagate_tags_to_chunks
try:
normalized_old_tag = normalize_tag(tag_name)
@@ -1309,14 +1312,18 @@ def api_update_group_tag(tag_name):
}), 200
if new_color:
+ is_valid_color, color_error, normalized_color = validate_tag_color(new_color, normalized_old_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
+
tag_defs = group_doc.get('tag_definitions', {})
if normalized_old_tag in tag_defs:
- tag_defs[normalized_old_tag]['color'] = new_color
+ tag_defs[normalized_old_tag]['color'] = normalized_color
else:
from datetime import datetime, timezone
tag_defs[normalized_old_tag] = {
- 'color': new_color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
@@ -1324,7 +1331,11 @@ def api_update_group_tag(tag_name):
cosmos_groups_container.upsert_item(group_doc)
return jsonify({
- 'message': f'Tag color updated for "{normalized_old_tag}"'
+ 'message': f'Tag color updated for "{normalized_old_tag}"',
+ 'tag': {
+ 'name': normalized_old_tag,
+ 'color': normalized_color
+ }
}), 200
return jsonify({'error': 'No updates specified'}), 400
diff --git a/application/single_app/route_backend_group_prompts.py b/application/single_app/route_backend_group_prompts.py
index ec87d4ec5..bd49da209 100644
--- a/application/single_app/route_backend_group_prompts.py
+++ b/application/single_app/route_backend_group_prompts.py
@@ -2,10 +2,26 @@
from config import *
from functions_authentication import *
+from functions_group import require_active_group
from functions_settings import *
from functions_prompts import *
from swagger_wrapper import swagger_route, get_auth_security
+
+def _get_active_group_or_error(user_id):
+ try:
+ return require_active_group(
+ user_id,
+ allowed_roles=("Owner", "Admin", "DocumentManager", "User"),
+ ), None
+ except ValueError:
+ return None, (jsonify({"error": "No active group selected"}), 400)
+ except LookupError:
+ return None, (jsonify({"error": "Active group not found"}), 404)
+ except PermissionError:
+ return None, (jsonify({"error": "You are not a member of the active group"}), 403)
+
+
def register_route_backend_group_prompts(app):
@app.route('/api/group_prompts', methods=['GET'])
@swagger_route(security=get_auth_security())
@@ -13,10 +29,10 @@ def register_route_backend_group_prompts(app):
@user_required
@enabled_required("enable_group_workspaces")
def get_group_prompts():
- user_id = get_current_user_id()
- active_group = get_user_settings(user_id)["settings"].get("activeGroupOid")
- if not active_group:
- return jsonify({"error":"No active group selected"}), 400
+ user_id = get_current_user_id()
+ active_group, error_response = _get_active_group_or_error(user_id)
+ if error_response:
+ return error_response
try:
items, total, page, page_size = list_prompts(
@@ -41,10 +57,10 @@ def get_group_prompts():
@user_required
@enabled_required("enable_group_workspaces")
def create_group_prompt():
- user_id = get_current_user_id()
- active_group = get_user_settings(user_id)["settings"].get("activeGroupOid")
- if not active_group:
- return jsonify({"error":"No active group selected"}), 400
+ user_id = get_current_user_id()
+ active_group, error_response = _get_active_group_or_error(user_id)
+ if error_response:
+ return error_response
data = request.get_json() or {}
name = data.get("name","").strip()
@@ -71,10 +87,10 @@ def create_group_prompt():
@user_required
@enabled_required("enable_group_workspaces")
def get_group_prompt(prompt_id):
- user_id = get_current_user_id()
- active_group = get_user_settings(user_id)["settings"].get("activeGroupOid")
- if not active_group:
- return jsonify({"error":"No active group selected"}), 400
+ user_id = get_current_user_id()
+ active_group, error_response = _get_active_group_or_error(user_id)
+ if error_response:
+ return error_response
try:
item = get_prompt_doc(
@@ -96,10 +112,10 @@ def get_group_prompt(prompt_id):
@user_required
@enabled_required("enable_group_workspaces")
def update_group_prompt(prompt_id):
- user_id = get_current_user_id()
- active_group = get_user_settings(user_id)["settings"].get("activeGroupOid")
- if not active_group:
- return jsonify({"error":"No active group selected"}), 400
+ user_id = get_current_user_id()
+ active_group, error_response = _get_active_group_or_error(user_id)
+ if error_response:
+ return error_response
data = request.get_json() or {}
updates = {}
@@ -135,10 +151,10 @@ def update_group_prompt(prompt_id):
@user_required
@enabled_required("enable_group_workspaces")
def delete_group_prompt(prompt_id):
- user_id = get_current_user_id()
- active_group = get_user_settings(user_id)["settings"].get("activeGroupOid")
- if not active_group:
- return jsonify({"error":"No active group selected"}), 400
+ user_id = get_current_user_id()
+ active_group, error_response = _get_active_group_or_error(user_id)
+ if error_response:
+ return error_response
try:
success = delete_prompt_doc(
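The `(value, error_response)` tuple pattern used by `_get_active_group_or_error` keeps every route body down to a two-line guard. A framework-free sketch of the translation layer (Flask tuples replaced by plain `(body, status)` pairs for illustration):

```python
def resolve_or_error(resolver, user_id):
    """Sketch: call a resolver that raises ValueError / LookupError /
    PermissionError and translate each into an HTTP-style error pair,
    returning (value, None) on success."""
    try:
        return resolver(user_id), None
    except ValueError:
        return None, ({"error": "No active group selected"}, 400)
    except LookupError:
        return None, ({"error": "Active group not found"}, 404)
    except PermissionError:
        return None, ({"error": "You are not a member of the active group"}, 403)
```

Each route then only needs `value, err = resolve_or_error(...)` followed by `if err: return err`.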
diff --git a/application/single_app/route_backend_groups.py b/application/single_app/route_backend_groups.py
index 0e35d211b..9186e4ce4 100644
--- a/application/single_app/route_backend_groups.py
+++ b/application/single_app/route_backend_groups.py
@@ -163,10 +163,17 @@ def api_get_group_details(group_id):
GET /api/groups/<group_id>
Returns the full group details for that group.
"""
+ user_info = get_current_user_info()
+ user_id = user_info["userId"]
+
group_doc = find_group_by_id(group_id)
if not group_doc:
return jsonify({"error": "Group not found"}), 404
+
+ if not get_user_role_in_group(group_doc, user_id):
+ return jsonify({"error": "You are not a member of this group"}), 403
+
return jsonify(group_doc), 200
@app.route("/api/groups/<group_id>", methods=["DELETE"])
diff --git a/application/single_app/route_backend_plugins.py b/application/single_app/route_backend_plugins.py
index 115bf2828..b60f81894 100644
--- a/application/single_app/route_backend_plugins.py
+++ b/application/single_app/route_backend_plugins.py
@@ -28,6 +28,7 @@
validate_group_action_payload,
)
from functions_keyvault import (
+ resolve_secret_reference_for_context,
SecretReturnType,
redact_plugin_secret_values,
retrieve_secret_from_key_vault_by_full_name,
@@ -230,17 +231,35 @@ def _redact_plugin_for_logging(plugin):
return redact_plugin_secret_values(plugin)
-def _resolve_secret_value_for_sql_test(value, field_name):
+def _resolve_plugin_secret_context(plugin_manifest, fallback_scope_value, fallback_scope="user"):
+ """Infer the expected Key Vault scope for SQL test-connection secret resolution."""
+ if not isinstance(plugin_manifest, dict):
+ return fallback_scope_value, fallback_scope
+
+ plugin_scope = str(plugin_manifest.get("scope") or "").strip().lower()
+ if plugin_scope == "group" or plugin_manifest.get("is_group"):
+ return plugin_manifest.get("group_id"), "group"
+ if plugin_scope == "global" or plugin_manifest.get("is_global"):
+ return plugin_manifest.get("id") or fallback_scope_value, "global"
+ if plugin_scope == "user" or plugin_manifest.get("user_id"):
+ return plugin_manifest.get("user_id") or fallback_scope_value, "user"
+ return fallback_scope_value, fallback_scope
+
+
+def _resolve_secret_value_for_sql_test(value, field_name, scope_value, scope):
"""Resolve a Key Vault reference for SQL test-connection flows."""
if not isinstance(value, str) or not value:
return value
if not validate_secret_name_dynamic(value):
return value
- resolved_value = retrieve_secret_from_key_vault_by_full_name(value)
- if validate_secret_name_dynamic(resolved_value):
- raise ValueError(f"Unable to resolve stored Key Vault secret for SQL field '{field_name}'.")
- return resolved_value
+ return resolve_secret_reference_for_context(
+ value,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action-addset"},
+ context_label=f"SQL field '{field_name}'",
+ )
def _load_existing_plugin_for_sql_test(plugin_context, user_id):
@@ -1093,9 +1112,21 @@ def test_sql_connection():
field_list = ', '.join(unresolved_fields)
return jsonify({'success': False, 'error': f"Stored SQL secret could not be resolved for testing. Re-enter the {field_list}."}), 400
+ plugin_scope_value, plugin_scope = _resolve_plugin_secret_context(existing_plugin, user_id)
+
try:
- connection_string = _resolve_secret_value_for_sql_test(connection_string, 'connection_string')
- password = _resolve_secret_value_for_sql_test(password, 'password')
+ connection_string = _resolve_secret_value_for_sql_test(
+ connection_string,
+ 'connection_string',
+ scope_value=plugin_scope_value,
+ scope=plugin_scope,
+ )
+ password = _resolve_secret_value_for_sql_test(
+ password,
+ 'password',
+ scope_value=plugin_scope_value,
+ scope=plugin_scope,
+ )
except ValueError as exc:
return jsonify({'success': False, 'error': str(exc)}), 400
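The point of passing `scope_value`/`scope` into secret resolution is that a stored reference is only dereferenced inside the scope inferred from the plugin manifest. An in-memory sketch of that rule (not the real `resolve_secret_reference_for_context`, whose signature and vault backing differ):

```python
def resolve_secret_for_scope(secret_name, scope, scope_value, vault):
    """Sketch: refuse to dereference a secret reference whose recorded
    scope does not match the caller's inferred scope, so a user-scoped
    test-connection call cannot read a group- or global-scoped secret."""
    entry = vault.get(secret_name)
    if entry is None:
        raise ValueError(f"Unknown secret reference '{secret_name}'")
    if (entry["scope"], entry["scope_value"]) != (scope, scope_value):
        raise ValueError(
            f"Secret '{secret_name}' is not accessible from {scope}:{scope_value}"
        )
    return entry["value"]
```

The route above converts the raised ValueError into a 400 rather than leaking the secret's existence in another scope.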
diff --git a/application/single_app/route_backend_public_documents.py b/application/single_app/route_backend_public_documents.py
index 3b7486bd2..f4396c241 100644
--- a/application/single_app/route_backend_public_documents.py
+++ b/application/single_app/route_backend_public_documents.py
@@ -536,12 +536,12 @@ def api_create_public_workspace_tag():
data = request.get_json()
tag_name = data.get('tag_name')
- color = data.get('color', '#0d6efd')
+ color = data.get('color')
if not tag_name:
return jsonify({'error': 'tag_name is required'}), 400
- from functions_documents import normalize_tag, validate_tags
+ from functions_documents import normalize_tag, validate_tag_color, validate_tags
from datetime import datetime, timezone
try:
@@ -550,6 +550,9 @@ def api_create_public_workspace_tag():
return jsonify({'error': error_msg}), 400
normalized_tag = normalized_tags[0]
+ is_valid_color, color_error, normalized_color = validate_tag_color(color, normalized_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
tag_defs = ws_doc.get('tag_definitions', {})
@@ -557,7 +560,7 @@ def api_create_public_workspace_tag():
return jsonify({'error': 'Tag already exists'}), 409
tag_defs[normalized_tag] = {
- 'color': color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
ws_doc['tag_definitions'] = tag_defs
@@ -567,7 +570,7 @@ def api_create_public_workspace_tag():
'message': f'Tag "{normalized_tag}" created successfully',
'tag': {
'name': normalized_tag,
- 'color': color
+ 'color': normalized_color
}
}), 201
@@ -740,7 +743,7 @@ def api_update_public_workspace_tag(tag_name):
new_name = data.get('new_name')
new_color = data.get('color')
- from functions_documents import normalize_tag, validate_tags, update_document, propagate_tags_to_chunks
+ from functions_documents import normalize_tag, validate_tag_color, validate_tags, update_document, propagate_tags_to_chunks
try:
normalized_old_tag = normalize_tag(tag_name)
@@ -801,14 +804,18 @@ def api_update_public_workspace_tag(tag_name):
}), 200
if new_color:
+ is_valid_color, color_error, normalized_color = validate_tag_color(new_color, normalized_old_tag)
+ if not is_valid_color:
+ return jsonify({'error': color_error}), 400
+
tag_defs = ws_doc.get('tag_definitions', {})
if normalized_old_tag in tag_defs:
- tag_defs[normalized_old_tag]['color'] = new_color
+ tag_defs[normalized_old_tag]['color'] = normalized_color
else:
from datetime import datetime, timezone
tag_defs[normalized_old_tag] = {
- 'color': new_color,
+ 'color': normalized_color,
'created_at': datetime.now(timezone.utc).isoformat()
}
@@ -816,7 +823,11 @@ def api_update_public_workspace_tag(tag_name):
cosmos_public_workspaces_container.upsert_item(ws_doc)
return jsonify({
- 'message': f'Tag color updated for "{normalized_old_tag}"'
+ 'message': f'Tag color updated for "{normalized_old_tag}"',
+ 'tag': {
+ 'name': normalized_old_tag,
+ 'color': normalized_color
+ }
}), 200
return jsonify({'error': 'No updates specified'}), 400
diff --git a/application/single_app/route_backend_public_prompts.py b/application/single_app/route_backend_public_prompts.py
index f125cbb32..75c3f678e 100644
--- a/application/single_app/route_backend_public_prompts.py
+++ b/application/single_app/route_backend_public_prompts.py
@@ -8,6 +8,20 @@
from functions_prompts import *
from swagger_wrapper import swagger_route, get_auth_security
+
+def _get_active_public_workspace_or_error(user_id):
+ try:
+ return require_active_public_workspace(
+ user_id,
+ allowed_roles=("Owner", "Admin", "DocumentManager"),
+ ), None
+ except ValueError:
+ return None, (jsonify({'error': 'No active public workspace selected'}), 400)
+ except LookupError:
+ return None, (jsonify({'error': 'Workspace not found'}), 404)
+ except PermissionError:
+ return None, (jsonify({'error': 'Access denied'}), 403)
+
def register_route_backend_public_prompts(app):
"""
Backend routes for public-workspace-scoped prompts management
@@ -20,15 +34,10 @@ def register_route_backend_public_prompts(app):
@enabled_required('enable_public_workspaces')
def api_list_public_prompts():
user_id = get_current_user_id()
- settings = get_user_settings(user_id)
- active_ws = settings['settings'].get('activePublicWorkspaceOid')
- if not active_ws:
- return jsonify({'error': 'No active public workspace selected'}), 400
- ws = find_public_workspace_by_id(active_ws)
- if not ws:
- return jsonify({'error': 'Workspace not found'}), 404
- if not get_user_role_in_public_workspace(ws, user_id):
- return jsonify({'error': 'Access denied'}), 403
+ active_workspace_context, error_response = _get_active_public_workspace_or_error(user_id)
+ if error_response:
+ return error_response
+ active_ws, _, _ = active_workspace_context
try:
items, total, page, page_size = list_prompts(
@@ -54,15 +63,10 @@ def api_list_public_prompts():
@enabled_required('enable_public_workspaces')
def api_create_public_prompt():
user_id = get_current_user_id()
- settings = get_user_settings(user_id)
- active_ws = settings['settings'].get('activePublicWorkspaceOid')
- if not active_ws:
- return jsonify({'error': 'No active public workspace selected'}), 400
- ws = find_public_workspace_by_id(active_ws)
- if not ws:
- return jsonify({'error': 'Workspace not found'}), 404
- if not get_user_role_in_public_workspace(ws, user_id):
- return jsonify({'error': 'Access denied'}), 403
+ active_workspace_context, error_response = _get_active_public_workspace_or_error(user_id)
+ if error_response:
+ return error_response
+ active_ws, _, _ = active_workspace_context
data = request.get_json() or {}
name = data.get('name','').strip()
@@ -90,15 +94,10 @@ def api_create_public_prompt():
@enabled_required('enable_public_workspaces')
def api_get_public_prompt(prompt_id):
user_id = get_current_user_id()
- settings = get_user_settings(user_id)
- active_ws = settings['settings'].get('activePublicWorkspaceOid')
- if not active_ws:
- return jsonify({'error': 'No active public workspace selected'}), 400
- ws = find_public_workspace_by_id(active_ws)
- if not ws:
- return jsonify({'error': 'Workspace not found'}), 404
- if not get_user_role_in_public_workspace(ws, user_id):
- return jsonify({'error': 'Access denied'}), 403
+ active_workspace_context, error_response = _get_active_public_workspace_or_error(user_id)
+ if error_response:
+ return error_response
+ active_ws, _, _ = active_workspace_context
try:
item = get_prompt_doc(
@@ -121,15 +120,10 @@ def api_get_public_prompt(prompt_id):
@enabled_required('enable_public_workspaces')
def api_update_public_prompt(prompt_id):
user_id = get_current_user_id()
- settings = get_user_settings(user_id)
- active_ws = settings['settings'].get('activePublicWorkspaceOid')
- if not active_ws:
- return jsonify({'error': 'No active public workspace selected'}), 400
- ws = find_public_workspace_by_id(active_ws)
- if not ws:
- return jsonify({'error': 'Workspace not found'}), 404
- if not get_user_role_in_public_workspace(ws, user_id):
- return jsonify({'error': 'Access denied'}), 403
+ active_workspace_context, error_response = _get_active_public_workspace_or_error(user_id)
+ if error_response:
+ return error_response
+ active_ws, _, _ = active_workspace_context
data = request.get_json() or {}
updates = {}
@@ -166,15 +160,10 @@ def api_update_public_prompt(prompt_id):
@enabled_required('enable_public_workspaces')
def api_delete_public_prompt(prompt_id):
user_id = get_current_user_id()
- settings = get_user_settings(user_id)
- active_ws = settings['settings'].get('activePublicWorkspaceOid')
- if not active_ws:
- return jsonify({'error': 'No active public workspace selected'}), 400
- ws = find_public_workspace_by_id(active_ws)
- if not ws:
- return jsonify({'error': 'Workspace not found'}), 404
- if not get_user_role_in_public_workspace(ws, user_id):
- return jsonify({'error': 'Access denied'}), 403
+ active_workspace_context, error_response = _get_active_public_workspace_or_error(user_id)
+ if error_response:
+ return error_response
+ active_ws, _, _ = active_workspace_context
try:
success = delete_prompt_doc(
diff --git a/application/single_app/route_backend_public_workspaces.py b/application/single_app/route_backend_public_workspaces.py
index 9307bc3c3..8a7ade4ed 100644
--- a/application/single_app/route_backend_public_workspaces.py
+++ b/application/single_app/route_backend_public_workspaces.py
@@ -211,12 +211,20 @@ def api_create_public_workspace():
def api_get_public_workspace(ws_id):
"""
GET /api/public_workspaces/<ws_id>
- Returns full workspace document.
+ Returns a role-aware workspace payload.
"""
+ info = get_current_user_info()
+ user_id = info["userId"]
+
ws = find_public_workspace_by_id(ws_id)
if not ws:
return jsonify({"error": "Workspace not found"}), 404
- return jsonify(ws), 200
+
+ role = get_user_role_in_public_workspace(ws, user_id)
+ if role:
+ return jsonify(build_public_workspace_member_payload(ws, user_id)), 200
+
+ return jsonify(build_public_workspace_public_summary(ws)), 200
@app.route("/api/public_workspaces/<ws_id>", methods=["PATCH", "PUT"])
@swagger_route(security=get_auth_security())
@@ -289,13 +297,11 @@ def api_set_active_public_workspace():
info = get_current_user_info()
user_id = info["userId"]
- ws = find_public_workspace_by_id(ws_id)
- if not ws:
+ try:
+ update_active_public_workspace_for_user(user_id, ws_id)
+ except LookupError:
return jsonify({"error": "Workspace not found"}), 404
- # Public workspaces are accessible to all authenticated users for chat.
- # No membership check needed — any user can set a public workspace as active.
- update_active_public_workspace_for_user(user_id, ws_id)
return jsonify({"message": f"Active set to {ws_id}"}), 200
@app.route("/api/public_workspaces/<ws_id>/requests", methods=["GET"])
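Routing the write through `update_active_public_workspace_for_user(...)` moves the existence check into the writer itself, so no call site can skip it. An in-memory sketch of that validating-writer shape (hypothetical stores; the real function persists to user settings):

```python
def set_active_public_workspace(user_id, ws_id, workspaces, settings_store):
    """Sketch: validate inside the writer and raise LookupError for a bad
    id, which the route above maps to a 404. `workspaces` and
    `settings_store` are illustrative in-memory stand-ins."""
    if ws_id not in workspaces:
        raise LookupError(f"Workspace {ws_id} not found")
    settings_store.setdefault(user_id, {})["activePublicWorkspaceOid"] = ws_id
    return settings_store[user_id]
```

Compare the removed hunk: the old route did `find_public_workspace_by_id` itself and then called the writer, leaving other callers free to write unvalidated ids.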
diff --git a/application/single_app/route_backend_users.py b/application/single_app/route_backend_users.py
index 459aa800c..eac97c37e 100644
--- a/application/single_app/route_backend_users.py
+++ b/application/single_app/route_backend_users.py
@@ -2,9 +2,16 @@
from config import *
from functions_authentication import *
+from functions_group import update_active_group_for_user
+from functions_public_workspaces import update_active_public_workspace_for_user
from functions_settings import *
from swagger_wrapper import swagger_route, get_auth_security
+
+def _escape_graph_odata_literal(value):
+ return str(value or "").replace("'", "''")
+
+
def register_route_backend_users(app):
"""
This route will expose GET /api/userSearch?query= which calls
@@ -20,6 +27,8 @@ def api_user_search():
if not query:
return jsonify([]), 200
+ safe_query = _escape_graph_odata_literal(query)
+
token = get_valid_access_token()
if not token:
return jsonify({"error": "Could not acquire access token"}), 401
@@ -32,9 +41,9 @@ def api_user_search():
}
filter_str = (
- f"startswith(displayName, '{query}') "
- f"or startswith(mail, '{query}') "
- f"or startswith(userPrincipalName, '{query}')"
+ f"startswith(displayName, '{safe_query}') "
+ f"or startswith(mail, '{safe_query}') "
+ f"or startswith(userPrincipalName, '{safe_query}')"
)
params = {
"$filter": filter_str,
@@ -163,13 +172,54 @@ def user_settings():
invalid_keys = set(settings_to_update.keys()) - allowed_keys
if invalid_keys:
print(f"Warning: Received invalid settings keys: {invalid_keys}")
- # Decide whether to ignore them or return an error
- # To ignore: settings_to_update = {k: v for k, v in settings_to_update.items() if k in allowed_keys}
- # To error: return jsonify({"error": f"Invalid settings keys provided: {', '.join(invalid_keys)}"}), 400
+ settings_to_update = {
+ key: value
+ for key, value in settings_to_update.items()
+ if key in allowed_keys
+ }
+ if not settings_to_update:
+ return jsonify({"error": "No valid settings keys provided"}), 400
+
+
+ settings_to_update = dict(settings_to_update)
+ active_group_updated = False
+ active_public_workspace_updated = False
+
+ if "activeGroupOid" in settings_to_update:
+ requested_active_group = str(settings_to_update.pop("activeGroupOid") or "").strip()
+ if requested_active_group:
+ try:
+ update_active_group_for_user(requested_active_group, user_id=user_id)
+ active_group_updated = True
+ except LookupError:
+ return jsonify({"error": "Group not found"}), 404
+ except PermissionError:
+ return jsonify({"error": "You are not a member of this group"}), 403
+ else:
+ settings_to_update["activeGroupOid"] = requested_active_group
+ if "activePublicWorkspaceOid" in settings_to_update:
+ requested_active_public_workspace = str(
+ settings_to_update.pop("activePublicWorkspaceOid") or ""
+ ).strip()
+ if requested_active_public_workspace:
+ try:
+ update_active_public_workspace_for_user(
+ user_id,
+ requested_active_public_workspace,
+ )
+ active_public_workspace_updated = True
+ except LookupError:
+ return jsonify({"error": "Workspace not found"}), 404
+ else:
+ settings_to_update["activePublicWorkspaceOid"] = requested_active_public_workspace
# Call the updated function - it handles merging and timestamp
- success = update_user_settings(user_id, settings_to_update)
+ success = True
+ if settings_to_update:
+ success = update_user_settings(user_id, settings_to_update)
+ elif active_group_updated or active_public_workspace_updated:
+ success = True
if success:
return jsonify({"message": "User settings updated successfully"}), 200
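The `_escape_graph_odata_literal` helper added above relies on the OData rule that a single quote inside a string literal is escaped by doubling it, so a search term such as `O'Brien` cannot terminate the quoted literal and inject extra `$filter` clauses. A self-contained illustration:

```python
def escape_graph_odata_literal(value):
    """Double single quotes per the OData string-literal escape rule;
    None becomes the empty string, matching the helper above."""
    return str(value or "").replace("'", "''")

def build_user_filter(query):
    """Assemble the same startswith-based $filter string the route builds,
    from the escaped term."""
    safe = escape_graph_odata_literal(query)
    return (
        f"startswith(displayName, '{safe}') "
        f"or startswith(mail, '{safe}') "
        f"or startswith(userPrincipalName, '{safe}')"
    )
```

Note this guards the literal only; the term still travels through `params={"$filter": ...}` so the HTTP layer handles URL encoding.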
diff --git a/application/single_app/route_enhanced_citations.py b/application/single_app/route_enhanced_citations.py
index ca1b9e48e..b3c90f967 100644
--- a/application/single_app/route_enhanced_citations.py
+++ b/application/single_app/route_enhanced_citations.py
@@ -3,16 +3,19 @@
from flask import jsonify, request, Response
from datetime import datetime, timedelta
+import logging
import os
import tempfile
import requests
import mimetypes
import io
import pandas
+import fitz
from functions_authentication import login_required, user_required, get_current_user_id
+from functions_appinsights import log_event
from functions_settings import get_settings, enabled_required
-from functions_documents import get_document_metadata
+from functions_documents import get_document_metadata, get_document_blob_storage_info
from functions_group import get_user_groups
from functions_public_workspaces import get_user_visible_public_workspace_ids_from_settings
from swagger_wrapper import swagger_route, get_auth_security
@@ -62,6 +65,28 @@ def _serialize_tabular_preview_table(df_preview):
]
return columns, rows
+
+def _log_enhanced_citations_debug(message, **details):
+ """Write debug-gated enhanced citations diagnostics."""
+ log_event(
+ f"[EnhancedCitations] {message}",
+ extra=details or None,
+ debug_only=True,
+ category="EnhancedCitations",
+ )
+
+
+def _log_enhanced_citations_error(message, error, **details):
+ """Write structured error diagnostics for enhanced citations failures."""
+ error_details = dict(details)
+ error_details["error"] = str(error)
+ log_event(
+ f"[EnhancedCitations] {message}",
+ extra=error_details,
+ level=logging.ERROR,
+ exceptionTraceback=True,
+ )
+
def register_enhanced_citations_routes(app):
"""Register enhanced citations routes"""
@@ -234,7 +259,13 @@ def get_enhanced_citation_pdf():
if not doc_id:
return jsonify({"error": "doc_id is required"}), 400
- debug_print(f"Enhanced citations PDF request - doc_id: {doc_id}, page: {page_number}, show_all: {show_all}")
+ _log_enhanced_citations_debug(
+ "PDF request received",
+ doc_id=doc_id,
+ page=page_number,
+ show_all=show_all,
+ download=download,
+ )
user_id = get_current_user_id()
if not user_id:
@@ -263,6 +294,14 @@ def get_enhanced_citation_pdf():
return serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all)
except Exception as e:
+ _log_enhanced_citations_error(
+ "PDF request failed",
+ e,
+ doc_id=doc_id,
+ page=page_number,
+ show_all=show_all,
+ download=download,
+ )
return jsonify({"error": str(e)}), 500
@app.route("/api/enhanced_citations/tabular", methods=["GET"])
@@ -607,35 +646,60 @@ def get_blob_name(raw_doc, workspace_type):
"""
_, blob_name = get_document_blob_storage_info(raw_doc)
if blob_name:
+ _log_enhanced_citations_debug(
+ "Using stored blob path for citation content",
+ doc_id=raw_doc.get('id'),
+ workspace_type=workspace_type,
+ blob_name=blob_name,
+ )
return blob_name
if workspace_type == 'public':
- return f"{raw_doc['public_workspace_id']}/{raw_doc['file_name']}"
+ fallback_blob_name = f"{raw_doc['public_workspace_id']}/{raw_doc['file_name']}"
elif workspace_type == 'group':
- return f"{raw_doc['group_id']}/{raw_doc['file_name']}"
+ fallback_blob_name = f"{raw_doc['group_id']}/{raw_doc['file_name']}"
else:
- return f"{raw_doc['user_id']}/{raw_doc['file_name']}"
+ fallback_blob_name = f"{raw_doc['user_id']}/{raw_doc['file_name']}"
+
+ _log_enhanced_citations_debug(
+ "Using legacy blob path fallback for citation content",
+ doc_id=raw_doc.get('id'),
+ workspace_type=workspace_type,
+ blob_name=fallback_blob_name,
+ )
+ return fallback_blob_name
def serve_enhanced_citation_content(raw_doc, content_type=None, force_download=False):
"""
Server-side rendering: Serve enhanced citation file content directly
Based on the logic from the existing view_pdf function but serves content directly
"""
- settings = get_settings()
-
# Get blob storage client
blob_service_client = CLIENTS.get("storage_account_office_docs_client")
if not blob_service_client:
raise Exception("Blob storage client not available")
-
- # Determine workspace type and container
- workspace_type, container_name = determine_workspace_type_and_container(raw_doc)
- container_client = blob_service_client.get_container_client(container_name)
-
- # Build blob name based on workspace type
- blob_name = get_blob_name(raw_doc, workspace_type)
-
+
+ doc_id = raw_doc.get('id')
+ file_name = raw_doc.get('file_name')
+ workspace_type = None
+ container_name = None
+ blob_name = None
+
try:
+ workspace_type, container_name = determine_workspace_type_and_container(raw_doc)
+ blob_name = get_blob_name(raw_doc, workspace_type)
+ container_client = blob_service_client.get_container_client(container_name)
+
+ _log_enhanced_citations_debug(
+ "Downloading citation content from blob storage",
+ doc_id=doc_id,
+ file_name=file_name,
+ workspace_type=workspace_type,
+ container_name=container_name,
+ blob_name=blob_name,
+ force_download=force_download,
+ )
+
# Download blob content directly
blob_client = container_client.get_blob_client(blob_name)
blob_data = blob_client.download_blob()
@@ -659,6 +723,18 @@ def serve_enhanced_citation_content(raw_doc, content_type=None, force_download=F
content_type = 'audio/mpeg'
else:
content_type = 'application/octet-stream'
+
+ _log_enhanced_citations_debug(
+ "Citation content downloaded successfully",
+ doc_id=doc_id,
+ file_name=file_name,
+ workspace_type=workspace_type,
+ container_name=container_name,
+ blob_name=blob_name,
+ content_type=content_type,
+ content_length=len(content),
+ force_download=force_download,
+ )
# Set content disposition based on force_download parameter
disposition = 'attachment' if force_download else 'inline'
@@ -678,8 +754,17 @@ def serve_enhanced_citation_content(raw_doc, content_type=None, force_download=False):
return response
except Exception as e:
- print(f"Error serving enhanced citation content: {e}")
- raise Exception(f"Failed to load content: {str(e)}")
+ _log_enhanced_citations_error(
+ "Failed to serve citation content",
+ e,
+ doc_id=doc_id,
+ file_name=file_name,
+ workspace_type=workspace_type,
+ container_name=container_name,
+ blob_name=blob_name,
+ force_download=force_download,
+ )
+ raise Exception(f"Failed to load content: {str(e)}") from e
def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
"""
@@ -691,25 +776,40 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
page_number: Current page number
show_all: If True, show all pages instead of just ±1 pages around current
"""
- debug_print(f"serve_enhanced_citation_pdf_content called with show_all: {show_all}")
-
- import io
- import uuid
- import tempfile
- import fitz # PyMuPDF
+ _log_enhanced_citations_debug(
+ "Preparing PDF citation content",
+ doc_id=raw_doc.get('id'),
+ file_name=raw_doc.get('file_name'),
+ page=page_number,
+ show_all=show_all,
+ )
blob_service_client = CLIENTS.get("storage_account_office_docs_client")
if not blob_service_client:
raise Exception("Blob storage client not available")
-
- # Determine workspace type and container
- workspace_type, container_name = determine_workspace_type_and_container(raw_doc)
- container_client = blob_service_client.get_container_client(container_name)
-
- # Build blob name based on workspace type
- blob_name = get_blob_name(raw_doc, workspace_type)
-
+
+ doc_id = raw_doc.get('id')
+ file_name = raw_doc.get('file_name')
+ workspace_type = None
+ container_name = None
+ blob_name = None
+
try:
+ workspace_type, container_name = determine_workspace_type_and_container(raw_doc)
+ blob_name = get_blob_name(raw_doc, workspace_type)
+ container_client = blob_service_client.get_container_client(container_name)
+
+ _log_enhanced_citations_debug(
+ "Downloading PDF citation blob",
+ doc_id=doc_id,
+ file_name=file_name,
+ workspace_type=workspace_type,
+ container_name=container_name,
+ blob_name=blob_name,
+ page=page_number,
+ show_all=show_all,
+ )
+
# Download blob content directly
blob_client = container_client.get_blob_client(blob_name)
blob_data = blob_client.download_blob()
@@ -727,6 +827,13 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
current_idx = page_number - 1 # zero-based
if current_idx < 0 or current_idx >= total_pages:
+ _log_enhanced_citations_debug(
+ "Requested PDF page was out of range",
+ doc_id=doc_id,
+ file_name=file_name,
+ page=page_number,
+ total_pages=total_pages,
+ )
pdf_document.close()
os.remove(temp_pdf_path)
return jsonify({"error": "Requested page out of range"}), 400
@@ -778,6 +885,19 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
extracted_pdf.close()
pdf_document.close()
+ _log_enhanced_citations_debug(
+ "Built PDF citation sub-document",
+ doc_id=doc_id,
+ file_name=file_name,
+ page=page_number,
+ show_all=show_all,
+ total_pages=total_pages,
+ start_idx=start_idx,
+ end_idx=end_idx,
+ viewer_page=new_page_number,
+ content_length=len(extracted_content),
+ )
+
# Return the extracted PDF
headers = {
'Content-Length': str(len(extracted_content)),
@@ -789,7 +909,12 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
# When show_all is True, allow iframe embedding
if show_all:
- debug_print(f"Setting CSP headers for iframe embedding (show_all={show_all})")
+ _log_enhanced_citations_debug(
+ "Setting CSP headers for iframe embedding",
+ doc_id=doc_id,
+ file_name=file_name,
+ show_all=show_all,
+ )
headers['Content-Security-Policy'] = (
"default-src 'self'; "
"frame-ancestors 'self'; " # Allow embedding in same origin
@@ -797,7 +922,12 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
)
headers['X-Frame-Options'] = 'SAMEORIGIN' # Allow same-origin framing
else:
- debug_print(f"NOT setting CSP headers for iframe embedding (show_all={show_all})")
+ _log_enhanced_citations_debug(
+ "Skipping iframe embedding headers for sub-document response",
+ doc_id=doc_id,
+ file_name=file_name,
+ show_all=show_all,
+ )
response = Response(
extracted_content,
@@ -812,5 +942,15 @@ def serve_enhanced_citation_pdf_content(raw_doc, page_number, show_all=False):
os.remove(temp_pdf_path)
except Exception as e:
- print(f"Error serving PDF citation content: {e}")
- raise Exception(f"Failed to load PDF content: {str(e)}")
+ _log_enhanced_citations_error(
+ "Failed to serve PDF citation content",
+ e,
+ doc_id=doc_id,
+ file_name=file_name,
+ workspace_type=workspace_type,
+ container_name=container_name,
+ blob_name=blob_name,
+ page=page_number,
+ show_all=show_all,
+ )
+ raise Exception(f"Failed to load PDF content: {str(e)}") from e
diff --git a/application/single_app/route_external_health.py b/application/single_app/route_external_health.py
index ce4508d00..3da829c37 100644
--- a/application/single_app/route_external_health.py
+++ b/application/single_app/route_external_health.py
@@ -16,4 +16,15 @@ def external_health_check():
"""External health check endpoint for monitoring."""
now = datetime.now()
time_string = now.strftime("%Y-%m-%d %H:%M:%S")
- return time_string, 200
\ No newline at end of file
+ return time_string, 200
+
+
+def register_no_auth_health(app):
+ @app.route('/external/healthcheckz', methods=['GET'])
+ @swagger_route()
+ @enabled_required("enable_no_auth_external_healthcheck")
+ def no_auth_external_healthcheck():
+ return {
+ "status": "ok",
+ "time": datetime.now().strftime("%Y-%m-%d %H:%M:%S")
+ }, 200
diff --git a/application/single_app/route_frontend_admin_settings.py b/application/single_app/route_frontend_admin_settings.py
index 129dfcde6..435134c65 100644
--- a/application/single_app/route_frontend_admin_settings.py
+++ b/application/single_app/route_frontend_admin_settings.py
@@ -154,6 +154,10 @@ def admin_settings():
# --- Add default for swagger documentation ---
if 'enable_swagger' not in settings:
settings['enable_swagger'] = True # Default enabled for development/testing
+ if 'enable_external_healthcheck' not in settings:
+ settings['enable_external_healthcheck'] = False
+ if 'enable_no_auth_external_healthcheck' not in settings:
+ settings['enable_no_auth_external_healthcheck'] = False
if 'release_notifications_registered' not in settings:
settings['release_notifications_registered'] = False
if 'release_notifications_name' not in settings:
@@ -709,7 +713,7 @@ def parse_admin_int(raw_value, fallback_value, field_name="unknown", hard_defaul
level=logging.INFO,
)
log_general_admin_action(
- admin_user_id=admin_user,
+ admin_user_id=user_id,
admin_email=admin_email,
action='Enabled and migrated multi-model endpoints',
description=f'Migrated {len(migrated_models)} models to multi-endpoint configuration.'
@@ -1101,6 +1105,7 @@ def is_valid_url(url):
'release_notifications_registered_at': form_data.get('release_notifications_registered_at', settings.get('release_notifications_registered_at', '')).strip(),
'release_notifications_updated_at': form_data.get('release_notifications_updated_at', settings.get('release_notifications_updated_at', '')).strip(),
'enable_external_healthcheck': form_data.get('enable_external_healthcheck') == 'on',
+ 'enable_no_auth_external_healthcheck': form_data.get('enable_no_auth_external_healthcheck') == 'on',
'enable_swagger': form_data.get('enable_swagger') == 'on',
'enable_semantic_kernel': form_data.get('enable_semantic_kernel') == 'on',
'per_user_semantic_kernel': form_data.get('per_user_semantic_kernel') == 'on',
@@ -1261,7 +1266,7 @@ def is_valid_url(url):
'enable_web_search': enable_web_search,
'web_search_consent_accepted': web_search_consent_accepted,
'enable_web_search_user_notice': form_data.get('enable_web_search_user_notice') == 'on',
- 'web_search_user_notice_text': form_data.get('web_search_user_notice_text', 'Your message will be sent to Microsoft Bing for web search. Only your current message is sent, not your conversation history.').strip(),
+ 'web_search_user_notice_text': form_data.get('web_search_user_notice_text', 'Your current message will be sent to Microsoft Bing for web search. Conversation history is not sent for web search, but any sensitive content you paste into this message may be sent.').strip(),
'web_search_agent': {
'agent_type': 'aifoundry',
'azure_openai_gpt_endpoint': form_data.get('web_search_foundry_endpoint', '').strip(),
diff --git a/application/single_app/route_frontend_authentication.py b/application/single_app/route_frontend_authentication.py
index 3dbec4526..95de17e3d 100644
--- a/application/single_app/route_frontend_authentication.py
+++ b/application/single_app/route_frontend_authentication.py
@@ -2,6 +2,6 @@
-from unittest import result
from config import *
+from functions_activity_logging import log_user_login, record_user_login_session_activity
from functions_authentication import _build_msal_app, _load_cache, _save_cache, clear_requested_oauth_scopes, get_requested_oauth_scopes
from functions_debug import debug_print
from swagger_wrapper import swagger_route, get_auth_security
@@ -133,10 +134,10 @@ def authorized():
# Log the login activity
try:
- from functions_activity_logging import log_user_login
user_id = session['user'].get('oid') or session['user'].get('sub')
if user_id:
log_user_login(user_id, 'azure_ad')
+ record_user_login_session_activity(session)
except Exception as e:
debug_print(f"Could not log login activity: {e}")
diff --git a/application/single_app/route_frontend_conversations.py b/application/single_app/route_frontend_conversations.py
index d2b428fe2..b2e65be49 100644
--- a/application/single_app/route_frontend_conversations.py
+++ b/application/single_app/route_frontend_conversations.py
@@ -10,6 +10,22 @@
)
from swagger_wrapper import swagger_route, get_auth_security
+
+def _authorize_frontend_personal_conversation_access(user_id, conversation_id):
+ """Load a personal conversation and ensure the caller owns it."""
+ try:
+ conversation_item = cosmos_conversations_container.read_item(
+ item=conversation_id,
+ partition_key=conversation_id,
+ )
+ except CosmosResourceNotFoundError as exc:
+ raise LookupError(f"Conversation {conversation_id} not found") from exc
+
+ if conversation_item.get('user_id') != user_id:
+ raise PermissionError('Forbidden')
+
+ return conversation_item
+
def register_route_frontend_conversations(app):
@app.route('/conversations')
@swagger_route(security=get_auth_security())
@@ -41,12 +57,11 @@ def view_conversation(conversation_id):
if not user_id:
return redirect(url_for('login'))
try:
- conversation_item = cosmos_conversations_container.read_item(
- item=conversation_id,
- partition_key=conversation_id
- )
- except Exception:
+ _authorize_frontend_personal_conversation_access(user_id, conversation_id)
+ except LookupError:
return "Conversation not found", 404
+ except PermissionError:
+ return "Forbidden", 403
message_query = f"""
SELECT * FROM c
@@ -70,9 +85,11 @@ def get_conversation_messages(conversation_id):
return jsonify({'error': 'User not authenticated'}), 401
try:
- _ = cosmos_conversations_container.read_item(conversation_id, conversation_id)
- except CosmosResourceNotFoundError:
+ _authorize_frontend_personal_conversation_access(user_id, conversation_id)
+ except LookupError:
return jsonify({'error': 'Conversation not found'}), 404
+ except PermissionError:
+ return jsonify({'error': 'Forbidden'}), 403
msg_query = f"""
SELECT * FROM c
diff --git a/application/single_app/route_frontend_group_workspaces.py b/application/single_app/route_frontend_group_workspaces.py
index e92d5a980..6e3186f62 100644
--- a/application/single_app/route_frontend_group_workspaces.py
+++ b/application/single_app/route_frontend_group_workspaces.py
@@ -2,7 +2,7 @@
from config import *
from functions_authentication import *
-from functions_group import get_group_model_endpoints
+from functions_group import get_group_model_endpoints, require_active_group, update_active_group_for_user
from functions_settings import *
from swagger_wrapper import swagger_route, get_auth_security
@@ -18,7 +18,10 @@ def group_workspaces():
settings = get_settings()
user_settings = get_user_settings(user_id)
public_settings = sanitize_settings_for_user(settings)
- active_group_id = user_settings.get("settings", {}).get("activeGroupOid")
+ try:
+ active_group_id = require_active_group(user_id)
+ except (ValueError, LookupError, PermissionError):
+ active_group_id = None
enable_document_classification = settings.get('enable_document_classification', False)
enable_file_sharing = settings.get('enable_file_sharing', False)
enable_extract_meta_data = settings.get('enable_extract_meta_data', False)
@@ -97,7 +100,12 @@ def set_active_group():
group_id = request.form.get("group_id")
if not user_id or not group_id:
return "Missing user or group id", 400
- success = update_user_settings(user_id, {"activeGroupOid": group_id})
- if not success:
- return "Failed to update user settings", 500
+
+ try:
+ update_active_group_for_user(group_id, user_id=user_id)
+ except LookupError:
+ return "Group not found", 404
+ except PermissionError:
+ return "You are not a member of this group", 403
+
return redirect(url_for('group_workspaces'))
diff --git a/application/single_app/route_frontend_public_workspaces.py b/application/single_app/route_frontend_public_workspaces.py
index 05d5b982a..fa1178d64 100644
--- a/application/single_app/route_frontend_public_workspaces.py
+++ b/application/single_app/route_frontend_public_workspaces.py
@@ -2,6 +2,7 @@
from config import *
from functions_authentication import *
+from functions_public_workspaces import update_active_public_workspace_for_user
from functions_settings import *
from swagger_wrapper import swagger_route, get_auth_security
@@ -116,7 +117,12 @@ def set_active_public_workspace():
workspace_id = request.form.get("workspace_id")
if not user_id or not workspace_id:
return "Missing user or workspace id", 400
- success = update_user_settings(user_id, {"activePublicWorkspaceOid": workspace_id})
- if not success:
- return "Failed to update user settings", 500
+
+ try:
+ update_active_public_workspace_for_user(user_id, workspace_id)
+ except LookupError:
+ return "Workspace not found", 404
+ except PermissionError:
+ return "You do not have access to this workspace", 403
+
return redirect(url_for('public_workspaces'))
\ No newline at end of file
diff --git a/application/single_app/route_plugin_logging.py b/application/single_app/route_plugin_logging.py
index 940d540ea..69b7ce35b 100644
--- a/application/single_app/route_plugin_logging.py
+++ b/application/single_app/route_plugin_logging.py
@@ -4,7 +4,7 @@
"""
from flask import Blueprint, jsonify, request
-from functions_authentication import login_required, get_current_user_id
+from functions_authentication import admin_required, login_required, get_current_user_id
from functions_appinsights import log_event
from semantic_kernel_plugins.plugin_invocation_logger import get_plugin_logger
from swagger_wrapper import swagger_route, get_auth_security
@@ -122,10 +122,10 @@ def get_plugin_stats():
security=get_auth_security()
)
@login_required
+@admin_required
def get_recent_invocations():
"""Get the most recent plugin invocations across all users (admin only)."""
try:
- # Note: You might want to add admin role checking here
plugin_logger = get_plugin_logger()
limit = request.args.get('limit', 20, type=int)
@@ -220,6 +220,7 @@ def get_plugin_specific_invocations(plugin_name):
security=get_auth_security()
)
@login_required
+@admin_required
def clear_plugin_logs():
"""Clear plugin invocation logs (admin only or for testing)."""
try:
diff --git a/application/single_app/semantic_kernel_loader.py b/application/single_app/semantic_kernel_loader.py
index 3a2ca4b54..7bdddda2f 100644
--- a/application/single_app/semantic_kernel_loader.py
+++ b/application/single_app/semantic_kernel_loader.py
@@ -36,7 +36,16 @@
from semantic_kernel_plugins.smart_http_plugin import SmartHttpPlugin
from functions_debug import debug_print
from flask import g
-from functions_keyvault import SecretReturnType, keyvault_model_endpoint_get_helper, retrieve_secret_from_key_vault, retrieve_secret_from_key_vault_by_full_name, validate_secret_name_dynamic
+from functions_keyvault import (
+ SQL_PLUGIN_SENSITIVE_ADDITIONAL_FIELDS,
+ SQL_PLUGIN_SENSITIVE_AUTH_FIELDS,
+ SecretReturnType,
+ keyvault_model_endpoint_get_helper,
+ resolve_secret_reference_for_context,
+ retrieve_secret_from_key_vault,
+ retrieve_secret_from_key_vault_by_full_name,
+ validate_secret_name_dynamic,
+)
from functions_global_actions import get_global_actions
from functions_global_agents import get_global_agents
from functions_group_agents import get_group_agent, get_group_agents
@@ -47,7 +56,8 @@
from functions_agent_payload import can_agent_use_default_multi_endpoint_model
from semantic_kernel_plugins.plugin_loader import discover_plugins
from semantic_kernel_plugins.openapi_plugin_factory import OpenApiPluginFactory
-from functions_agent_scope import find_agent_by_scope
+from config import cognitive_services_scope
+from functions_agent_scope import find_agent_by_scope, is_selected_agent_scope_enabled
import app_settings_cache
# Agent and Azure OpenAI chat service imports
@@ -278,6 +288,11 @@ def resolve_authority(auth_settings):
return custom_authority
return AzureAuthorityHosts.AZURE_PUBLIC_CLOUD
+ def resolve_aoai_scope():
+ # Fall back to the previous hardcoded default when no custom scope is configured.
+ scope = str(cognitive_services_scope or "").strip()
+ return scope or "https://cognitiveservices.azure.com/.default"
+
def resolve_foundry_scope(auth_settings, endpoint=None):
custom_scope = (auth_settings.get("foundry_scope") or "").strip()
if custom_scope:
@@ -317,9 +330,10 @@ def build_token_provider(auth_settings, provider="aoai", endpoint=None):
authority=authority,
)
- scope = "https://cognitiveservices.azure.com/.default"
if provider in ("aifoundry", "new_foundry"):
scope = resolve_foundry_scope(auth_settings, endpoint=endpoint)
+ else:
+ scope = resolve_aoai_scope()
return get_bearer_token_provider(credential, scope)
@@ -1595,6 +1609,27 @@ def create_chat_completion_service():
log_event(f"[SK Loader] load_single_agent_for_kernel completed - returning {len(agent_objs)} agents: {list(agent_objs.keys())}", level=logging.INFO)
return kernel, agent_objs
+def _get_plugin_secret_context(plugin_manifest):
+ """Infer the expected Key Vault scope for a plugin manifest."""
+ if not isinstance(plugin_manifest, dict):
+ return None, None
+
+ plugin_scope = str(plugin_manifest.get("scope") or "").strip().lower()
+ if plugin_scope == "group" or plugin_manifest.get("is_group"):
+ return plugin_manifest.get("group_id"), "group"
+ if plugin_scope == "global" or plugin_manifest.get("is_global"):
+ return plugin_manifest.get("id"), "global"
+ if plugin_scope == "user" or plugin_manifest.get("user_id"):
+ return plugin_manifest.get("user_id"), "user"
+ return plugin_manifest.get("id"), "global"
+
+
+def _is_sql_sensitive_plugin_field(plugin_manifest, field_name):
+ """Return True when an additional field should resolve as a SQL secret."""
+ plugin_type = str((plugin_manifest or {}).get("type") or "").strip().lower()
+ return plugin_type in {"sql_query", "sql_schema"} and field_name in SQL_PLUGIN_SENSITIVE_ADDITIONAL_FIELDS
+
+
def resolve_key_vault_secrets_in_plugins(plugin_manifest, settings):
"""
Resolve any Key Vault secrets in a plugin manifest.
@@ -1606,26 +1641,66 @@ def resolve_key_vault_secrets_in_plugins(plugin_manifest, settings):
if not kv_name:
raise ValueError("Key Vault name not configured in settings")
- def resolve_value(value):
- if isinstance(value, str) and validate_secret_name_dynamic(value):
- resolved = retrieve_secret_from_key_vault_by_full_name(value)
- if resolved:
- return resolved
- else:
- raise ValueError(f"Failed to retrieve secret '{value}' from Key Vault '{kv_name}'")
- return value
-
- resolved_manifest = {}
- for k, v in plugin_manifest.items():
- debug_print(f"[SK Loader] Resolving plugin manifest key: {k} with value type: {type(v)}")
- if isinstance(v, str):
- resolved_manifest[k] = resolve_value(v)
- elif isinstance(v, list):
- resolved_manifest[k] = [resolve_value(item) for item in v]
- elif isinstance(v, dict):
- resolved_manifest[k] = {sub_k: resolve_value(sub_v) for sub_k, sub_v in v.items()}
- else:
- resolved_manifest[k] = v # Leave other types unchanged
+ scope_value, scope = _get_plugin_secret_context(plugin_manifest)
+ resolved_manifest = dict(plugin_manifest)
+
+ auth = plugin_manifest.get("auth", {})
+ if isinstance(auth, dict):
+ resolved_auth = dict(auth)
+ for auth_field in ("key", *SQL_PLUGIN_SENSITIVE_AUTH_FIELDS):
+ value = auth.get(auth_field)
+ if not isinstance(value, str) or not validate_secret_name_dynamic(value):
+ continue
+ try:
+ resolved_auth[auth_field] = resolve_secret_reference_for_context(
+ value,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action"},
+ context_label=f"plugin auth field '{auth_field}'",
+ )
+ except ValueError as exc:
+ log_event(
+ f"[SK Loader] Blocked plugin auth secret resolution for field '{auth_field}': {exc}",
+ extra={
+ "plugin_name": plugin_manifest.get("name"),
+ "plugin_id": plugin_manifest.get("id"),
+ "scope": scope,
+ },
+ level=logging.WARNING,
+ )
+ resolved_auth[auth_field] = ""
+ resolved_manifest["auth"] = resolved_auth
+
+ additional_fields = plugin_manifest.get("additionalFields", {})
+ if isinstance(additional_fields, dict):
+ resolved_additional_fields = dict(additional_fields)
+ for field_name, value in additional_fields.items():
+ if not isinstance(value, str) or not validate_secret_name_dynamic(value):
+ continue
+ if not (field_name.endswith("__Secret") or _is_sql_sensitive_plugin_field(plugin_manifest, field_name)):
+ continue
+ try:
+ resolved_additional_fields[field_name] = resolve_secret_reference_for_context(
+ value,
+ scope_value=scope_value,
+ scope=scope,
+ allowed_sources={"action-addset"},
+ context_label=f"plugin additional field '{field_name}'",
+ )
+ except ValueError as exc:
+ log_event(
+ f"[SK Loader] Blocked plugin additionalField secret resolution for '{field_name}': {exc}",
+ extra={
+ "plugin_name": plugin_manifest.get("name"),
+ "plugin_id": plugin_manifest.get("id"),
+ "scope": scope,
+ },
+ level=logging.WARNING,
+ )
+ resolved_additional_fields[field_name] = ""
+ resolved_manifest["additionalFields"] = resolved_additional_fields
+
return resolved_manifest
def load_plugins_for_kernel(kernel, plugin_manifests, settings, mode_label="global"):
@@ -1897,24 +1972,34 @@ def load_user_semantic_kernel(kernel: Kernel, settings, user_id: str, redis_clie
# Append selected group agent (if any) to the candidate list so downstream selection logic can resolve it
selected_agent_data = selected_agent if isinstance(selected_agent, dict) else {}
+ selected_agent_is_global = selected_agent_data.get('is_global', False)
selected_agent_is_group = selected_agent_data.get('is_group', False)
selected_agent_group_id = selected_agent_data.get('group_id')
conversation_group_id = getattr(g, "conversation_group_id", None)
allow_user_agents = settings.get('allow_user_agents', False)
allow_group_agents = settings.get('allow_group_agents', False)
- if selected_agent_is_group and not allow_group_agents:
- log_event(
- "[SK Loader] Group agents are disabled; skipping group agent load.",
- level=logging.WARNING
- )
- load_core_plugins_only(kernel, settings)
- return kernel, None
- if not selected_agent_is_group and not allow_user_agents:
- log_event(
- "[SK Loader] User agents are disabled; skipping personal agent load.",
- level=logging.WARNING
- )
+ if not is_selected_agent_scope_enabled(settings, selected_agent_data):
+ if selected_agent_is_group:
+ log_event(
+ "[SK Loader] Group agents are disabled; skipping group agent load.",
+ level=logging.WARNING,
+ extra={
+ 'agent_name': selected_agent_data.get('name'),
+ 'allow_group_agents': allow_group_agents,
+ 'is_global': selected_agent_is_global,
+ }
+ )
+ else:
+ log_event(
+ "[SK Loader] User agents are disabled; skipping personal agent load.",
+ level=logging.WARNING,
+ extra={
+ 'agent_name': selected_agent_data.get('name'),
+ 'allow_user_agents': allow_user_agents,
+ 'is_global': selected_agent_is_global,
+ }
+ )
load_core_plugins_only(kernel, settings)
return kernel, None
diff --git a/application/single_app/semantic_kernel_plugins/fact_memory_plugin.py b/application/single_app/semantic_kernel_plugins/fact_memory_plugin.py
index 188d16fac..f38bd37ed 100644
--- a/application/single_app/semantic_kernel_plugins/fact_memory_plugin.py
+++ b/application/single_app/semantic_kernel_plugins/fact_memory_plugin.py
@@ -5,8 +5,12 @@
- Exposes methods for use as a Semantic Kernel plugin (does not need to derive from BasePlugin).
- Read/inject logic is handled separately by orchestration utility.
"""
+import logging
+from flask import g, has_request_context
from typing import Optional, List
+from functions_appinsights import log_event
+from functions_authentication import get_current_user_id
from semantic_kernel.functions import kernel_function
from semantic_kernel_fact_memory_store import FactMemoryStore
@@ -18,6 +22,82 @@ def __init__(self, store: Optional[FactMemoryStore] = None):
self.store = store or FactMemoryStore()
auto_wrap_plugin_functions(self, self.__class__.__name__)
+ def _get_authorized_fact_memory_scope(self) -> dict:
+ """Return the canonical request-scoped fact-memory authorization boundary."""
+ if not has_request_context():
+ raise PermissionError('Fact memory requires an active request context.')
+
+ current_user_id = str(get_current_user_id() or '').strip()
+ if not current_user_id:
+ raise PermissionError('User not authenticated.')
+
+ authorized_context = dict(getattr(g, 'authorized_chat_context', {}) or {})
+ authorized_user_id = str(authorized_context.get('user_id') or current_user_id).strip()
+ if authorized_user_id != current_user_id:
+ authorized_user_id = current_user_id
+
+ authorized_scope_id = str(
+ authorized_context.get('fact_memory_scope_id')
+ or authorized_context.get('active_group_id')
+ or current_user_id
+ ).strip()
+ authorized_scope_type = str(
+ authorized_context.get('fact_memory_scope_type')
+ or ('group' if authorized_context.get('active_group_id') else 'user')
+ ).strip().lower()
+ if authorized_scope_type not in {'user', 'group'}:
+ authorized_scope_type = 'user'
+
+ authorized_conversation_id = str(
+ authorized_context.get('conversation_id') or getattr(g, 'conversation_id', '') or ''
+ ).strip() or None
+
+ return {
+ 'user_id': authorized_user_id,
+ 'scope_id': authorized_scope_id,
+ 'scope_type': authorized_scope_type,
+ 'conversation_id': authorized_conversation_id,
+ }
+
+ def _resolve_authorized_fact_memory_call(
+ self,
+ scope_type: str = '',
+ scope_id: str = '',
+ conversation_id: str = '',
+ ) -> dict:
+ """Normalize tool-call scope arguments against the authorized request scope."""
+ authorized_scope = self._get_authorized_fact_memory_scope()
+ requested_scope_type = str(scope_type or '').strip().lower()
+ requested_scope_id = str(scope_id or '').strip()
+ requested_conversation_id = str(conversation_id or '').strip()
+
+ if (
+ (requested_scope_type and requested_scope_type != authorized_scope['scope_type'])
+ or (requested_scope_id and requested_scope_id != authorized_scope['scope_id'])
+ ):
+ log_event(
+ '[FactMemoryPlugin] Overriding mismatched fact-memory scope in tool call.',
+ extra={
+ 'requested_scope_type': requested_scope_type,
+ 'requested_scope_id': requested_scope_id,
+ 'authorized_scope_type': authorized_scope['scope_type'],
+ 'authorized_scope_id': authorized_scope['scope_id'],
+ },
+ level=logging.WARNING,
+ )
+
+ if requested_conversation_id and requested_conversation_id != authorized_scope['conversation_id']:
+ log_event(
+ '[FactMemoryPlugin] Overriding mismatched fact-memory conversation_id in tool call.',
+ extra={
+ 'requested_conversation_id': requested_conversation_id,
+ 'authorized_conversation_id': authorized_scope['conversation_id'],
+ },
+ level=logging.WARNING,
+ )
+
+ return authorized_scope
+
@kernel_function(
description="""
Store a fact for the given agent, scope, and conversation.
@@ -39,11 +119,16 @@ def set_fact(self, scope_type: str, scope_id: str, value: str, conversation_id:
"""
Store a fact for the given agent, scope, and conversation.
"""
- return self.store.set_fact(
+ authorized_scope = self._resolve_authorized_fact_memory_call(
scope_type=scope_type,
scope_id=scope_id,
- value=value,
conversation_id=conversation_id,
+ )
+ return self.store.set_fact(
+ scope_type=authorized_scope['scope_type'],
+ scope_id=authorized_scope['scope_id'],
+ value=value,
+ conversation_id=authorized_scope['conversation_id'],
agent_id=agent_id,
memory_type=memory_type,
)
@@ -56,8 +141,9 @@ def update_fact(self, scope_id: str, fact_id: str, value: str, memory_type: str
"""
Update a fact value by its unique id and scope_id partition key.
"""
+ authorized_scope = self._resolve_authorized_fact_memory_call(scope_id=scope_id)
update_kwargs = {
- 'scope_id': scope_id,
+ 'scope_id': authorized_scope['scope_id'],
'fact_id': fact_id,
'value': value,
}
@@ -77,8 +163,9 @@ def delete_fact(self, scope_id: str, fact_id: str) -> bool:
"""
Delete a fact by its unique id and the scope_id which is the partition key.
"""
+ authorized_scope = self._resolve_authorized_fact_memory_call(scope_id=scope_id)
return self.store.delete_fact(
- scope_id=scope_id,
+ scope_id=authorized_scope['scope_id'],
fact_id=fact_id
)
@@ -100,7 +187,11 @@ def get_facts(self, scope_type: str, scope_id: str,) -> List[dict]:
"""
Retrieve all facts for the user. Facts are persistent values that provide important context, background knowledge, or user preferences to the AI agent. Use this to get all facts that will be injected as context for the agent.
"""
- return self.store.get_facts(
+ authorized_scope = self._resolve_authorized_fact_memory_call(
scope_type=scope_type,
scope_id=scope_id,
)
+ return self.store.get_facts(
+ scope_type=authorized_scope['scope_type'],
+ scope_id=authorized_scope['scope_id'],
+ )
diff --git a/application/single_app/semantic_kernel_plugins/log_analytics_plugin.py b/application/single_app/semantic_kernel_plugins/log_analytics_plugin.py
index f98c0efaa..7c5a69326 100644
--- a/application/single_app/semantic_kernel_plugins/log_analytics_plugin.py
+++ b/application/single_app/semantic_kernel_plugins/log_analytics_plugin.py
@@ -193,7 +193,6 @@ def _generate_metadata(self) -> Dict[str, Any]:
"description": "Run a KQL (Kusto Query Language) query against the Log Analytics workspace and return the results. Results are chunked for LLMs if needed. Accepts an optional timespan parameter (timedelta, tuple, or hours).",
"parameters": [
{"name": "query", "type": "string", "description": "The KQL query string to execute.", "required": True},
- {"name": "user_id", "type": "string", "description": "User ID for query history tracking (optional).", "required": False},
{"name": "timespan", "type": "any", "description": "Query timespan: timedelta, (start, end) tuple, or number of hours (optional).", "required": False}
],
"returns": {"type": "list[object]", "description": "A list of result rows, each as a dictionary of column values."}
@@ -210,8 +209,7 @@ def _generate_metadata(self) -> Dict[str, Any]:
"name": "get_query_history",
"description": "Return the last N queries run by this plugin instance for the current user. Useful for re-running or editing previous queries.",
"parameters": [
- {"name": "limit", "type": "integer", "description": "Number of queries to return (default 20).", "required": False},
- {"name": "user_id", "type": "string", "description": "User ID for query history tracking (optional).", "required": False}
+ {"name": "limit", "type": "integer", "description": "Number of queries to return (default 20).", "required": False}
],
"returns": {"type": "list[string]", "description": "A list of previous KQL queries, most recent last."}
},
@@ -228,6 +226,21 @@ def _generate_metadata(self) -> Dict[str, Any]:
def get_functions(self) -> List[str]:
return [m["name"] for m in self._metadata["methods"]]
+
+ def _get_authenticated_history_user_id(self) -> Optional[str]:
+ """Return the authenticated user id for query-history persistence."""
+ try:
+ from application.single_app.functions_authentication import get_current_user_id
+ except ImportError:
+ from functions_authentication import get_current_user_id
+
+ try:
+ user_id = str(get_current_user_id() or "").strip()
+ except Exception as exc:
+ logging.warning(f"[LA] Could not resolve authenticated user for query history: {exc}")
+ return None
+
+ return user_id or None
@plugin_function_logger("LogAnalyticsPlugin")
@kernel_function(description="Return a dictionary of all tables and their schemas (column names and types, including Properties virtual columns) in the connected Azure Log Analytics workspace. This combines list_tables and get_table_schema for efficient schema discovery.")
@@ -394,14 +407,13 @@ def col_name(col):
return schema
@plugin_function_logger("LogAnalyticsPlugin")
- @kernel_function(description="Execute a KQL (Kusto Query Language) query against a specific table in the Log Analytics workspace and return the results as a list of rows (each as a dictionary of column values). Use this function after discovering available tables and their schemas to retrieve data. Accepts an optional timespan parameter to limit the query window as a timedelta, tuple of datetimes, or number of hours. Limitations on returns should be specified in the query (ex: take N). Always provide user_id to enable saving the query to Cosmos DB for user history tracking.")
+ @kernel_function(description="Execute a KQL (Kusto Query Language) query against a specific table in the Log Analytics workspace and return the results as a list of rows (each as a dictionary of column values). Use this function after discovering available tables and their schemas to retrieve data. Accepts an optional timespan parameter to limit the query window as a timedelta, tuple of datetimes, or number of hours. Limitations on returns should be specified in the query (ex: take N).")
def run_query(
self,
query: str,
- user_id: Optional[str] = None,
timespan: Optional[Any] = None
) -> Any:
- logging.debug(f"[LA] Running query: {query} with user_id={user_id}, timespan={timespan}")
+ logging.debug(f"[LA] Running query: {query} with timespan={timespan}")
if not self._client:
raise RuntimeError("Log Analytics client not initialized.")
# Determine if this is a control command (starts with '.')
@@ -477,9 +489,9 @@ def col_name(col):
logging.error(f"[LA] Error processing query results: {e}")
return {"error": "Failed to process query results."}
finally:
- # Save to Cosmos query history if user_id is provided
- if user_id:
- self._save_query_history_to_cosmos(user_id, query)
+ history_user_id = self._get_authenticated_history_user_id()
+ if history_user_id:
+ self._save_query_history_to_cosmos(history_user_id, query)
@plugin_function_logger("LogAnalyticsPlugin")
@kernel_function(description="Summarize a result set for LLM consumption, including row count and column names.")
@@ -492,7 +504,8 @@ def summarize_results(self, results: List[Dict[str, Any]]) -> str:
@plugin_function_logger("LogAnalyticsPlugin")
@kernel_function(description="Return the last N queries run by this plugin instance. They should be numbered for the user to allow easy selection.")
- def get_query_history(self, limit: int = 20, user_id: Optional[str] = None) -> List[str]:
+ def get_query_history(self, limit: int = 20) -> List[str]:
+ user_id = self._get_authenticated_history_user_id()
if not user_id:
return []
return self._get_query_history_from_cosmos(user_id, limit)
diff --git a/application/single_app/semantic_kernel_plugins/tabular_processing_plugin.py b/application/single_app/semantic_kernel_plugins/tabular_processing_plugin.py
index 344d092a5..f76edff81 100644
--- a/application/single_app/semantic_kernel_plugins/tabular_processing_plugin.py
+++ b/application/single_app/semantic_kernel_plugins/tabular_processing_plugin.py
@@ -15,11 +15,15 @@
import re
import warnings
import pandas
+from flask import g, has_request_context
from typing import Annotated, Dict, List, Optional, Set
from urllib.parse import urlsplit, urlunsplit
from semantic_kernel.functions import kernel_function
from semantic_kernel_plugins.plugin_invocation_logger import plugin_function_logger
from functions_appinsights import log_event
+from functions_authentication import get_current_user_id
+from functions_group import find_group_by_id, get_user_role_in_group
+from functions_public_workspaces import get_user_visible_public_workspace_ids_from_settings
from config import (
CLIENTS,
TABULAR_EXTENSIONS,
@@ -187,6 +191,179 @@ def _get_blob_service_client(self):
raise RuntimeError("Blob storage client not available. Enhanced citations must be enabled.")
return client
+ def _get_authorized_chat_context(self) -> dict:
+ """Return the canonical request-scoped authorization context for tabular access."""
+ if not has_request_context():
+ raise PermissionError('Tabular processing requires an active request context.')
+
+ current_user_id = str(get_current_user_id() or '').strip()
+ if not current_user_id:
+ raise PermissionError('User not authenticated.')
+
+ authorized_context = dict(getattr(g, 'authorized_chat_context', {}) or {})
+ authorized_user_id = str(authorized_context.get('user_id') or current_user_id).strip()
+ if authorized_user_id != current_user_id:
+ # Never trust a stored context user id over the freshly authenticated one.
+ authorized_user_id = current_user_id
+
+ authorized_conversation_id = str(
+ authorized_context.get('conversation_id') or getattr(g, 'conversation_id', '') or ''
+ ).strip()
+ if not authorized_conversation_id:
+ raise PermissionError('Conversation context unavailable for tabular processing.')
+
+ active_group_ids = [
+ str(group_id or '').strip()
+ for group_id in (authorized_context.get('active_group_ids') or [])
+ if str(group_id or '').strip()
+ ]
+ active_public_workspace_ids = [
+ str(workspace_id or '').strip()
+ for workspace_id in (authorized_context.get('active_public_workspace_ids') or [])
+ if str(workspace_id or '').strip()
+ ]
+
+ return {
+ 'user_id': authorized_user_id,
+ 'conversation_id': authorized_conversation_id,
+ 'active_group_ids': active_group_ids,
+ 'active_group_id': str(authorized_context.get('active_group_id') or '').strip() or None,
+ 'active_public_workspace_ids': active_public_workspace_ids,
+ 'active_public_workspace_id': (
+ str(authorized_context.get('active_public_workspace_id') or '').strip() or None
+ ),
+ }
+
+ def _resolve_authorized_scope_arguments(
+ self,
+ user_id: str,
+ conversation_id: str,
+ group_id: Optional[str] = None,
+ public_workspace_id: Optional[str] = None,
+ ) -> dict:
+ """Normalize tool-call scope arguments against the current authorized request context."""
+ authorized_context = self._get_authorized_chat_context()
+ requested_user_id = str(user_id or '').strip()
+ requested_conversation_id = str(conversation_id or '').strip()
+ requested_group_id = str(group_id or '').strip()
+ requested_public_workspace_id = str(public_workspace_id or '').strip()
+
+ if requested_user_id and requested_user_id != authorized_context['user_id']:
+ log_event(
+ '[TabularProcessingPlugin] Ignoring mismatched user_id in tool call.',
+ extra={
+ 'requested_user_id': requested_user_id,
+ 'authorized_user_id': authorized_context['user_id'],
+ },
+ level=logging.WARNING,
+ )
+
+ if requested_conversation_id and requested_conversation_id != authorized_context['conversation_id']:
+ log_event(
+ '[TabularProcessingPlugin] Ignoring mismatched conversation_id in tool call.',
+ extra={
+ 'requested_conversation_id': requested_conversation_id,
+ 'authorized_conversation_id': authorized_context['conversation_id'],
+ },
+ level=logging.WARNING,
+ )
+
+ resolved_group_id = None
+ if requested_group_id:
+ if not self._is_authorized_group_scope(
+ authorized_context['user_id'],
+ requested_group_id,
+ authorized_context=authorized_context,
+ ):
+ raise PermissionError('Tabular processing cannot access that group scope.')
+ resolved_group_id = requested_group_id
+
+ resolved_public_workspace_id = None
+ if requested_public_workspace_id:
+ if not self._is_authorized_public_workspace_scope(
+ authorized_context['user_id'],
+ requested_public_workspace_id,
+ authorized_context=authorized_context,
+ ):
+ raise PermissionError('Tabular processing cannot access that public workspace scope.')
+ resolved_public_workspace_id = requested_public_workspace_id
+
+ authorized_context['group_id'] = resolved_group_id
+ authorized_context['public_workspace_id'] = resolved_public_workspace_id
+ return authorized_context
+
+ def _is_authorized_group_scope(
+ self,
+ user_id: str,
+ group_id: str,
+ authorized_context: Optional[dict] = None,
+ ) -> bool:
+ """Return True when the current user may access the requested group scope."""
+ normalized_group_id = str(group_id or '').strip()
+ if not normalized_group_id:
+ return False
+
+ if authorized_context and normalized_group_id in set(authorized_context.get('active_group_ids') or []):
+ return True
+
+ group_doc = find_group_by_id(normalized_group_id)
+ return bool(group_doc and get_user_role_in_group(group_doc, user_id))
+
+ def _is_authorized_public_workspace_scope(
+ self,
+ user_id: str,
+ public_workspace_id: str,
+ authorized_context: Optional[dict] = None,
+ ) -> bool:
+ """Return True when the current user may access the requested public workspace scope."""
+ normalized_public_workspace_id = str(public_workspace_id or '').strip()
+ if not normalized_public_workspace_id:
+ return False
+
+ if authorized_context and normalized_public_workspace_id in set(
+ authorized_context.get('active_public_workspace_ids') or []
+ ):
+ return True
+
+ visible_public_workspace_ids = set(
+ get_user_visible_public_workspace_ids_from_settings(user_id) or []
+ )
+ return normalized_public_workspace_id in visible_public_workspace_ids
+
+ def _is_authorized_blob_location(self, container_name: str, blob_path: str, authorized_context: dict) -> bool:
+ """Ensure remembered blob locations still fall within the caller's authorized request scope."""
+ source = self._infer_source_from_container(container_name)
+ blob_parts = [part for part in str(blob_path or '').split('/') if part]
+ if not source or not blob_parts:
+ return False
+
+ if source == 'workspace':
+ return blob_parts[0] == authorized_context['user_id']
+
+ if source == 'chat':
+ return (
+ len(blob_parts) >= 2
+ and blob_parts[0] == authorized_context['user_id']
+ and blob_parts[1] == authorized_context['conversation_id']
+ )
+
+ if source == 'group':
+ return self._is_authorized_group_scope(
+ authorized_context['user_id'],
+ blob_parts[0],
+ authorized_context=authorized_context,
+ )
+
+ if source == 'public':
+ return self._is_authorized_public_workspace_scope(
+ authorized_context['user_id'],
+ blob_parts[0],
+ authorized_context=authorized_context,
+ )
+
+ return False
+
def _list_tabular_blobs(self, container_name: str, prefix: str) -> List[str]:
"""List all tabular file blobs under a given prefix."""
client = self._get_blob_service_client()
@@ -2754,9 +2931,25 @@ def _resolve_blob_location(self, user_id: str, conversation_id: str, filename: s
def _resolve_blob_location_with_fallback(self, user_id: str, conversation_id: str, filename: str, source: str,
group_id: str = None, public_workspace_id: str = None) -> tuple:
"""Try primary source first, then fall back to other containers if blob not found."""
+ authorized_context = self._resolve_authorized_scope_arguments(
+ user_id,
+ conversation_id,
+ group_id=group_id,
+ public_workspace_id=public_workspace_id,
+ )
+ user_id = authorized_context['user_id']
+ conversation_id = authorized_context['conversation_id']
+ group_id = authorized_context['group_id']
+ public_workspace_id = authorized_context['public_workspace_id']
source = source.lower().strip()
+
+ if source == 'group' and not group_id:
+ group_id = authorized_context['active_group_id']
+ if source == 'public' and not public_workspace_id:
+ public_workspace_id = authorized_context['active_public_workspace_id']
+
override = self._get_resolved_blob_location_override(source, filename)
- if override:
+ if override and self._is_authorized_blob_location(override[0], override[1], authorized_context):
return override
attempts = []
@@ -2811,6 +3004,29 @@ async def list_tabular_files(
public_workspace_id: Annotated[Optional[str], "Public workspace ID (for public workspace documents)"] = None,
) -> Annotated[str, "JSON list of available tabular files"]:
"""List all tabular files available for the user across all accessible containers."""
+ try:
+ authorized_context = self._resolve_authorized_scope_arguments(
+ user_id,
+ conversation_id,
+ group_id=group_id,
+ public_workspace_id=public_workspace_id,
+ )
+ except PermissionError as exc:
+ log_event(
+ f"[TabularProcessingPlugin] Denied tabular file listing: {exc}",
+ level=logging.WARNING,
+ extra={
+ 'requested_group_id': group_id,
+ 'requested_public_workspace_id': public_workspace_id,
+ },
+ )
+ return json.dumps({"error": str(exc)})
+
+ user_id = authorized_context['user_id']
+ conversation_id = authorized_context['conversation_id']
+ group_id = authorized_context['group_id']
+ public_workspace_id = authorized_context['public_workspace_id']
+
def _sync_work():
results = []
try:
diff --git a/application/single_app/static/css/chats.css b/application/single_app/static/css/chats.css
index 1ab7b2829..6474185b0 100644
--- a/application/single_app/static/css/chats.css
+++ b/application/single_app/static/css/chats.css
@@ -13,11 +13,67 @@
margin: 0 !important;
}
/* Fix for document dropdown visibility */
+#scope-dropdown .dropdown-menu.show,
+#tags-dropdown .dropdown-menu.show,
#document-dropdown .dropdown-menu.show {
display: block !important;
opacity: 1 !important;
visibility: visible !important;
- z-index: 1050 !important; /* Ensure it's above other elements */
+ z-index: 1060 !important; /* Ensure it's above other elements */
+}
+
+.chat-shell-header {
+ display: flex;
+ align-items: center;
+ justify-content: space-between;
+ gap: 0.75rem;
+}
+
+.chat-shell-header-content {
+ gap: 0.25rem;
+ min-width: 0;
+}
+
+.chat-shell-title-row {
+ gap: 0.5rem;
+ justify-content: space-between;
+ min-width: 0;
+}
+
+.chat-shell-title-actions {
+ align-items: center;
+ display: flex;
+ flex: 0 0 auto;
+}
+
+.chat-shell-icon-button {
+ align-items: center;
+ border-radius: 999px;
+ display: inline-flex;
+ height: 2.25rem;
+ justify-content: center;
+ padding: 0;
+ width: 2.25rem;
+}
+
+.chat-shell-status-row {
+ min-height: 1.5rem;
+}
+
+.chat-shell-classifications {
+ min-width: 0;
+}
+
+#chat-sidebar-inline-toggle {
+ border-radius: 999px;
+ flex: 0 0 auto;
+ height: 2.25rem;
+ padding: 0;
+ width: 2.25rem;
+}
+
+body.sidebar-collapsed #chat-sidebar-inline-toggle {
+ color: var(--bs-primary);
}
.chat-searchable-select .dropdown-menu.show {
@@ -27,6 +83,10 @@
z-index: 1050 !important;
}
+.chat-toolbar-mobile-panel .chat-searchable-select .dropdown-menu.show {
+ z-index: 1060 !important;
+}
+
/* Handle dropdown positioning at the edge of viewport */
#document-dropdown.dropup .dropdown-menu {
bottom: 100% !important;
@@ -46,21 +106,78 @@
right: auto !important; /* Prevent right positioning */
}
+.chat-search-panel {
+ border-radius: 0.5rem;
+}
+
+.chat-search-panel-grid {
+ align-items: flex-end;
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+}
+
+.chat-search-panel-field {
+ min-width: 0;
+}
+
+.chat-search-dropdown-button-content {
+ align-items: center;
+ display: inline-flex;
+ gap: 0.5rem;
+ min-width: 0;
+}
+
+.chat-search-dropdown-loading-spinner {
+ flex: 0 0 auto;
+ height: 0.875rem;
+ width: 0.875rem;
+}
+
+@media (min-width: 992px) {
+ .chat-search-panel .offcanvas-header {
+ display: none;
+ }
+}
+
.chat-searchable-select {
min-width: 120px;
}
.chat-toolbar {
display: flex;
- flex-wrap: nowrap;
+ flex-wrap: wrap;
align-items: flex-end;
gap: 0.75rem;
}
+.chat-toolbar-primary-row {
+ align-items: center;
+ display: flex;
+ flex: 1 1 auto;
+ min-width: 0;
+}
+
+#chat-toolbar-desktop-tools-slot,
+.chat-toolbar-selectors-slot {
+ display: flex;
+ min-width: 0;
+}
+
+#chat-toolbar-desktop-tools-slot {
+ flex: 1 1 auto;
+}
+
+.chat-toolbar-action-rail {
+ flex: 1 1 auto;
+ min-width: 0;
+}
+
.chat-toolbar-actions,
.chat-toolbar-controls,
.chat-toolbar-toggles,
-.chat-toolbar-selectors {
+.chat-toolbar-selectors,
+.chat-toolbar-secondary-panel {
display: flex;
align-items: center;
gap: 0.5rem;
@@ -69,7 +186,13 @@
.chat-toolbar-actions {
flex: 1 1 auto;
- flex-wrap: wrap;
+ flex-wrap: nowrap;
+ overflow-x: auto;
+ padding-bottom: 0.125rem;
+}
+
+.chat-toolbar-actions .btn {
+ flex: 0 0 auto;
}
.chat-toolbar-controls {
@@ -77,9 +200,57 @@
flex-wrap: nowrap;
justify-content: flex-end;
align-items: flex-end;
+ gap: 0.75rem;
margin-left: auto;
}
+.chat-toolbar-primary-selector {
+ flex: 0 1 230px;
+ max-width: 230px;
+ min-width: 180px;
+}
+
+.chat-toolbar-primary-selector .chat-toolbar-selector {
+ flex: 1 1 auto;
+ max-width: none;
+ min-width: 0;
+}
+
+.chat-toolbar-primary-surface {
+ align-items: center;
+ display: flex;
+ flex: 1 1 auto;
+ min-width: 0;
+}
+
+.chat-toolbar-primary-surface .chat-toolbar-selector {
+ flex: 1 1 auto;
+ max-width: none;
+ min-width: 0;
+}
+
+.chat-toolbar-tools-surface {
+ flex: 1 1 auto;
+ flex-wrap: nowrap;
+ justify-content: flex-start;
+}
+
+.chat-toolbar-selectors-slot {
+ flex: 0 1 auto;
+}
+
+.chat-toolbar-mobile-panel {
+ --bs-offcanvas-height: min(26rem, 70vh);
+}
+
+.chat-mobile-tools-toggle {
+ align-items: center;
+ border-radius: 999px;
+ flex: 0 0 auto;
+ gap: 0.35rem;
+ white-space: nowrap;
+}
+
.chat-toolbar-toggles {
flex: 0 0 auto;
flex-wrap: nowrap;
@@ -188,21 +359,22 @@
align-items: center;
}
- .chat-toolbar-actions {
- flex: 1 1 100%;
- }
-
.chat-toolbar-controls {
flex: 1 1 100%;
flex-wrap: wrap;
- justify-content: flex-start;
+ justify-content: space-between;
margin-left: 0;
}
+ .chat-toolbar-primary-selector {
+ flex: 1 1 220px;
+ max-width: 280px;
+ }
+
.chat-toolbar-selectors {
flex: 1 1 auto;
flex-wrap: wrap;
- justify-content: flex-start;
+ justify-content: flex-end;
}
.chat-toolbar-toggles {
@@ -210,33 +382,202 @@
}
}
-@media (max-width: 768px) {
- .chat-toolbar {
+@media (min-width: 992px) {
+ .chat-toolbar-mobile-panel {
+ display: none !important;
+ }
+}
+
+@media (max-width: 991.98px) {
+ .chat-search-panel {
+ border-radius: 0;
+ height: 100vh;
+ margin-bottom: 0 !important;
+ max-width: min(92vw, 24rem);
+ }
+
+ .chat-search-panel .offcanvas-header {
+ border-bottom: 1px solid var(--bs-border-color);
+ }
+
+ .chat-search-panel .offcanvas-body {
+ padding: 1rem;
+ }
+
+ .chat-search-panel-mobile-footer {
+ align-items: center;
+ border-top: 1px solid var(--bs-border-color);
+ display: flex;
+ justify-content: flex-end;
+ padding: 0.75rem 1rem calc(env(safe-area-inset-bottom, 0px) + 1rem);
+ }
+
+ .chat-search-panel-mobile-footer .btn-close {
+ margin-left: auto;
+ }
+
+ .chat-search-panel-grid {
+ align-items: stretch;
+ display: grid;
+ gap: 0.75rem;
+ }
+
+ #document-dropdown .dropdown-menu,
+ #scope-dropdown-menu,
+ #tags-dropdown-menu {
+ max-width: none;
+ min-width: 0;
+ width: 100% !important;
+ }
+
+ .chat-shell-header {
+ padding: 0.875rem 1rem !important;
+ }
+
+ .chat-shell-title-row {
+ align-items: flex-start;
+ }
+
+ .chat-shell-status-row {
flex-wrap: wrap;
+ }
+
+ .chat-toolbar {
align-items: center;
+ gap: 0.75rem;
+ }
+
+ .chat-toolbar-primary-row,
+ #chat-toolbar-desktop-tools-slot,
+ #chat-toolbar-desktop-primary-slot,
+ #chat-toolbar-desktop-selectors-slot {
+ display: none;
+ }
+
+ .chat-toolbar-mobile-panel.offcanvas-bottom {
+ border-top-left-radius: 1rem;
+ border-top-right-radius: 1rem;
+ box-shadow: 0 -0.75rem 2rem rgba(0, 0, 0, 0.2);
}
.chat-toolbar-controls {
- flex-basis: 100%;
- flex-wrap: wrap;
- justify-content: flex-start;
+ display: flex;
+ justify-content: flex-end;
margin-left: 0;
+ width: 100%;
}
- .chat-toolbar-selectors {
- flex-basis: 100%;
- flex-wrap: wrap;
+ .chat-mobile-tools-toggle {
+ margin-left: auto;
}
- .chat-toolbar-toggles {
+ .chat-toolbar-mobile-panel .offcanvas-header {
+ border-bottom: 1px solid var(--bs-border-color);
+ padding: 1rem 1rem 0.875rem;
+ }
+
+ .chat-toolbar-mobile-panel .offcanvas-body {
+ display: flex;
+ flex-direction: column;
+ gap: 1rem;
+ padding: 1rem;
+ }
+
+ .chat-toolbar-mobile-tools-slot,
+ .chat-toolbar-mobile-primary-slot,
+ .chat-toolbar-mobile-selectors-slot {
+ width: 100%;
+ }
+
+ .chat-toolbar-mobile-primary-slot .chat-toolbar-primary-surface {
+ width: 100%;
+ }
+
+ .chat-toolbar-mobile-tools-slot .chat-toolbar-tools-surface {
+ align-items: stretch;
+ display: flex;
+ flex-direction: column;
+ gap: 1rem;
+ }
+
+ .chat-toolbar-mobile-tools-slot .chat-toolbar-actions {
+ display: grid;
+ gap: 0.625rem;
+ grid-template-columns: minmax(0, 1fr);
+ overflow: visible;
+ padding-bottom: 0;
+ }
+
+ .chat-toolbar-mobile-tools-slot .chat-toolbar-actions .btn {
+ align-items: center;
+ border-radius: 0.85rem;
+ display: flex;
+ gap: 0.625rem;
+ justify-content: flex-start;
+ min-height: 2.75rem;
+ padding: 0.75rem 1rem;
width: 100%;
+ }
+
+ .chat-toolbar-mobile-tools-slot .search-btn-text,
+ .chat-toolbar-mobile-tools-slot .file-btn-text {
+ display: inline !important;
+ margin-left: 0.375rem;
+ opacity: 1;
+ width: auto;
+ }
+
+ .chat-toolbar-mobile-tools-slot .search-btn:not(.active) i,
+ .chat-toolbar-mobile-tools-slot .search-btn:not(.active) .search-btn-text,
+ .chat-toolbar-mobile-tools-slot .file-btn:not(.active) i,
+ .chat-toolbar-mobile-tools-slot .file-btn:not(.active) .file-btn-text {
+ color: var(--bs-emphasis-color);
+ }
+
+ .chat-toolbar-mobile-tools-slot .chat-toolbar-toggles {
flex-wrap: wrap;
+ justify-content: flex-start;
+ width: 100%;
+ }
+
+ .chat-toolbar-mobile-selectors-slot .chat-toolbar-selectors {
+ align-items: stretch;
+ flex-direction: column;
+ width: 100%;
}
+ .chat-toolbar-mobile-primary-slot .chat-toolbar-selector,
+ .chat-toolbar-mobile-selectors-slot .chat-toolbar-selector,
.chat-toolbar-selector {
flex: 1 1 100%;
- min-width: 0;
max-width: none;
+ min-width: 0;
+ width: 100%;
+ }
+
+ #search-documents-container .flex-shrink-0,
+ #search-documents-container .flex-grow-1 {
+ max-width: none !important;
+ min-width: 0 !important;
+ width: 100%;
+ }
+
+ #search-documents-container .dropdown,
+ #search-documents-container .form-select {
+ width: 100%;
+ }
+
+ .chat-mobile-tools-toggle[aria-expanded="true"] {
+ background-color: rgba(var(--bs-primary-rgb), 0.08);
+ border-color: rgba(var(--bs-primary-rgb), 0.35);
+ color: var(--bs-primary);
+ }
+
+ .chat-toolbar-actions,
+ .chat-toolbar-controls,
+ .chat-toolbar-selectors,
+ .chat-toolbar-toggles {
+ gap: 0.35rem;
}
}
@@ -513,6 +854,11 @@
--bs-tooltip-zindex: 1115;
}
+ #chat-toast-container {
+ top: 16px;
+ z-index: 1100;
+ }
+
.conversation-dropdown.tutorial-force-visible {
opacity: 1 !important;
}
@@ -1288,6 +1634,22 @@ a.citation-link:hover {
color: #000; /* Black text color */
}
+@media (max-width: 991.98px) {
+ #chat-toolbar-mobile-tools-slot .search-btn .search-btn-text,
+ #chat-toolbar-mobile-tools-slot .file-btn .file-btn-text {
+ display: inline !important;
+ margin-left: 0.375rem;
+ opacity: 1 !important;
+ overflow: visible;
+ width: auto !important;
+ }
+
+ #chat-toolbar-mobile-tools-slot .search-btn:not(.active) .search-btn-text,
+ #chat-toolbar-mobile-tools-slot .file-btn:not(.active) .file-btn-text {
+ color: var(--bs-emphasis-color);
+ }
+}
+
/* Message container */
.message {
display: flex;
@@ -1665,6 +2027,13 @@ ol {
/* Ensures elements align nicely at their bottom edges */
align-items: flex-end;
}
+
+@media (max-width: 991.98px) {
+ .chat-input-container {
+ margin-top: 0.25rem;
+ }
+}
+
#prompt-selection-container {
align-self: auto;
}
diff --git a/application/single_app/static/css/navigation.css b/application/single_app/static/css/navigation.css
index c561b19e4..3b17d9dc0 100644
--- a/application/single_app/static/css/navigation.css
+++ b/application/single_app/static/css/navigation.css
@@ -1,18 +1,212 @@
/* Top Navigation Styles */
+:root {
+ --top-nav-height: 66px;
+ --classification-banner-height: 40px;
+}
+
/* Base body padding for top navigation */
body {
- /* Default: navbar height only */
- padding-top: 56px;
+ padding-top: var(--top-nav-height);
overflow-x: hidden;
height: 100%;
}
+.top-nav-bar {
+ -webkit-backdrop-filter: blur(14px);
+ backdrop-filter: blur(14px);
+ box-shadow: 0 0.35rem 1.25rem rgba(15, 23, 42, 0.08);
+ z-index: 1046;
+}
+
+.top-nav-shell {
+ align-items: center;
+ display: flex;
+ gap: 1rem;
+ min-height: var(--top-nav-height);
+}
+
+.top-nav-brand {
+ flex: 0 1 auto;
+ min-width: 0;
+}
+
+.top-nav-brand-link {
+ margin-right: 0;
+ max-width: 100%;
+}
+
+.top-nav-brand-title {
+ display: inline-block;
+ max-width: min(24rem, 45vw);
+ overflow: hidden;
+ text-overflow: ellipsis;
+ white-space: nowrap;
+}
+
+.top-nav-primary-nav {
+ align-items: center;
+ flex-wrap: nowrap;
+ gap: 0.15rem;
+ min-width: 0;
+ overflow-x: auto;
+}
+
+.top-nav-primary-nav .nav-link {
+ border-radius: 999px;
+ padding-left: 0.9rem;
+ padding-right: 0.9rem;
+ white-space: nowrap;
+}
+
+.top-nav-chat-nav {
+ flex: 1 1 auto;
+}
+
+.top-nav-chat-nav .nav-link {
+ padding-left: 0.75rem;
+ padding-right: 0.75rem;
+}
+
+.top-nav-actions {
+ align-items: center;
+ display: flex;
+ flex: 0 0 auto;
+ gap: 0.75rem;
+ margin-left: auto;
+}
+
+.top-nav-mobile-toggle {
+ align-items: center;
+ border-radius: 999px;
+ display: inline-flex;
+ font-size: 1.15rem;
+ height: 2.5rem;
+ justify-content: center;
+ padding: 0;
+ width: 2.5rem;
+}
+
+.top-nav-mobile-drawer {
+ max-width: min(88vw, 22rem);
+}
+
+@media (max-width: 991.98px) {
+ .top-nav-mobile-drawer.offcanvas-start {
+ height: calc(100vh - var(--top-nav-height));
+ top: var(--top-nav-height);
+ }
+
+ body.has-classification-banner .top-nav-mobile-drawer.offcanvas-start {
+ height: calc(100vh - var(--top-nav-height) - var(--classification-banner-height));
+ top: calc(var(--top-nav-height) + var(--classification-banner-height));
+ }
+}
+
+.top-nav-mobile-drawer .offcanvas-header {
+ border-bottom: 1px solid rgba(0, 0, 0, 0.08);
+}
+
+.top-nav-mobile-section-label {
+ color: var(--bs-secondary-color, #6c757d);
+ font-size: 0.72rem;
+ font-weight: 700;
+ letter-spacing: 0.08em;
+ margin: 1rem 0 0.5rem;
+ text-transform: uppercase;
+}
+
+.top-nav-mobile-section-label:first-child {
+ margin-top: 0;
+}
+
+.top-nav-mobile-list .list-group-item {
+ align-items: center;
+ border: 0;
+ border-radius: 0.85rem;
+ display: flex;
+ font-weight: 500;
+ margin-bottom: 0.25rem;
+ padding: 0.85rem 0.95rem;
+}
+
+.top-nav-mobile-list .list-group-item:hover,
+.top-nav-mobile-list .list-group-item:focus-visible {
+ background: rgba(13, 110, 253, 0.08);
+}
+
+.top-nav-user-nav {
+ align-items: center;
+ flex-direction: row;
+}
+
+.top-nav-user-trigger {
+ align-items: center;
+ border-radius: 999px;
+ display: flex;
+ gap: 0.6rem;
+}
+
+.top-nav-profile-avatar {
+ height: 28px;
+ position: relative;
+ width: 28px;
+}
+
+.top-nav-profile-image {
+ height: 28px;
+ object-fit: cover;
+ width: 28px;
+}
+
+.top-nav-initials {
+ font-size: 1rem;
+}
+
+.top-nav-notification-badge {
+ border: 2px solid white;
+ bottom: -2px;
+ display: none;
+ font-size: 0.6rem;
+ font-weight: 700;
+ height: 18px;
+ line-height: 1;
+ padding: 0;
+ right: -2px;
+ width: 18px;
+}
+
+.top-nav-user-meta {
+ flex-direction: column;
+ justify-content: center;
+ line-height: 1;
+ min-width: 0;
+}
+
+.top-nav-user-name {
+ font-size: 1rem;
+}
+
+.top-nav-user-role {
+ font-size: 0.75rem;
+}
+
+.top-nav-user-menu {
+ min-width: 17rem;
+}
+
+.top-nav-menu-section {
+ font-size: 0.75rem;
+ font-weight: 600;
+ letter-spacing: 0.02em;
+ text-transform: uppercase;
+}
+
/* Navigation layout toggle styles */
.nav-layout-toggle {
+ align-items: center;
cursor: pointer;
display: flex;
- align-items: center;
height: 100%;
}
@@ -22,27 +216,27 @@ body {
/* Ensure nav-link containing nav layout toggle is aligned with other nav links */
.nav-item .nav-link.nav-layout-toggle {
- display: flex;
align-items: center;
+ display: flex;
height: 100%;
- padding-top: 0;
padding-bottom: 0;
padding-left: 1rem;
padding-right: 1rem;
+ padding-top: 0;
}
/* Ensure dropdown nav layout toggle aligns with other dropdown items */
.dropdown-item.nav-layout-toggle {
- display: flex;
align-items: center;
+ display: flex;
text-align: left;
}
/* Dark mode toggle styles within navigation */
.dark-mode-toggle {
+ align-items: center;
cursor: pointer;
display: flex;
- align-items: center;
height: 100%;
}
@@ -52,38 +246,84 @@ body {
/* Ensure nav-link containing dark mode toggle is aligned with other nav links */
.nav-item .nav-link.dark-mode-toggle {
- display: flex;
align-items: center;
+ display: flex;
height: 100%;
- padding-top: 0;
padding-bottom: 0;
padding-left: 1rem;
padding-right: 1rem;
+ padding-top: 0;
}
/* Ensure dropdown dark mode toggle aligns with other dropdown items */
.dropdown-item.dark-mode-toggle {
- display: flex;
align-items: center;
+ display: flex;
text-align: left;
}
/* Classification banner adjustments */
#classification-banner + nav.navbar.fixed-top {
- top: 40px; /* Height of the banner */
+ top: var(--classification-banner-height);
}
#classification-banner {
- height: 40px;
- line-height: 40px;
+ height: var(--classification-banner-height);
+ line-height: var(--classification-banner-height);
}
/* Dark mode navbar styling */
[data-bs-theme="dark"] .navbar-light {
- background-color: #343a40 !important;
+ background-color: rgba(33, 37, 41, 0.94) !important;
}
[data-bs-theme="dark"] .navbar-light .navbar-brand,
[data-bs-theme="dark"] .navbar-light .nav-link {
color: #e9ecef;
}
+
+[data-bs-theme="dark"] .top-nav-mobile-drawer {
+ background: #212529;
+ color: #f8f9fa;
+}
+
+[data-bs-theme="dark"] .top-nav-mobile-drawer .offcanvas-header {
+ border-bottom-color: rgba(255, 255, 255, 0.08);
+}
+
+[data-bs-theme="dark"] .top-nav-mobile-list .list-group-item {
+ background: transparent;
+ color: #f8f9fa;
+}
+
+[data-bs-theme="dark"] .top-nav-mobile-list .list-group-item:hover,
+[data-bs-theme="dark"] .top-nav-mobile-list .list-group-item:focus-visible {
+ background: rgba(255, 255, 255, 0.08);
+}
+
+@media (max-width: 991.98px) {
+ :root {
+ --top-nav-height: 64px;
+ }
+
+ .top-nav-shell {
+ gap: 0.5rem;
+ }
+
+ .top-nav-brand-link {
+ max-width: calc(100vw - 8.5rem);
+ }
+
+ .top-nav-brand-title {
+ max-width: min(14rem, 54vw);
+ }
+
+ .top-nav-user-trigger {
+ padding-left: 0.15rem;
+ padding-right: 0.15rem;
+ }
+
+ .top-nav-user-menu {
+ min-width: 15rem;
+ }
+}
diff --git a/application/single_app/static/css/sidebar.css b/application/single_app/static/css/sidebar.css
index 1b3f7eb81..ab7e652ce 100644
--- a/application/single_app/static/css/sidebar.css
+++ b/application/single_app/static/css/sidebar.css
@@ -15,12 +15,21 @@ body.sidebar-nav-enabled {
/* Sidebar container */
#sidebar-nav {
display: none;
+ transition: transform 0.3s ease;
+}
+
+#sidebar-nav.sidebar-collapsed {
+ transform: translateX(-100%);
}
body.sidebar-nav-enabled #sidebar-nav {
display: flex !important;
}
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav {
+ display: flex !important;
+}
+
/* Hide top navbar when sidebar is enabled */
body.sidebar-nav-enabled nav.navbar {
display: none !important;
@@ -53,14 +62,95 @@ body.has-classification-banner #sidebar-nav {
/* Chats top-nav layout: align the fixed sidebar just below the navbar */
nav.navbar.fixed-top + #sidebar-nav {
- top: 66px !important;
- height: calc(100vh - 66px);
+ top: var(--top-nav-height) !important;
+ height: calc(100vh - var(--top-nav-height));
}
/* Account for classification banner when present */
body.has-classification-banner nav.navbar + #sidebar-nav {
- top: 98px !important;
- height: calc(100vh - 98px);
+ top: calc(var(--top-nav-height) + var(--classification-banner-height)) !important;
+ height: calc(100vh - var(--top-nav-height) - var(--classification-banner-height));
+}
+
+body.chat-top-nav-shell {
+ --chat-sidebar-top: var(--top-nav-height);
+}
+
+body.chat-top-nav-shell.has-classification-banner {
+ --chat-sidebar-top: calc(var(--top-nav-height) + var(--classification-banner-height));
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav {
+ width: var(--sidebar-width, 260px);
+ min-width: var(--sidebar-min-width, 220px);
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav #sidebar-user-account {
+ min-width: 0;
+ width: 100%;
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav.show {
+ transform: none !important;
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .offcanvas-header {
+ border-bottom: 1px solid var(--bs-border-color, #dee2e6);
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .chat-sidebar-mobile-header {
+ display: none;
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .chat-sidebar-user-account {
+ background-color: inherit;
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .chat-sidebar-mobile-sections {
+ background-color: inherit;
+}
+
+body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .chat-sidebar-mobile-section + .chat-sidebar-mobile-section {
+ margin-top: 1rem;
+}
+
+@media (min-width: 992px) {
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav {
+ height: calc(100vh - var(--chat-sidebar-top));
+ left: 0;
+ max-width: none;
+ position: fixed;
+ top: var(--chat-sidebar-top);
+ z-index: 1040;
+ }
+
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .offcanvas-header {
+ display: none;
+ }
+}
+
+@media (max-width: 991.98px) {
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav {
+ height: calc(100vh - var(--chat-sidebar-top));
+ max-width: min(88vw, 22rem);
+ top: var(--chat-sidebar-top);
+ z-index: 1045;
+ }
+
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .chat-sidebar-mobile-header {
+ display: flex;
+ }
+
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav .sidebar-short-header,
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav #sidebar-resize-handle {
+ display: none;
+ }
+
+ body.chat-top-nav-shell #sidebar-nav.chat-sidebar-nav #sidebar-user-account {
+ -webkit-backdrop-filter: none;
+ backdrop-filter: none;
+ position: static;
+ }
}
/* Floating expand button positioning when classification banner is present */
@@ -68,6 +158,49 @@ body.has-classification-banner #floating-expand-btn {
top: calc(0.5rem + 40px) !important; /* Start below the classification banner */
}
+#floating-expand-btn {
+ position: fixed;
+ left: 0.5rem;
+ top: 0.5rem;
+ z-index: 1050;
+ min-height: 36px;
+ padding: 0.35rem 0.55rem;
+ border-radius: 999px;
+ align-items: center;
+ justify-content: center;
+ gap: 0.375rem;
+ background-color: var(--bs-body-bg, #fff) !important;
+ border: 1px solid var(--bs-border-color, #dee2e6) !important;
+ box-shadow: 0 2px 8px rgba(0, 0, 0, 0.15) !important;
+ color: var(--bs-body-color, #212529) !important;
+ transition: transform 0.2s ease;
+}
+
+#floating-expand-btn.sidebar-floating-expand-visible {
+ display: inline-flex !important;
+}
+
+[data-bs-theme="dark"] #floating-expand-btn {
+ background-color: var(--bs-dark, #212529) !important;
+ border: 1px solid var(--bs-border-color-translucent, rgba(255, 255, 255, 0.15)) !important;
+ box-shadow: 0 2px 8px rgba(0, 0, 0, 0.5) !important;
+ color: var(--bs-body-color, #fff) !important;
+}
+
+#floating-expand-btn:hover {
+ background-color: var(--bs-gray-100, #f8f9fa) !important;
+ transform: scale(1.05);
+}
+
+[data-bs-theme="dark"] #floating-expand-btn:hover {
+ background-color: var(--bs-gray-800, #343a40) !important;
+}
+
+.floating-expand-label {
+ display: none;
+ font-weight: 600;
+}
+
/* When the banner is present, add padding to the top of the main content */
body.sidebar-nav-enabled.has-classification-banner .main-content,
body.sidebar-nav-enabled.has-classification-banner .container,
@@ -75,6 +208,13 @@ body.sidebar-nav-enabled.has-classification-banner .container-fluid {
padding-top: 2rem !important; /* Increased spacing when banner is present */
}
+.main-content,
+.container,
+.container-fluid,
+#main-content {
+ transition: margin-left 0.3s ease-in-out, max-width 0.3s ease-in-out !important;
+}
+
/* Sidebar Conversations List Styles */
#conversations-section {
display: flex;
@@ -173,6 +313,56 @@ body.sidebar-nav-enabled.has-classification-banner .container-fluid {
font-size: 0.9rem;
}
+.sidebar-header,
+.sidebar-short-header {
+ display: flex;
+ flex-direction: column;
+ gap: 0.75rem;
+}
+
+.sidebar-brand-link {
+ display: flex !important;
+ align-items: center;
+ gap: 0.75rem;
+ min-width: 0;
+ margin-bottom: 0 !important;
+}
+
+.sidebar-brand-link img {
+ flex-shrink: 0;
+}
+
+.sidebar-brand-text {
+ flex: 1 1 auto;
+ min-width: 0;
+}
+
+.sidebar-title-truncate {
+ display: block;
+ min-width: 0;
+ overflow: hidden;
+ white-space: nowrap;
+ text-overflow: ellipsis;
+ max-width: 100%;
+}
+
+.sidebar-toggle-row {
+ display: flex;
+}
+
+.sidebar-toggle-control {
+ display: inline-flex;
+ align-items: center;
+ justify-content: center;
+ gap: 0.5rem;
+ min-height: 38px;
+ border-radius: 0.5rem;
+}
+
+.sidebar-toggle-control i {
+ font-size: 0.95rem;
+}
+
.sidebar-conversation-item {
cursor: pointer;
padding: 8px 12px;
@@ -513,9 +703,48 @@ body.sidebar-nav-enabled.has-classification-banner .container-fluid {
/* Ensure navbar-brand font size matches top nav */
#sidebar-nav .navbar-brand {
font-size: 20px !important; /* Match top navbar font size */
+ min-width: 0;
}
#sidebar-nav .navbar-brand .fw-bold,
#sidebar-nav .navbar-brand span {
font-size: 20px !important; /* Ensure child elements also use correct size */
+}
+
+#sidebar-user-account {
+ width: var(--sidebar-width, 260px);
+ min-width: var(--sidebar-min-width, 220px);
+ transition: inherit;
+ -webkit-backdrop-filter: blur(10px);
+ backdrop-filter: blur(10px);
+ background-color: var(--bs-light, #f8f9fa) !important;
+}
+
+[data-bs-theme="dark"] #sidebar-user-account {
+ background-color: var(--bs-dark, #212529) !important;
+ border-color: var(--bs-border-color-translucent, rgba(255, 255, 255, 0.15)) !important;
+}
+
+@media (max-width: 575.98px) {
+ #sidebar-nav {
+ max-width: calc(100vw - 0.75rem);
+ }
+
+ #sidebar-user-account {
+ max-width: calc(100vw - 0.75rem);
+ }
+
+ .sidebar-toggle-control {
+ min-height: 42px;
+ }
+
+ #floating-expand-btn {
+ min-height: 42px;
+ padding: 0.5rem 0.75rem;
+ }
+
+ .floating-expand-label {
+ display: inline;
+ font-size: 0.875rem;
+ }
}
\ No newline at end of file
diff --git a/application/single_app/static/css/workspace-responsive.css b/application/single_app/static/css/workspace-responsive.css
new file mode 100644
index 000000000..cf624e7fd
--- /dev/null
+++ b/application/single_app/static/css/workspace-responsive.css
@@ -0,0 +1,437 @@
+/* Shared responsive workspace styles */
+
+.workspace-page {
+ padding-bottom: 2rem;
+}
+
+.workspace-page-header {
+ align-items: flex-start;
+ display: flex;
+ gap: 1rem;
+ justify-content: space-between;
+ margin-bottom: 1rem;
+}
+
+.workspace-page-title {
+ margin-bottom: 0;
+}
+
+.workspace-page-subtitle {
+ color: var(--bs-secondary-color, #6c757d);
+ margin-bottom: 0;
+}
+
+.workspace-section-switcher {
+ display: none;
+ flex: 0 0 clamp(13rem, 30vw, 18rem);
+ min-width: 13rem;
+}
+
+.workspace-section-switcher--persistent {
+ display: flex;
+}
+
+.workspace-section-switcher .form-label {
+ color: var(--bs-secondary-color, #6c757d);
+ font-size: 0.82rem;
+ font-weight: 600;
+ margin-bottom: 0.35rem;
+}
+
+.workspace-page .nav-tabs {
+ flex-wrap: nowrap;
+ gap: 0.35rem;
+ margin-bottom: 0;
+ overflow-x: auto;
+ overflow-y: hidden;
+ scrollbar-width: thin;
+}
+
+.workspace-page .nav-tabs .nav-link {
+ white-space: nowrap;
+}
+
+.document-item-card {
+ border-color: rgba(15, 23, 42, 0.08);
+}
+
+.document-item-card.is-selected {
+ border-color: rgba(13, 110, 253, 0.45);
+ box-shadow: 0 0 0 0.2rem rgba(13, 110, 253, 0.12);
+}
+
+.document-item-card__header {
+ align-items: flex-start;
+ display: flex;
+ gap: 0.75rem;
+ margin-bottom: 0.85rem;
+}
+
+.document-item-card__check {
+ align-items: center;
+ display: flex;
+ justify-content: center;
+ padding-top: 0.15rem;
+}
+
+.document-item-card__check .document-checkbox {
+ margin: 0;
+}
+
+.document-item-card__title-wrap {
+ flex: 1 1 auto;
+ min-width: 0;
+}
+
+.document-item-card__eyebrow {
+ color: var(--bs-secondary-color, #6c757d);
+ font-size: 0.72rem;
+ font-weight: 700;
+ letter-spacing: 0.04em;
+ margin-bottom: 0.2rem;
+ text-transform: uppercase;
+}
+
+.document-item-card__subtitle {
+ color: var(--bs-secondary-color, #6c757d);
+ font-size: 0.8rem;
+ overflow: hidden;
+ text-overflow: ellipsis;
+ white-space: nowrap;
+}
+
+.document-item-card__status {
+ flex: 0 0 auto;
+}
+
+.document-item-card__summary {
+ color: var(--bs-secondary-color, #6c757d);
+ font-size: 0.8rem;
+ line-height: 1.45;
+ margin-bottom: 0.85rem;
+}
+
+.document-item-card__meta {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.4rem;
+ margin-bottom: 0.75rem;
+}
+
+.document-meta-pill {
+ align-items: center;
+ background: rgba(13, 110, 253, 0.08);
+ border-radius: 999px;
+ color: var(--bs-body-color, #212529);
+ display: inline-flex;
+ font-size: 0.76rem;
+ font-weight: 500;
+ gap: 0.3rem;
+ padding: 0.28rem 0.6rem;
+}
+
+.document-item-card__badges {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.35rem;
+ margin-bottom: 0.75rem;
+}
+
+.document-item-card__tags {
+ margin-bottom: 0.85rem;
+}
+
+.document-item-card__progress {
+ margin-bottom: 0.85rem;
+}
+
+.document-item-card__progress-label {
+ color: var(--bs-secondary-color, #6c757d);
+ display: block;
+ font-size: 0.76rem;
+ margin-top: 0.35rem;
+ text-align: right;
+}
+
+.document-item-card .item-card-buttons {
+ align-items: center;
+}
+
+.document-item-card .item-card-buttons .btn {
+ align-items: center;
+ display: inline-flex;
+ justify-content: center;
+}
+
+[data-bs-theme="dark"] .document-meta-pill {
+ background: rgba(110, 168, 254, 0.16);
+ color: #e9ecef;
+}
+
+.workspace-toolbar-actions {
+ align-items: center;
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+}
+
+.workspace-page .filter-buttons-col {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+ justify-content: flex-end;
+}
+
+.workspace-page .action-dropdown .dropdown-toggle::after {
+ display: none;
+}
+
+.workspace-page .table-loading-row td {
+ color: #6c757d;
+ padding: 1.5rem;
+ text-align: center;
+}
+
+@media (max-width: 991.98px) {
+ .workspace-page-header {
+ flex-direction: column;
+ }
+
+ .workspace-section-switcher {
+ display: flex;
+ min-width: 0;
+ width: 100%;
+ }
+
+ .workspace-page .nav-tabs {
+ display: none !important;
+ }
+
+ .workspace-toolbar-actions {
+ display: grid;
+ width: 100%;
+ }
+
+ .workspace-toolbar-actions .btn {
+ width: 100%;
+ }
+
+ .workspace-page .filter-buttons-col {
+ justify-content: flex-start;
+ width: 100%;
+ }
+
+ .workspace-page .filter-buttons-col .btn {
+ flex: 1 1 100%;
+ }
+
+ #documents-table,
+ #group-documents-table,
+ #prompts-table,
+ #group-prompts-table,
+ #agents-table,
+ #group-agents-table,
+ #plugins-table,
+ #group-plugins-table {
+ table-layout: auto !important;
+ }
+
+ #documents-table thead,
+ #group-documents-table thead,
+ #prompts-table thead,
+ #group-prompts-table thead,
+ #agents-table thead,
+ #group-agents-table thead,
+ #plugins-table thead,
+ #group-plugins-table thead {
+ display: none;
+ }
+
+ #documents-table,
+ #group-documents-table,
+ #prompts-table,
+ #group-prompts-table,
+ #agents-table,
+ #group-agents-table,
+ #plugins-table,
+ #group-plugins-table,
+ #documents-table tbody,
+ #group-documents-table tbody,
+ #prompts-table tbody,
+ #group-prompts-table tbody,
+ #agents-table tbody,
+ #group-agents-table tbody,
+ #plugins-table tbody,
+ #group-plugins-table tbody {
+ display: block;
+ width: 100%;
+ }
+
+ #documents-table tr.document-row,
+ #group-documents-table tr.document-row,
+ #prompts-table tbody tr,
+ #group-prompts-table tbody tr,
+ #agents-table tbody tr,
+ #group-agents-table tbody tr,
+ #plugins-table tbody tr,
+ #group-plugins-table tbody tr {
+ background: var(--bs-body-bg, #fff);
+ border: 1px solid rgba(15, 23, 42, 0.08);
+ border-radius: 0.9rem;
+ box-shadow: 0 0.35rem 1rem rgba(15, 23, 42, 0.06);
+ display: block;
+ margin-bottom: 0.85rem;
+ overflow: hidden;
+ padding: 0.85rem 0.95rem;
+ }
+
+ #documents-table tr.document-row td,
+ #group-documents-table tr.document-row td,
+ #prompts-table tbody tr td,
+ #group-prompts-table tbody tr td,
+ #agents-table tbody tr td,
+ #group-agents-table tbody tr td,
+ #plugins-table tbody tr td,
+ #group-plugins-table tbody tr td {
+ border: 0;
+ display: block;
+ max-width: none !important;
+ overflow: visible !important;
+ padding: 0.2rem 0;
+ text-align: left !important;
+ white-space: normal !important;
+ width: 100% !important;
+ }
+
+ #documents-table tr.document-row td::before,
+ #group-documents-table tr.document-row td::before,
+ #prompts-table tbody tr td::before,
+ #group-prompts-table tbody tr td::before,
+ #agents-table tbody tr td::before,
+ #group-agents-table tbody tr td::before,
+ #plugins-table tbody tr td::before,
+ #group-plugins-table tbody tr td::before {
+ color: var(--bs-secondary-color, #6c757d);
+ display: block;
+ font-size: 0.72rem;
+ font-weight: 700;
+ letter-spacing: 0.04em;
+ margin-bottom: 0.15rem;
+ text-transform: uppercase;
+ }
+
+ #documents-table tr.document-row td:nth-child(1)::before,
+ #group-documents-table tr.document-row td:nth-child(1)::before {
+ content: "Status";
+ }
+
+ #documents-table tr.document-row td:nth-child(2)::before,
+ #group-documents-table tr.document-row td:nth-child(2)::before {
+ content: "File";
+ }
+
+ #documents-table tr.document-row td:nth-child(3)::before,
+ #group-documents-table tr.document-row td:nth-child(3)::before {
+ content: "Title";
+ }
+
+ #documents-table tr.document-row td:nth-child(4)::before,
+ #group-documents-table tr.document-row td:nth-child(4)::before {
+ content: "Actions";
+ }
+
+ #documents-table tr.document-row td:nth-child(4),
+ #group-documents-table tr.document-row td:nth-child(4) {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+ margin-top: 0.35rem;
+ padding-top: 0.55rem;
+ }
+
+ #documents-table tr.document-row td:nth-child(4) .btn,
+ #group-documents-table tr.document-row td:nth-child(4) .btn {
+ flex: 1 1 8rem;
+ justify-content: center;
+ }
+
+ #documents-table tr.document-row td:nth-child(4) .action-dropdown,
+ #group-documents-table tr.document-row td:nth-child(4) .action-dropdown {
+ margin-left: auto;
+ }
+
+ #documents-table tr.document-details-row,
+ #documents-table tr.document-status-row,
+ #group-documents-table tr.document-details-row,
+ #group-documents-table tr.document-status-row {
+ display: block;
+ margin-bottom: 0.85rem;
+ margin-top: -0.55rem;
+ }
+
+ #documents-table tr.document-details-row td,
+ #documents-table tr.document-status-row td,
+ #group-documents-table tr.document-details-row td,
+ #group-documents-table tr.document-status-row td {
+ background: rgba(13, 110, 253, 0.04);
+ border: 1px solid rgba(15, 23, 42, 0.08);
+ border-radius: 0 0 0.9rem 0.9rem;
+ display: block;
+ padding: 0.85rem 0.95rem;
+ width: 100%;
+ }
+
+ #prompts-table tbody tr td:nth-child(1)::before,
+ #group-prompts-table tbody tr td:nth-child(1)::before {
+ content: "Prompt";
+ }
+
+ #agents-table tbody tr td:nth-child(1)::before,
+ #group-agents-table tbody tr td:nth-child(1)::before {
+ content: "Name";
+ }
+
+ #plugins-table tbody tr td:nth-child(1)::before,
+ #group-plugins-table tbody tr td:nth-child(1)::before {
+ content: "Name";
+ }
+
+ #agents-table tbody tr td:nth-child(2)::before,
+ #group-agents-table tbody tr td:nth-child(2)::before,
+ #plugins-table tbody tr td:nth-child(2)::before,
+ #group-plugins-table tbody tr td:nth-child(2)::before {
+ content: "Description";
+ }
+
+ #prompts-table tbody tr td:nth-child(2)::before,
+ #group-prompts-table tbody tr td:nth-child(2)::before,
+ #agents-table tbody tr td:nth-child(3)::before,
+ #group-agents-table tbody tr td:nth-child(3)::before,
+ #plugins-table tbody tr td:nth-child(3)::before,
+ #group-plugins-table tbody tr td:nth-child(3)::before {
+ content: "Actions";
+ }
+
+ #prompts-table tbody tr td:nth-child(2),
+ #group-prompts-table tbody tr td:nth-child(2),
+ #agents-table tbody tr td:nth-child(3),
+ #group-agents-table tbody tr td:nth-child(3),
+ #plugins-table tbody tr td:nth-child(3),
+ #group-plugins-table tbody tr td:nth-child(3) {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+ margin-top: 0.35rem;
+ padding-top: 0.45rem;
+ }
+
+ #agents-table tbody tr td:nth-child(3) .btn,
+ #group-agents-table tbody tr td:nth-child(3) .btn,
+ #plugins-table tbody tr td:nth-child(3) .btn,
+ #group-plugins-table tbody tr td:nth-child(3) .btn,
+ #prompts-table tbody tr td:nth-child(2) .btn,
+ #group-prompts-table tbody tr td:nth-child(2) .btn {
+ flex: 1 1 7rem;
+ justify-content: center;
+ }
+}
\ No newline at end of file
diff --git a/application/single_app/static/images/custom_logo.png b/application/single_app/static/images/custom_logo.png
index ecf6e6521..a5b440e93 100644
Binary files a/application/single_app/static/images/custom_logo.png and b/application/single_app/static/images/custom_logo.png differ
diff --git a/application/single_app/static/images/custom_logo_dark.png b/application/single_app/static/images/custom_logo_dark.png
index 4f2819457..df9485a93 100644
Binary files a/application/single_app/static/images/custom_logo_dark.png and b/application/single_app/static/images/custom_logo_dark.png differ
diff --git a/application/single_app/static/js/admin/admin_agents.js b/application/single_app/static/js/admin/admin_agents.js
index 5c1daed86..6f9c41458 100644
--- a/application/single_app/static/js/admin/admin_agents.js
+++ b/application/single_app/static/js/admin/admin_agents.js
@@ -20,6 +20,12 @@ let orchestrationSettings = {};
let agents = [];
let selectedAgent = null;
+function escapeHtml(text) {
+ const div = document.createElement('div');
+ div.textContent = text ?? '';
+ return div.innerHTML;
+}
+
// --- Function Definitions ---
async function loadAllAdminAgentData() {
@@ -274,10 +280,13 @@ function renderAgentsTable() {
const isSelected = selectedAgent && agent.name === selectedAgent;
const tr = document.createElement('tr');
let selectedBadge = isSelected ? 'Selected' : '';
+ const safeName = escapeHtml(agent.name || '');
+ const safeDisplayName = escapeHtml(agent.display_name || '');
+ const safeDescription = escapeHtml(agent.description || '');
tr.innerHTML = `
-